ML Developer, Software Architect, JS Engineer, Ultra-distance cyclist
How was Attention created? Why does it work, and why is it one of the most important ideas in ML right now?
Visualization of the positional encoding method from Transformer models.
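For reference, the visualization above is of the sinusoidal positional encoding introduced in "Attention Is All You Need" (Vaswani et al., 2017). A minimal sketch of how such a heatmap can be generated with NumPy and Matplotlib follows; the specific plot settings (dimensions, colormap) are assumptions, not necessarily those used for the figure.

```python
import numpy as np
import matplotlib.pyplot as plt

def positional_encoding(max_len: int, d_model: int) -> np.ndarray:
    """Sinusoidal positional encoding from "Attention Is All You Need":
    PE(pos, 2i)   = sin(pos / 10000^(2i / d_model))
    PE(pos, 2i+1) = cos(pos / 10000^(2i / d_model))
    """
    positions = np.arange(max_len)[:, None]  # shape (max_len, 1)
    div_terms = np.exp(np.arange(0, d_model, 2) * (-np.log(10000.0) / d_model))
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(positions * div_terms)  # even dimensions
    pe[:, 1::2] = np.cos(positions * div_terms)  # odd dimensions
    return pe

# Plot as a heatmap: one row per token position, one column per embedding dimension.
pe = positional_encoding(max_len=100, d_model=128)
plt.pcolormesh(pe, cmap="RdBu")
plt.xlabel("Embedding dimension")
plt.ylabel("Token position")
plt.colorbar(label="PE value")
plt.show()
```

Each row of the resulting matrix is the unique "fingerprint" added to the token embedding at that position, which is what the banded sine/cosine pattern in the figure shows.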