❤️ Become a patron of The AI Epiphany ❤️ ► [ Link ]
In this video I cover a new paper from Microsoft, "Focal Self-attention for Local-Global Interactions in Vision Transformers", which introduces a new transformer layer called focal self-attention.
The main idea is to reduce the computational complexity of self-attention while preserving long-range dependencies. Each token attends to its nearby tokens at a fine granularity, while tokens farther away are attended to only through progressively coarsened (pooled) representations.
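To make the fine-plus-coarse pattern concrete, here is a minimal single-head sketch in PyTorch. It is an illustration under simplifying assumptions, not the paper's implementation: the names `focal_attention_sketch`, `window`, and `focal_levels` are hypothetical, queries are restricted to one window, pooling is plain averaging rather than the learned sub-window pooling from the paper, and multi-head projections and relative position biases are omitted.

```python
import torch
import torch.nn.functional as F

def focal_attention_sketch(x, window=4, focal_levels=(1, 2)):
    # Hypothetical simplified sketch, NOT the paper's implementation.
    # x: (H, W, C) feature map; attention is computed for the top-left
    # query window only, to keep the example short.
    H, W, C = x.shape
    # Level 0: fine-grained tokens = every token in the local window.
    fine = x[:window, :window].reshape(-1, C)                # (window^2, C)
    # Higher levels: the feature map is summarized by average-pooling
    # over increasingly large sub-windows (coarser the farther we look).
    coarse = []
    for lvl in focal_levels:
        pooled = F.avg_pool2d(x.permute(2, 0, 1).unsqueeze(0),
                              kernel_size=lvl * window)      # (1, C, h, w)
        coarse.append(pooled.squeeze(0).permute(1, 2, 0).reshape(-1, C))
    # Keys/values mix fine local tokens with coarse global summaries,
    # so the token count stays far below full H*W self-attention.
    kv = torch.cat([fine] + coarse, dim=0)
    attn = torch.softmax(fine @ kv.T / C ** 0.5, dim=-1)     # queries = window tokens
    return attn @ kv                                         # (window^2, C)

out = focal_attention_sketch(torch.randn(16, 16, 3))         # -> (16, 3)
```

Because the coarse summaries shrink quadratically with the pooling size, the key/value set stays far smaller than full-resolution attention while still covering the whole image.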
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
✅ Paper: [ Link ]
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
⌚️ Timetable:
00:00 Main idea of the paper: focal self-attention
04:55 Overview of Focal Transformer architecture
08:15 Focal Self-Attention layer
12:30 Computational complexity, overlapping regions
15:30 SOTA results but with a disclaimer
17:30 Ablations
19:50 Outro, Focal Transformer is slower than Swin
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
💰 BECOME A PATRON OF THE AI EPIPHANY ❤️
If these videos, GitHub projects, and blogs help you,
consider helping me out by supporting me on Patreon!
The AI Epiphany ► [ Link ]
One-time donation:
[ Link ]
Much love! ❤️
Huge thank you to these AI Epiphany patrons:
Eli Mahler
Petar Veličković
Zvonimir Sabljic
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
💡 The AI Epiphany is a channel dedicated to simplifying the field of AI through creative visualizations and, in general, a stronger focus on geometric and visual intuition rather than on algebraic and numerical "intuition".
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
👋 CONNECT WITH ME ON SOCIAL
LinkedIn ► [ Link ]
Twitter ► [ Link ]
Instagram ► [ Link ]
Facebook ► [ Link ]
👨‍👩‍👧‍👦 JOIN OUR DISCORD COMMUNITY:
Discord ► [ Link ]
📢 SUBSCRIBE TO MY MONTHLY AI NEWSLETTER:
Substack ► [ Link ]
💻 FOLLOW ME ON GITHUB FOR COOL PROJECTS:
GitHub ► [ Link ]
📚 FOLLOW ME ON MEDIUM:
Medium ► [ Link ]
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
#focaltransformer #microsoft #transformer