More ConvNets in the 2020s: Scaling up Kernels Beyond 51x51 using Sparsity
Papers Read on AI


2022-07-14
Transformers have quickly shone in the computer vision world since the emergence of Vision Transformers (ViTs). The dominant role of convolutional neural networks (CNNs) seems to be challenged by increasingly effective transformer-based models. Very recently, a couple of advanced convolutional models have struck back with large kernels, motivated by the local but large attention mechanism, showing appealing performance and efficiency. We propose Sparse Large Kernel Network (SLaK), a pure CNN a...
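As a rough illustration of why sparsity and kernel decomposition matter at this scale (the specific 51x5 / 5x51 stripe factorization below is an assumption for illustration, not a detail confirmed by the truncated abstract), compare the per-channel parameter count of a full square depthwise kernel against a pair of rectangular ones:

```python
def depthwise_params(kh: int, kw: int) -> int:
    """Per-channel parameter count of a depthwise conv kernel of size kh x kw."""
    return kh * kw

# Hypothetical comparison: one full 51x51 kernel vs. two rectangular stripes.
full = depthwise_params(51, 51)                              # 2601 params/channel
stripes = depthwise_params(51, 5) + depthwise_params(5, 51)  # 510 params/channel

print(full, stripes, round(full / stripes, 1))  # the stripes are ~5x cheaper
```

Applying sparsity on top of such a decomposition would shrink the active parameter count further, which is presumably what lets kernels grow beyond 51x51 at a tolerable cost.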
