Activate deep learning neurons faster with Dynamic ReLU (ep. 101)
Data Science at Home

2020-04-01
In this episode I briefly explain the concept behind activation functions in deep learning. One of the most widely used activation functions is the rectified linear unit (ReLU). While there are several flavors of ReLU in the literature, in this episode I speak about a very interesting approach that keeps computational complexity low while improving performance quite consistently. This episode is supported by pryml.io. At pryml we let companies share confidential data. Visit our website. Don't forget to join us on Discord ...
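To make the idea concrete: a standard ReLU applies a single fixed piecewise-linear function, max(x, 0), to every input, while a dynamic ReLU predicts the slopes and intercepts of a few linear pieces from the input itself and takes their element-wise maximum. The following is a minimal PyTorch-style sketch of that idea; the `DynamicReLU` module name, the hyper-function layout, and the coefficient bounds are illustrative assumptions, not the exact formulation covered in the episode.

```python
import torch
import torch.nn as nn

class DynamicReLU(nn.Module):
    """Sketch of a dynamic ReLU: max over K input-dependent linear pieces.

    Slopes and intercepts are produced by a small hyper-function that looks
    at a globally pooled summary of the input (channel-shared variant assumed).
    """
    def __init__(self, channels, k=2, reduction=4):
        super().__init__()
        self.k = k
        # Hyper-function: global context -> 2*K coefficients (K slopes, K intercepts)
        self.hyper = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, 2 * k),
        )
        # Base coefficients so the module starts out close to a plain ReLU
        # (first slope ~1, all other slopes and intercepts ~0).
        base = torch.zeros(2 * k)
        base[0] = 1.0
        self.register_buffer("base", base)

    def forward(self, x):                # x: (N, C, H, W)
        context = x.mean(dim=(2, 3))     # global average pooling -> (N, C)
        # Bounded residual around the base coefficients (bound of 2 is an assumption).
        coeffs = self.base + 2.0 * torch.tanh(self.hyper(context))
        a = coeffs[:, : self.k].view(-1, 1, self.k, 1, 1)   # slopes
        b = coeffs[:, self.k :].view(-1, 1, self.k, 1, 1)   # intercepts
        # Evaluate all K linear pieces and take the element-wise maximum.
        y = a * x.unsqueeze(2) + b       # (N, C, K, H, W)
        return y.max(dim=2).values       # (N, C, H, W)

# Usage: drop-in replacement for nn.ReLU after a conv layer.
act = DynamicReLU(channels=64)
out = act(torch.randn(8, 64, 32, 32))    # -> torch.Size([8, 64, 32, 32])
```

The extra cost is only the tiny hyper-function on a pooled vector plus K element-wise multiply-adds per activation, which is why this kind of approach can improve accuracy while keeping computational complexity low.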