In the last episode, "How to master optimisation in deep learning", I explained some of the most challenging tasks in deep learning, along with methodologies and algorithms that improve the speed of convergence of a minimisation method for deep learning.
I explored the family of gradient descent methods, though not exhaustively, giving a list of approaches that deep learning researchers consider for different scenarios. Every method has its own benefits and drawbacks, depending largely on the type of data and its sparsity. But there is one method that seems to be, at least empirically, the best approach so far.
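As a minimal sketch of what distinguishes two members of that family (this example is mine, not from the episode), here is plain gradient descent next to gradient descent with momentum on a one-dimensional quadratic loss f(w) = (w - 3)^2:

```python
# Toy loss f(w) = (w - 3)^2 with gradient 2*(w - 3); minimum at w = 3.
def grad(w):
    return 2.0 * (w - 3.0)

def gradient_descent(w, lr=0.1, steps=200):
    """Vanilla gradient descent: step against the gradient."""
    for _ in range(steps):
        w -= lr * grad(w)
    return w

def gradient_descent_momentum(w, lr=0.1, beta=0.9, steps=200):
    """Momentum variant: accumulate a velocity that damps oscillations."""
    v = 0.0
    for _ in range(steps):
        v = beta * v - lr * grad(w)
        w += v
    return w

# Both converge toward the minimum at w = 3 from w = 0.
print(gradient_descent(0.0))
print(gradient_descent_momentum(0.0))
```

On this convex toy problem both variants reach the minimum; the practical differences the episode discusses show up on noisy, high-dimensional, sparse losses, where adaptive methods tend to dominate.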
Feel free to listen to the previous episode, share it, re-broadcast it, or just download it for your commute.
In this episode I would like to continue that conversation with some additional strategies for optimising gradient descent in deep learning, and introduce a few tricks that can come in handy when your neural network stops learning from data, or when the learning process becomes so slow that it seems to have reached a plateau even when fed fresh data.
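One common trick of this kind, sketched below under my own hypothetical parameter names (not a transcript of the episode), is to watch the validation loss and cut the learning rate whenever it stops improving for a few consecutive checks:

```python
# Hypothetical reduce-on-plateau schedule: if the monitored loss has not
# improved by at least `min_delta` for `patience` checks in a row,
# multiply the learning rate by `factor`.
def reduce_on_plateau(losses, lr=0.1, factor=0.5, patience=3, min_delta=1e-4):
    best = float("inf")   # best loss seen so far
    wait = 0              # checks since the last improvement
    for loss in losses:
        if loss < best - min_delta:
            best, wait = loss, 0
        else:
            wait += 1
            if wait >= patience:
                lr *= factor  # the plateau trick: shrink the step size
                wait = 0
    return lr

# A loss curve that flattens at 0.7 triggers two reductions:
# 0.1 -> 0.05 -> 0.025
print(reduce_on_plateau([1.0, 0.8, 0.7, 0.7, 0.7, 0.7, 0.7, 0.7, 0.7]))
```

Most deep learning frameworks ship a built-in version of this idea (for example a "reduce LR on plateau" scheduler), so in practice you would use the library's scheduler rather than hand-rolling the loop.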