ICLR 2023 - Towards Understanding Ensemble, Knowledge Distillation and Self-Distillation in Deep Learning
AI Breakdown

2023-08-08
In this episode we discuss "Towards Understanding Ensemble, Knowledge Distillation and Self-Distillation in Deep Learning" by Zeyuan Allen-Zhu and Yuanzhi Li. The paper explores how ensembles of deep learning models improve test accuracy and how that improvement can be distilled into a single model via knowledge distillation. It presents a theoretical framework showing that ensembles enhance test accuracy when the data has a multi-view structure. The paper also highlights the presence of "dark knowledge"...
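
Since the episode centers on knowledge distillation, here is a minimal sketch of the standard distillation objective in PyTorch. It is illustrative only: the function name distillation_loss and the hyperparameters T (temperature) and alpha (mixing weight) are our own assumed choices, not values from the paper; the paper's contribution is a theoretical analysis of why this kind of procedure transfers the ensemble's "dark knowledge" to a single model.

# A minimal sketch of a knowledge-distillation objective (assumed setup,
# not the paper's construction): cross-entropy on hard labels plus a KL
# term matching the student's softened outputs to the teacher's.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soft targets: teacher and student distributions at temperature T.
    # The non-argmax probabilities carry the "dark knowledge".
    soft_teacher = F.softmax(teacher_logits / T, dim=-1)
    log_soft_student = F.log_softmax(student_logits / T, dim=-1)
    # KL divergence, scaled by T^2 to keep gradient magnitudes comparable
    # to the unsoftened cross-entropy term.
    kd = F.kl_div(log_soft_student, soft_teacher, reduction="batchmean") * (T * T)
    # Standard supervised loss on the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

# Example: distill an averaged-logit ensemble of three "teachers"
# (random tensors here, standing in for trained models) into a student.
if __name__ == "__main__":
    batch, classes = 8, 10
    labels = torch.randint(0, classes, (batch,))
    teacher_logits = torch.stack(
        [torch.randn(batch, classes) for _ in range(3)]
    ).mean(dim=0)
    student_logits = torch.randn(batch, classes, requires_grad=True)
    loss = distillation_loss(student_logits, teacher_logits, labels)
    loss.backward()
    print(f"distillation loss: {loss.item():.4f}")

In self-distillation, the same recipe is applied with teacher and student sharing one architecture, which the paper also analyzes.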