Today we’re joined by Markus Nagel, research scientist at Qualcomm AI Research, who helps us kick off our coverage of NeurIPS 2023. In our conversation, Markus covers his accepted papers at the conference, along with other work presented by Qualcomm AI Research scientists. His first paper, Quantizable Transformers: Removing Outliers by Helping Attention Heads Do Nothing, tackles the activation quantization issues introduced by the attention mechanism and how to solve them. We also discuss Pruning vs Quantization: Which is Better?, which compares the effectiveness of these two methods for compressing model weights. Additional papers cover topics such as using scalarization in multitask and multidomain learning to improve training and inference, using diffusion models to model sequences of states and actions, applying geometric algebra with equivariance to transformers, and deductive verification of the chain-of-thought reasoning performed by LLMs.
The complete show notes for this episode can be found at twimlai.com/go/663.
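The outlier problem Markus’s first paper addresses can be sketched in a few lines. This is a minimal illustration (not the paper’s method, and the function names are hypothetical): with uniform symmetric quantization, a single large outlier value inflates the quantization scale, so the remaining small values lose nearly all their precision.

```python
import numpy as np

def quantize(w: np.ndarray, num_bits: int = 8):
    """Uniform symmetric quantization of a tensor (illustrative sketch only)."""
    qmax = 2 ** (num_bits - 1) - 1                 # e.g. 127 for int8
    scale = np.abs(w).max() / qmax                 # per-tensor scale, set by the largest value
    q = np.clip(np.round(w / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

# A single outlier (5.0) dominates the scale, so the small weights
# round to zero and all their information is lost:
w = np.array([0.01, -0.02, 0.015, 5.0], dtype=np.float32)
q, s = quantize(w)
recovered = dequantize(q, s)                       # small entries come back as ~0.0
```

Running this, the three small weights map to integer levels 0 or -1 while the outlier is represented exactly, which is why removing outliers from attention activations makes transformers far easier to quantize.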