In this episode I talk about the importance of reproducible machine learning pipelines.
When you collaborate with diverse teams, tasks get distributed among different individuals. Everyone has good reasons to change parts of your pipeline, which leads to confusion and a combinatorial explosion of pipeline variants.
In all those cases, tracking data and code is extremely helpful to build models that are reproducible anytime, anywhere.
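As a minimal sketch of the idea (not the specific tooling discussed in the episode), tracking a run can be as simple as recording a hash of the input data, the code version, and the random seed. The function name `fingerprint_run` and the example values below are illustrative assumptions:

```python
import hashlib
import json
import random

def fingerprint_run(data_bytes: bytes, code_version: str, seed: int) -> dict:
    """Record what is needed to reproduce a training run:
    a hash of the input data, the code version, and the RNG seed."""
    random.seed(seed)  # fix randomness so results are repeatable
    return {
        "data_sha256": hashlib.sha256(data_bytes).hexdigest(),
        "code_version": code_version,  # e.g. a git commit hash
        "seed": seed,
    }

# Hypothetical run: tiny inline dataset and a made-up commit id
manifest = fingerprint_run(b"example,data\n1,2\n", "abc1234", seed=42)
print(json.dumps(manifest, indent=2))
```

Storing such a manifest next to every model artifact makes it possible to detect, anytime and anywhere, whether the data or code behind a model has changed.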
Listen to the podcast and learn how.
Episode 64: Get the best shot at NLP sentiment analysis
Episode 63: Financial time series and machine learning
Episode 62: AI and the future of banking with Chris Skinner
Episode 61: The 4 best use cases of entropy in machine learning
Episode 60: Predicting your mouse click (and a crash course in deep learning)
Episode 59: How to fool a smart camera with deep learning
Episode 58: There is physics in deep learning!
Episode 57: Neural networks with infinite layers
Episode 56: The graph network
Episode 55: Beyond deep learning
Episode 53: Estimating uncertainty with neural networks
Episode 52: Why do machine learning models fail? [RB]
Episode 51: Decentralized machine learning in the data marketplace (part 2)
Episode 50: Decentralized machine learning in the data marketplace
Episode 49: The promises of Artificial Intelligence
Episode 48: Coffee, Machine Learning and Blockchain
Episode 47: Are you ready for AI winter? [Rebroadcast]
Episode 46: Why do machine learning models fail? (Part 2)
Episode 45: Why do machine learning models fail?