This episode covers the material in Chapter 3 of my new book “Statistical Machine Learning: A Unified Framework,” with an expected publication date of May 2020. Chapter 3 discusses how to formally define machine learning algorithms. Briefly, a learning machine is viewed as a dynamical system that minimizes an objective function, and the knowledge structure of the learning machine is interpreted as a preference relation graph implicitly specified by that objective function. In addition, this week’s book review section covers a new book titled “The Practitioner’s Guide to Graph Data” by Denise Gosnell and Matthias Broecheler. To find out more information, visit the website: www.learningmachines101.com .
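The two ideas above can be illustrated with a minimal sketch (not taken from the book): a gradient-descent update treated as a discrete-time dynamical system, and a preference relation induced by the objective, where state x is preferred to state y exactly when the objective assigns x a lower value. The objective `V` below is a hypothetical quadratic chosen only for illustration.

```python
import numpy as np

def V(w):
    """Hypothetical objective function: a simple quadratic bowl minimized at w = (3, 3)."""
    return float(np.sum((np.asarray(w, dtype=float) - 3.0) ** 2))

def grad_V(w):
    """Gradient of the quadratic objective."""
    return 2.0 * (np.asarray(w, dtype=float) - 3.0)

def learning_dynamics(w0, step=0.1, iters=100):
    """Discrete-time dynamical system: w(t+1) = w(t) - step * grad V(w(t))."""
    w = np.asarray(w0, dtype=float)
    for _ in range(iters):
        w = w - step * grad_V(w)
    return w

def prefers(x, y):
    """Preference relation implicitly specified by the objective:
    x is preferred to y if and only if V(x) < V(y)."""
    return V(x) < V(y)

# The dynamics drive the state toward the minimizer, so the final state
# is preferred to the initial state under the induced relation.
w0 = [0.0, 6.0]
w_final = learning_dynamics(w0)
assert prefers(w_final, w0)
```

The point of the sketch is that the preference relation is never stored explicitly: it is read off from the same objective function that drives the system's dynamics.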
LM101-086: Ch8: How to Learn the Probability of Infinitely Many Outcomes
LM101-085: Ch7: How to Guarantee your Batch Learning Algorithm Converges
LM101-084: Ch6: How to Analyze the Behavior of Smart Dynamical Systems
LM101-083: Ch5: How to Use Calculus to Design Learning Machines
LM101-082: Ch4: How to Analyze and Design Linear Machines
LM101-080: Ch2: How to Represent Knowledge using Set Theory
LM101-079: Ch1: How to View Learning as Risk Minimization
LM101-078: Ch0: How to Become a Machine Learning Expert
LM101-077: How to Choose the Best Model using BIC
LM101-076: How to Choose the Best Model using AIC and GAIC
LM101-075: Can computers think? A Mathematician's Response (remix)
LM101-074: How to Represent Knowledge using Logical Rules (remix)
LM101-073: How to Build a Machine that Learns to Play Checkers (remix)
LM101-072: Welcome to the Big Artificial Intelligence Magic Show! (Remix of LM101-001 and LM101-002)
LM101-071: How to Model Common Sense Knowledge using First-Order Logic and Markov Logic Nets
LM101-070: How to Identify Facial Emotion Expressions in Images Using Stochastic Neighborhood Embedding
LM101-069: What Happened at the 2017 Neural Information Processing Systems Conference?
LM101-068: How to Design Automatic Learning Rate Selection for Gradient Descent Type Machine Learning Algorithms
LM101-067: How to use Expectation Maximization to Learn Constraint Satisfaction Solutions (Rerun)