Episode 11.25: MoE (Mixture of Experts) approaches to AI are already appearing. What they involve.
Unmaking Sense

2023-12-10
The AI wars are hotting up with much current disparagement of OpenAI, competition from Google Deepmind’s Gemini, and now a new version of the Mistral model based on MoE technology called “Mixtral” because it mixes experts. The pace of development is extraordinary and that before the AIs themselves get in on the act.