Episode 8.90: Teaching GPT new stuff using embedding rather than fine-tuning.
Unmaking Sense

2023-05-02
Almost all the GPT models use training data that only goes up to September 2021. As you can scarcely fail to have noticed, rather a lot has happened since then. How do you teach GPT new stuff and bring it up to date if you want to use it for something that happened after September 2021? The answer is a technique called "search and ask", which uses embeddings to compare the knowledge that you feed it with the questions that you ask it. Details are in the episode.
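
For readers who want to see the idea in code, here is a minimal sketch of the "search and ask" pattern the episode describes: embed some reference text, rank it against the question by similarity, and hand the best matches to the model as context. It assumes the OpenAI Python client (v1+) with an API key in the environment; the model names and the example snippets are illustrative assumptions, not material from the episode.

```python
# Minimal "search and ask" sketch: embed reference snippets, retrieve the
# ones closest to the question, and include them in the prompt.
from math import sqrt
from openai import OpenAI

client = OpenAI()

# Hypothetical post-2021 notes that the model never saw during training.
snippets = [
    "In 2023 the organisation adopted a four-day working week.",
    "The 2022 annual report introduced a new climate-risk framework.",
    "Membership fees were frozen for 2023 after the January vote.",
]

def embed(texts):
    """Return one embedding vector per input text."""
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return [d.embedding for d in resp.data]

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

question = "What happened to membership fees in 2023?"
doc_vectors = embed(snippets)
q_vector = embed([question])[0]

# Rank snippets by similarity to the question and keep the best matches.
ranked = sorted(zip(snippets, doc_vectors),
                key=lambda pair: cosine(q_vector, pair[1]), reverse=True)
context = "\n".join(text for text, _ in ranked[:2])

# Ask the model, supplying the retrieved snippets as context in the prompt.
answer = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": "Answer using only the provided context."},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
    ],
)
print(answer.choices[0].message.content)
```

The point of the pattern is that nothing about the model changes: the new knowledge lives outside it, and embeddings are only used to decide which pieces of that knowledge are worth pasting into the prompt for a given question.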