Episode 8.92: Could embeddings ever replace neural nets and training?
Unmaking Sense

2023-05-03
We consider the possibility that, as embedding vectors get longer and longer - we have already reached 1536 floating-point numbers, and there are reports that parts of GPT-4 use over 12,000 - embeddings could come to encapsulate so much of the semantics of what they embed that each one becomes effectively unique, and as such capable of replacing the weights and even the layers of the neural nets that produce them. It's an interesting idea.
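To make the idea concrete, here is a minimal sketch of what working with such vectors looks like: an embedding is just a long list of floating-point numbers, and semantic closeness between two embedded texts is usually measured by cosine similarity. The vectors below are random placeholders standing in for real model output, and the dimensionality of 1536 is taken from the episode description; nothing here is a claim about how any particular model is implemented.

```python
# Minimal sketch: comparing two hypothetical 1536-dimensional embedding
# vectors by cosine similarity, the standard way such vectors are compared.
# The vectors are random placeholders, not output from a real embedding model.
import numpy as np

DIM = 1536  # dimensionality mentioned in the episode

rng = np.random.default_rng(0)
embedding_a = rng.standard_normal(DIM)  # stand-in for the embedding of one text
embedding_b = rng.standard_normal(DIM)  # stand-in for the embedding of another text

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between u and v: close to 1.0 means very similar
    direction (similar meaning), near 0.0 means unrelated."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embedding_a, embedding_b))
```

The speculation in the episode is whether, at sufficiently high dimensionality, such vectors alone could carry enough semantic structure to stand in for the trained network itself, rather than merely being one of its outputs.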
