Episode 11.11: Low-cost ways to adapt and specialise trained models. LoRA - Low Rank Adaptation.
Unmaking Sense

2023-11-22
LoRA stands for "Low-Rank Adaptation" (of large language models); it is a technique for specialising a trained model cheaply by freezing the original weights and learning small low-rank additions to them, rather than retraining the full weight matrices.
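A minimal sketch of that idea, assuming PyTorch; the class name, rank, and scaling values below are illustrative, not code from the episode:

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Wraps a frozen linear layer and adds a trainable low-rank update B @ A."""
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False          # the pretrained weights stay fixed
        # Low-rank factors: A projects down to `rank`, B projects back up.
        self.A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, rank))
        self.scale = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Effective weight is W + scale * (B @ A); only A and B are trained.
        return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)

layer = LoRALinear(nn.Linear(768, 768), rank=8)
out = layer(torch.randn(2, 768))             # same output shape as the wrapped layer
```

Because only the two small factors are trained, the number of trainable parameters is a tiny fraction of the original layer's, which is what makes the adaptation low-cost.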