Practical AI: Machine Learning, Data Science
Technology
Large Language Models (LLMs) continue to amaze us with their capabilities. Using LLMs in production AI applications, however, requires integrating private data. Join us for a conversation with Jerry Liu from LlamaIndex, who shares insights into data ingestion, indexing, and querying for LLM applications. Along the way, we explore different query patterns and venture beyond vector databases.
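The ingestion → index → query workflow discussed in the episode can be sketched in miniature. This is a toy illustration only, not LlamaIndex's actual API: the `embed()` function here is a stand-in (bag-of-words counts over a tiny fixed vocabulary) for a real embedding model, and `retrieve()` is a hypothetical helper that ranks documents by cosine similarity.

```python
import math

# Tiny fixed vocabulary for the stand-in embedding (assumption, for illustration).
VOCAB = ["llm", "index", "vector", "query", "data"]

def embed(text: str) -> list[float]:
    """Hypothetical embedding: word counts over VOCAB, not a real model."""
    words = text.lower().split()
    return [float(words.count(w)) for w in VOCAB]

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors (0.0 if either is all zeros)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

# "Ingestion": embed each private document once and store (doc, vector) pairs.
docs = [
    "llm applications need private data",
    "a vector index stores embeddings for fast query",
    "query the index to retrieve relevant context",
]
index = [(doc, embed(doc)) for doc in docs]

# "Query": embed the question, return the most similar document.
def retrieve(question: str) -> str:
    q = embed(question)
    return max(index, key=lambda pair: cosine(q, pair[1]))[0]

print(retrieve("how does a vector index handle a query"))
```

In a real application the retrieved text would then be passed to the LLM as context, which is the pattern frameworks like LlamaIndex manage at scale.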
Changelog++ members save 1 minute on this episode because they made the ads disappear. Join today!
Sponsors:
Featuring:
Show Notes:
Something missing or broken? PRs welcome!
Timestamps:
(00:00) - Welcome to Practical AI
(00:43) - Jerry Liu
(04:28) - LlamaIndex
(08:12) - What do I get?
(11:23) - More power less work
(13:26) - Fitting the pieces together
(16:49) - 3 Levels of integrating
(19:13) - How to think about indexing
(21:44) - Defining embedding
(23:09) - Index vs Vector storage
(25:07) - Alternatives
(30:41) - Query scheme workflow
(35:22) - Evaluating responses
(40:03) - Awesome new stuff
(43:23) - Links and show notes
(44:05) - Outro