Practical AI: Machine Learning, Data Science
Large Language Models (LLMs) continue to impress with their capabilities, but using them in production AI applications requires integrating private data. In this episode, Jerry Liu of LlamaIndex shares insights into data ingestion, indexing, and querying for LLM applications. Along the way, we explore different query patterns and venture beyond vector databases.
Timestamps:
(00:00) - Welcome to Practical AI
(00:43) - Jerry Liu
(04:28) - LlamaIndex
(08:12) - What do I get?
(11:23) - More power less work
(13:26) - Fitting the pieces together
(16:49) - 3 Levels of integrating
(19:13) - How to think about indexing
(21:44) - Defining embedding
(23:09) - Index vs Vector storage
(25:07) - Alternatives
(30:41) - Query scheme workflow
(35:22) - Evaluating responses
(40:03) - Awesome new stuff
(43:23) - Links and show notes
(44:05) - Outro