Liquid Time-constant Networks
RoFormer: Enhanced Transformer with Rotary Position Embedding
LTC-SE: Expanding the Potential of Liquid Time-Constant Neural Networks for Scalable AI and Embedded Systems
Flacuna: Unleashing the Problem Solving Power of Vicuna using FLAN Fine-Tuning
LongNet: Scaling Transformers to 1,000,000,000 Tokens
Focused Transformer: Contrastive Training for Context Scaling
A Survey on Evaluation of Large Language Models
Voice Conversion With Just Nearest Neighbors
Large Language Model as Attributed Training Data Generator: A Tale of Diversity and Bias
Enhancing Chat Language Models by Scaling High-quality Instructional Conversations
Towards Language Models That Can See: Computer Vision Through the LENS of Natural Language
Augmenting Language Models with Long-Term Memory
LeanDojo: Theorem Proving with Retrieval-Augmented Language Models
ChatLaw: Open-Source Legal Large Language Model with Integrated External Knowledge Bases
MotionGPT: Human Motion as a Foreign Language
FinGPT: Open-Source Financial Large Language Models
A Survey on Multimodal Large Language Models
Fast Segment Anything
A Simple and Effective Pruning Approach for Large Language Models
On the Benefits of 3D Pose and Tracking for Human Action Recognition