BioGPT: Generative Pre-trained Transformer for Biomedical Text Generation and Mining
Papers Read on AI

2023-02-20
In this paper, we propose BioGPT, a domain-specific generative Transformer language model pre-trained on large-scale biomedical literature. We evaluate BioGPT on six biomedical natural language processing tasks and demonstrate that our model outperforms previous models on most of them. A case study on text generation further demonstrates BioGPT's advantage in generating fluent descriptions for biomedical terms from the biomedical literature.

2022: Renqian Luo, Liai Sun, Yingce Xia, Tao Qin, Sheng Zhang, ...