LinkBERT: Pretraining Language Models with Document Links
Papers Read on AI

2022-04-12
Language model (LM) pretraining can learn various knowledge from text corpora, helping downstream tasks. However, existing methods such as BERT model a single document, and do not capture dependencies or knowledge that span across documents. In this work, we propose LinkBERT, an LM pretraining method that leverages links between documents, e.g., hyperlinks.

2022: Michihiro Yasunaga, J. Leskovec, Percy Liang
Ranked #1 on Text Classification on BLURB
https://arxiv.org/pdf/2203.15827v1.pdf
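As a rough illustration of the idea (my own sketch, not code from the paper; the corpus layout, function name, and label strings here are assumptions), the snippet below builds the kind of segment pairs LinkBERT pretrains on: the second segment is either contiguous with the anchor, sampled at random, or taken from a hyperlinked document, and the model learns to distinguish the three cases via a document relation prediction objective alongside masked language modeling.

```python
import random

def build_instance(corpus, links, doc_id, rng=random):
    """Sketch of LinkBERT-style input construction (illustrative, not the authors' code).

    corpus: dict mapping doc_id -> list of text segments
    links:  dict mapping doc_id -> list of hyperlinked doc_ids
    """
    segments = corpus[doc_id]
    i = rng.randrange(len(segments) - 1) if len(segments) > 1 else 0
    anchor = segments[i]

    relation = rng.choice(["contiguous", "random", "linked"])
    if relation == "contiguous" and i + 1 < len(segments):
        # Next segment from the same document.
        second = segments[i + 1]
    elif relation == "linked" and links.get(doc_id):
        # Segment from a document this one hyperlinks to.
        linked_doc = rng.choice(links[doc_id])
        second = rng.choice(corpus[linked_doc])
    else:
        # Fall back to a segment from an unrelated document.
        relation = "random"
        other = rng.choice([d for d in corpus if d != doc_id])
        second = rng.choice(corpus[other])

    # The pair is fed to the LM as [CLS] anchor [SEP] second [SEP];
    # pretraining combines masked language modeling with a 3-way
    # document relation prediction over `relation`.
    return {"text_a": anchor, "text_b": second, "relation": relation}

# Example with a toy two-document corpus:
corpus = {"d1": ["seg a", "seg b"], "d2": ["seg c"]}
links = {"d1": ["d2"]}
print(build_instance(corpus, links, "d1"))
```

The "linked" case is what distinguishes LinkBERT from single-document pretraining: a hyperlinked document acts as a natural continuation signal, exposing the model to knowledge that spans documents.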