We show that while state-of-the-art transformer models perform very well for small databases, they exhibit limitations in processing noisy data, numerical operations, and queries that aggregate facts. We propose a modular architecture that answers such database-style queries over multiple spans of text and aggregates the results at scale. Our architecture scales to databases containing thousands of facts, whereas contemporary models are limited by how many facts can be encoded.
2021: James Thorne, Majid Yazdani, Marzieh Saeidi, Fabrizio Silvestri, Sebastian Riedel, Alon Halevy
https://arxiv.org/pdf/2106.01074v1.pdf