Patrick Lewis, Ethan Perez, Aleksandra Piktus, Fabio Petroni, Vladimir Karpukhin, Naman Goyal, Heinrich Küttler, Mike Lewis, Wen-tau Yih, Tim Rocktäschel, Sebastian Riedel, & Douwe Kiela (2020)
Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks.
arXiv.
DOI: https://doi.org/10.48550/arXiv.2005.11401
Abstract. Introduces Retrieval-Augmented Generation (RAG), which augments a generative language model with a non-parametric memory retrieved from a document store. RAG grounds model outputs in external evidence, reducing hallucination and enabling knowledge updates without retraining.
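The retrieve-then-generate loop the annotation describes can be sketched in miniature. This is an illustrative toy, not the paper's method: RAG uses a dense DPR retriever over a Wikipedia index and a BART generator, whereas here a bag-of-words overlap score and a prompt-assembly step stand in for both; `DOCS`, `retrieve`, and `build_prompt` are hypothetical names.

```python
from collections import Counter

# Toy in-memory document store standing in for the non-parametric memory.
# (Assumption: a handful of strings instead of a Wikipedia-scale index.)
DOCS = [
    "The Eiffel Tower is in Paris and was completed in 1889.",
    "Photosynthesis converts light energy into chemical energy.",
    "Paris is the capital of France.",
]

def tokenize(text):
    # Lowercase and strip trailing punctuation; a stand-in for real tokenization.
    return [t.strip(".,?!").lower() for t in text.split()]

def score(query, doc):
    # Term-overlap count in place of a learned dense-retrieval similarity.
    q, d = Counter(tokenize(query)), Counter(tokenize(doc))
    return sum((q & d).values())

def retrieve(query, k=2):
    # Return the top-k documents by overlap score (the "retrieval" step).
    return sorted(DOCS, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query, k=2):
    # Ground the generator by prepending retrieved evidence to the query.
    context = "\n".join(retrieve(query, k))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

prompt = build_prompt("Where is the Eiffel Tower?")
```

Because the evidence lives in `DOCS` rather than in model weights, swapping in new documents updates the system's knowledge without any retraining, which is the property the annotation highlights.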
Tags: rag retrieval language-models