References

Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks

Patrick Lewis, Ethan Perez, Aleksandra Piktus, Fabio Petroni, Vladimir Karpukhin, Naman Goyal, Heinrich Küttler, Mike Lewis, Wen-tau Yih, Tim Rocktäschel, Sebastian Riedel, & Douwe Kiela (2020)

arXiv preprint arXiv:2005.11401.

DOI: https://doi.org/10.48550/arXiv.2005.11401

Abstract. Introduces Retrieval-Augmented Generation (RAG), which augments a generative language model with a non-parametric memory retrieved from a document store. RAG grounds model outputs in external evidence, reducing hallucination and enabling knowledge updates without retraining.
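The retrieve-then-generate idea can be illustrated with a minimal sketch: score documents against the query, take the top match, and prepend it to the generator's input as grounding context. This toy uses bag-of-words cosine similarity in place of the paper's learned dense retriever, and the document store and prompt format are invented for illustration.

```python
from collections import Counter
import math

# Toy document store (illustrative examples, not from the paper).
DOCS = [
    "The Eiffel Tower is located in Paris, France.",
    "RAG combines a retriever with a seq2seq generator.",
    "Python is a programming language.",
]

def bow(text):
    """Bag-of-words vector as a Counter of lowercase tokens."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs=DOCS, k=1):
    """Return the top-k documents most similar to the query."""
    q = bow(query)
    ranked = sorted(docs, key=lambda d: cosine(q, bow(d)), reverse=True)
    return ranked[:k]

def build_prompt(query, k=1):
    """Prepend retrieved evidence to the query as the generator's input."""
    evidence = "\n".join(retrieve(query, k=k))
    return f"Context:\n{evidence}\n\nQuestion: {query}\nAnswer:"
```

Calling `build_prompt("Where is the Eiffel Tower?")` surfaces the Paris document as context, so the (hypothetical) generator can answer from retrieved evidence rather than parametric memory alone; swapping the store's contents updates the model's knowledge with no retraining.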

Tags: rag, retrieval, language-models
