Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks, Patrick Lewis, Ethan Perez, Aleksandra Piktus, Fabio Petroni, Vladimir Karpukhin, Naman Goyal, Heinrich Küttler, Mike Lewis, Wen-tau Yih, Tim Rocktäschel, Sebastian Riedel, Douwe Kiela, 2020, Advances in Neural Information Processing Systems, Vol. 33 (NeurIPS), DOI: 10.48550/arXiv.2005.11401 - Introduces Retrieval-Augmented Generation (RAG), a foundational approach in which LLMs retrieve information from external knowledge bases before generating responses.
Self-RAG: Learning to Retrieve, Generate, and Critique through Self-Reflection, Akari Asai, Zeqiu Wu, Yizhong Wang, Avirup Sil, Hannaneh Hajishirzi, 2023, arXiv preprint, DOI: 10.48550/arXiv.2310.11511 - Introduces Self-RAG, an approach in which an LLM adaptively retrieves passages, generates responses, and critiques both the retrieved passages and its own generations through self-reflection, improving output quality and reducing hallucinations; relevant to iterative query refinement and synthesis.