
A MapReduce Approach to Effectively Utilize Long Context Information in Retrieval Augmented Language Models

Authors:
Zhang, Gongbo
Xu, Zihan
Jin, Qiao
Chen, Fangyi
Fang, Yilu
Liu, Yi
Rousseau, Justin F.
Xu, Ziyang
Lu, Zhiyong
Weng, Chunhua
Peng, Yifan
Publication Year: 2024

Abstract

While holding great promise for improving and facilitating healthcare, large language models (LLMs) struggle to produce up-to-date responses on evolving topics because of outdated knowledge or hallucination. Retrieval-augmented generation (RAG) is a pivotal innovation that improves the accuracy and relevance of LLM responses by integrating LLMs with a search engine and external sources of knowledge. However, the quality of RAG responses can be heavily affected by the rank and density of key information in the retrieval results, as exemplified by the "lost-in-the-middle" problem. In this work, we aim to improve the robustness and reliability of the RAG workflow in the medical domain. Specifically, we propose a map-reduce strategy, BriefContext, to combat the "lost-in-the-middle" issue without modifying the model weights. We demonstrate the advantage of the workflow with various LLM backbones and on multiple QA datasets. This method promises to improve the safety and reliability of LLMs deployed in healthcare domains.
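The abstract does not detail how BriefContext partitions and recombines the retrieved context, but the general map-reduce RAG pattern it describes can be sketched as below. This is a minimal illustration under stated assumptions: the `llm` callable, the `mapreduce_rag_answer` function name, the prompt wording, and the `chunk_size` parameter are hypothetical and are not taken from the paper.

```python
from typing import Callable, List


def mapreduce_rag_answer(
    question: str,
    retrieved_docs: List[str],
    llm: Callable[[str], str],   # hypothetical LLM completion function: prompt -> text
    chunk_size: int = 3,         # assumed number of retrieved documents per map partition
) -> str:
    """Answer a question by mapping over small context partitions, then reducing.

    Map step: each small partition of retrieved documents is processed
    independently, so key evidence is never buried in the middle of one long prompt.
    Reduce step: the partial findings are aggregated into a single final answer.
    """
    # --- Map: query the LLM over small, focused context windows ---
    partial_answers = []
    for i in range(0, len(retrieved_docs), chunk_size):
        chunk = "\n\n".join(retrieved_docs[i:i + chunk_size])
        map_prompt = (
            "Using only the context below, extract any information relevant to the question.\n"
            f"Question: {question}\n\nContext:\n{chunk}\n\nRelevant findings:"
        )
        partial_answers.append(llm(map_prompt))

    # --- Reduce: combine the partial findings into one concise answer ---
    reduce_prompt = (
        "Combine the partial findings below into a single, concise answer.\n"
        f"Question: {question}\n\nPartial findings:\n"
        + "\n---\n".join(partial_answers)
        + "\n\nFinal answer:"
    )
    return llm(reduce_prompt)
```

Because each map call sees only a short context, the placement of the key passage within the full retrieval list matters less, which is the intuition behind mitigating "lost-in-the-middle" without changing model weights.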

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2412.15271
Document Type: Working Paper