
CItruS: Chunked Instruction-aware State Eviction for Long Sequence Modeling

Authors:
Bai, Yu
Zou, Xiyuan
Huang, Heyan
Chen, Sanxing
Rondeau, Marc-Antoine
Gao, Yang
Cheung, Jackie Chi Kit
Publication Year:
2024

Abstract

Long sequence modeling has gained broad interest as large language models (LLMs) continue to advance. Recent research has identified that a large portion of the hidden states within the key-value caches of Transformer models can be discarded (also termed evicted) without affecting perplexity when generating long sequences. However, we show that these methods, despite preserving perplexity, often drop information that is important for solving downstream tasks, a problem we call information neglect. To address this issue, we introduce Chunked Instruction-aware State Eviction (CItruS), a novel modeling technique that integrates the attention preferences useful for a downstream task into the eviction process of hidden states. In addition, we design a method for chunked sequence processing to further improve efficiency. Our training-free method exhibits superior performance on long sequence comprehension and retrieval tasks over several strong baselines under the same memory budget, while preserving language modeling perplexity.

Comment: Work in progress
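
The abstract only sketches the method, so the following is a minimal, hypothetical Python sketch of what chunked, instruction-aware KV-cache eviction might look like. The helper model_step, the tensor shapes, and the scoring rule (mean attention from instruction tokens onto cached states) are illustrative assumptions based on the abstract, not the paper's actual implementation.

import torch

def evict(keys, values, scores, budget):
    # Keep the `budget` cached states with the highest scores,
    # preserving their original order in the cache.
    if keys.size(0) <= budget:
        return keys, values
    keep = torch.topk(scores, budget).indices.sort().values
    return keys[keep], values[keep]

def process_long_sequence(token_ids, instruction_ids, chunk_size,
                          budget, model_step):
    # Process the input chunk by chunk under a fixed cache budget.
    # `model_step(tokens, keys, values)` is a hypothetical callable that
    # returns the new key/value states for `tokens` plus the attention
    # weights the instruction tokens place on every cached state.
    head_dim = 64  # toy dimension for the sketch
    keys = torch.empty(0, head_dim)
    values = torch.empty(0, head_dim)
    for start in range(0, len(token_ids), chunk_size):
        chunk = token_ids[start:start + chunk_size]
        # Append the instruction to the chunk so its queries can score
        # the cached states (the "instruction-aware" part).
        new_k, new_v, instr_attn = model_step(chunk + instruction_ids,
                                              keys, values)
        keys = torch.cat([keys, new_k])
        values = torch.cat([values, new_v])
        # Score each cached state by the average attention it receives
        # from the instruction tokens, then evict the lowest scorers.
        scores = instr_attn.mean(dim=0)  # shape: (cache_len,)
        keys, values = evict(keys, values, scores, budget)
    return keys, values

The contrast with standard eviction methods is in the scoring step: they typically rank cached states by attention from the most recent tokens, whereas the abstract's key idea is to rank them by the attention preferences of the downstream-task instruction. A real implementation would also operate per layer and per attention head, which this sketch collapses into a single cache.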

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2406.12018
Document Type:
Working Paper