
Everything is Editable: Extend Knowledge Editing to Unstructured Data in Large Language Models

Authors: Deng, Jingcheng; Wei, Zihao; Pang, Liang; Ding, Hanxing; Shen, Huawei; Cheng, Xueqi

Publication Year: 2024

Abstract

Recent knowledge editing methods have primarily focused on modifying structured knowledge in large language models. However, this task setting overlooks the fact that a significant portion of real-world knowledge is stored in an unstructured format, characterized by long-form content, noise, and a complex yet comprehensive nature. Techniques such as local layer key-value storage and term-driven optimization, as used in previous methods like MEMIT, are not effective for handling unstructured knowledge. To address these challenges, we propose a novel Unstructured Knowledge Editing method, UnKE, which extends the assumptions of previous methods along both the layer dimension and the token dimension. First, in the layer dimension, we replace local layer key-value storage with non-local block key-value storage, which increases the representational capacity of key-value pairs and incorporates knowledge from the attention layers. Second, in the token dimension, we replace term-driven optimization with cause-driven optimization, which edits the last token directly while preserving the context; this avoids the need to locate terms and prevents the loss of contextual information (a minimal sketch of this stage appears below). Results on a newly proposed unstructured knowledge editing benchmark (UnKEBench) and on traditional structured datasets demonstrate that UnKE achieves remarkable performance, surpassing strong baselines. In addition, UnKE exhibits robust batch-editing and sequential-editing capabilities.
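
The cause-driven optimization step described above can be made concrete. The snippet below is a minimal, illustrative PyTorch sketch, not the authors' implementation: it assumes a HuggingFace GPT-2 stand-in model, and the edit request, target text, and EDIT_LAYER index are all hypothetical placeholders. It optimizes a residual correction at the hidden state of the last prompt token so the model assigns high probability to a long-form target answer, leaving the context tokens untouched; the second stage (writing the correction into a non-local block of layers) is only noted in a closing comment.

```python
# Illustrative sketch of cause-driven optimization, NOT the authors' code.
# Assumes a HuggingFace causal LM; the model name, layer index, and texts
# are toy placeholders chosen for this example.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "gpt2"   # small stand-in; the paper targets larger LLMs
EDIT_LAYER = 5        # hypothetical block whose output receives the edit

tok = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)
model.eval()
for p in model.parameters():           # only the injected delta is trained
    p.requires_grad_(False)

edit_prompt = "Who is the current CEO of ExampleCorp?"     # toy edit request
target_text = " Jane Doe has led ExampleCorp since 2024."  # toy long answer

prompt_ids = tok(edit_prompt, return_tensors="pt").input_ids
target_ids = tok(target_text, return_tensors="pt").input_ids
full_ids = torch.cat([prompt_ids, target_ids], dim=1)

last_pos = prompt_ids.shape[1] - 1     # position of the last prompt token
delta = torch.zeros(model.config.hidden_size, requires_grad=True)
opt = torch.optim.Adam([delta], lr=5e-2)

def inject(module, inputs, output):
    # Forward hook: add the learned correction only at the last prompt
    # token, so the surrounding context representations are preserved.
    hidden = output[0].clone()
    hidden[:, last_pos] = hidden[:, last_pos] + delta
    return (hidden,) + output[1:]

handle = model.transformer.h[EDIT_LAYER].register_forward_hook(inject)

labels = full_ids.clone()
labels[:, : prompt_ids.shape[1]] = -100   # loss only on the target span

for step in range(50):
    opt.zero_grad()
    loss = model(full_ids, labels=labels).loss
    loss.backward()
    opt.step()

handle.remove()
# Stage 2 (non-local block key-value storage) would then update the
# parameters of the first EDIT_LAYER + 1 transformer blocks so they
# reproduce the optimized hidden state without the hook; omitted here.
```

In the method as described in the abstract, the optimized hidden state plays the role of the "value" and a whole block of early layers, attention included, serves as the key-value store, so the edit persists in the weights rather than requiring a runtime hook.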

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2405.15349
Document Type: Working Paper