A Comprehensive Study of Knowledge Editing for Large Language Models

Authors
Ningyu Zhang, Yunzhi Yao, Bozhong Tian, Peng Wang, Shumin Deng, Mengru Wang, Zekun Xi, Shengyu Mao, Jintian Zhang, Yuansheng Ni, Siyuan Cheng, Ziwen Xu, Xin Xu, Jia-Chen Gu, Yong Jiang, Pengjun Xie, Fei Huang, Lei Liang, Zhiqiang Zhang, Xiaowei Zhu, Jun Zhou, Huajun Chen
Source
Journal: arXiv (Cornell University)
Identifier
DOI:10.48550/arxiv.2401.01286
Abstract

Large Language Models (LLMs) have shown extraordinary capabilities in understanding and generating text that closely mirrors human communication. However, a primary limitation lies in the significant computational demands during training, arising from their extensive parameterization. This challenge is further intensified by the dynamic nature of the world, necessitating frequent updates to LLMs to correct outdated information or integrate new knowledge, thereby ensuring their continued relevance. Note that many applications demand continual model adjustments post-training to address deficiencies or undesirable behaviors. There is an increasing interest in efficient, lightweight methods for on-the-fly model modifications. To this end, recent years have seen a burgeoning in the techniques of knowledge editing for LLMs, which aim to efficiently modify LLMs' behaviors within specific domains while preserving overall performance across various inputs. In this paper, we first define the knowledge editing problem and then provide a comprehensive review of cutting-edge approaches. Drawing inspiration from educational and cognitive research theories, we propose a unified categorization criterion that classifies knowledge editing methods into three groups: resorting to external knowledge, merging knowledge into the model, and editing intrinsic knowledge. Furthermore, we introduce a new benchmark, KnowEdit, for a comprehensive empirical evaluation of representative knowledge editing approaches. Additionally, we provide an in-depth analysis of knowledge location, which can give a deeper understanding of the knowledge structures inherent within LLMs. Finally, we discuss several potential applications of knowledge editing, outlining its broad and impactful implications.
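The abstract's taxonomy can be made concrete with a small sketch. The snippet below is illustrative only (it is not code from the paper, and the "model" is a stand-in dictionary, not an LLM): it shows the first category, resorting to external knowledge, where the base model stays frozen and an external edit memory answers edited queries first. It also demonstrates two standard evaluation notions for knowledge editing, reliability (the edit takes effect) and locality (unrelated knowledge is untouched).

```python
def base_model(question: str) -> str:
    """Stand-in for a frozen LLM holding (possibly outdated) knowledge."""
    knowledge = {
        "capital of France": "Paris",
        "speed of light (m/s)": "299792458",
        "latest stable release": "v1.0",  # outdated fact we want to edit
    }
    return knowledge.get(question, "unknown")


class MemoryEditedModel:
    """Wraps a frozen model with an external store of edits.

    Edited queries are answered from the memory; all other queries
    fall through to the unmodified base model, so its parameters and
    unrelated behavior are preserved.
    """

    def __init__(self, model):
        self.model = model
        self.edits: dict[str, str] = {}

    def edit(self, question: str, new_answer: str) -> None:
        # Register an edit without touching the base model.
        self.edits[question] = new_answer

    def __call__(self, question: str) -> str:
        # Consult the edit memory first, then fall back to the model.
        return self.edits.get(question, self.model(question))


edited = MemoryEditedModel(base_model)
edited.edit("latest stable release", "v2.3")

print(edited("latest stable release"))   # reliability: the edit applies
print(edited("capital of France"))       # locality: unrelated fact intact
```

The other two categories in the survey's taxonomy differ in where the change lives: merging knowledge into the model would fold the edit into (extra) parameters, and editing intrinsic knowledge would directly modify the weights storing the fact; this sketch deliberately keeps the change entirely outside the model.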
