Generative large language models are all-purpose text analytics engines: text-to-text learning is all your need

Keywords: Computer science, Generative grammar, Artificial intelligence, Transformer, Natural language processing, Relation extraction, Inference, Machine learning, Generative model, Language model, Normalization (sociology), Information extraction, Physics, Quantum mechanics, Voltage, Sociology, Anthropology
Authors
Cheng Peng, Xi Yang, Aokun Chen, Zehao Yu, Kaleb E Smith, Anthony Costa, Mona G. Flores, Jiang Bian, Yonghui Wu
Source
Journal: Journal of the American Medical Informatics Association [Oxford University Press]
Volume/Issue: 31 (9): 1892-1903 | Cited by: 4
Identifier
DOI: 10.1093/jamia/ocae078
Abstract

Objective: To solve major clinical natural language processing (NLP) tasks using a unified text-to-text learning architecture based on a generative large language model (LLM) via prompt tuning.

Methods: We formulated 7 key clinical NLP tasks as text-to-text learning and solved them using one unified generative clinical LLM, GatorTronGPT, developed using the GPT-3 architecture and trained with up to 20 billion parameters. We adopted soft prompts (ie, trainable vectors) with a frozen LLM: the LLM parameters were not updated (ie, frozen) and only the vectors of the soft prompts, added as a prefix to the input layer, were optimized during prompt tuning. We evaluated the proposed method on the 7 clinical NLP tasks and compared it with previous task-specific solutions based on Transformer models.

Results and Conclusion: The proposed approach achieved state-of-the-art performance on 5 of the 7 major clinical NLP tasks using one unified generative LLM. Our approach outperformed previous task-specific Transformer models by ∼3% for concept extraction and 7% for relation extraction applied to social determinants of health, by 3.4% for clinical concept normalization, by 3.4%-10% for clinical abbreviation disambiguation, and by 5.5%-9% for natural language inference. It also outperformed a previously developed prompt-based machine reading comprehension (MRC) model, GatorTron-MRC, for clinical concept and relation extraction. The proposed approach can deliver the "one model for all" promise from training to deployment using a unified generative LLM.
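The methods described above amount to standard soft-prompt tuning: the generative LLM's weights are frozen and only a small set of continuous prompt vectors, prepended at the input layer, is optimized per task. The sketch below illustrates that setup in PyTorch with Hugging Face Transformers; it is not the authors' GatorTronGPT implementation, and the base model ("gpt2"), prompt length, and learning rate are placeholder assumptions.

    # Minimal sketch of soft-prompt tuning with a frozen causal LLM.
    # Illustrative only: GatorTronGPT is not used here, so a small stand-in
    # model ("gpt2") and arbitrary hyperparameters serve as placeholders.
    import torch
    import torch.nn as nn
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_name = "gpt2"  # placeholder for a GPT-3-style clinical LLM
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name)

    # Freeze all LLM parameters; only the soft prompt below receives gradients.
    for p in model.parameters():
        p.requires_grad = False

    n_prompt_tokens = 20  # number of trainable soft-prompt vectors (assumed)
    embed = model.get_input_embeddings()
    soft_prompt = nn.Parameter(
        torch.randn(n_prompt_tokens, embed.embedding_dim) * 0.02
    )

    def forward_with_prompt(input_ids: torch.Tensor):
        """Prepend the trainable soft-prompt vectors to the token embeddings."""
        tok_emb = embed(input_ids)                                  # (B, T, H)
        prompt = soft_prompt.unsqueeze(0).expand(input_ids.size(0), -1, -1)
        inputs_embeds = torch.cat([prompt, tok_emb], dim=1)         # (B, n+T, H)
        return model(inputs_embeds=inputs_embeds)

    # Only the soft prompt is handed to the optimizer, so the LLM stays frozen.
    optimizer = torch.optim.AdamW([soft_prompt], lr=1e-3)

    # Example forward pass on a toy text-to-text style input.
    batch = tokenizer(["extract clinical concepts: patient denies chest pain"],
                      return_tensors="pt")
    logits = forward_with_prompt(batch["input_ids"]).logits

Because a separate soft prompt can be trained per task while the LLM itself never changes, a single frozen model can serve all 7 tasks at deployment time, which is the "one model for all" property the abstract emphasizes.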