Attention and Memory-Augmented Networks for Dual-View Sequential Learning

Keywords: Duality (grammatical number), Computer science, Asynchronous communication, Deep learning, Machine learning, Recurrent neural network, Sequential learning, Memory model, Optics (focusing), Object (grammar), Artificial intelligence, Artificial neural network, Shared memory, Physics, Art, Literature, Optics, Operating system, Computer network
Authors
Yong He, Cheng Wang, Nan Li, Zhenyu Zeng
Identifier
DOI: 10.1145/3394486.3403055
Abstract

In recent years, sequential learning has attracted great interest due to advances in deep learning, with applications in time-series forecasting, natural language processing, and speech recognition. Recurrent neural networks (RNNs) have achieved superior performance in single-view and synchronous multi-view sequential learning compared to traditional machine learning models. However, these methods remain less explored in asynchronous multi-view sequential learning, where the unaligned nature of the multiple sequences poses a great challenge to learning the inter-view interactions. We develop AMANet (Attention and Memory-Augmented Networks), an architecture that integrates both attention and memory to solve the asynchronous multi-view learning problem in general; in this paper we focus on experiments with dual-view sequences. Self-attention and inter-attention are employed to capture intra-view and inter-view interactions, respectively. A history attention memory is designed to store the historical information of a specific object, serving as local knowledge storage, while a dynamic external memory stores global knowledge for each view. We evaluate our model on three tasks: medication recommendation from a patient's medical records, diagnosis-related group (DRG) classification from a hospital record, and invoice fraud detection based on a company's taxation behaviors. The results demonstrate that our model outperforms all baselines and other state-of-the-art models on all tasks. Moreover, an ablation study indicates that the inter-attention mechanism plays a key role in the model and boosts predictive power by effectively capturing the inter-view interactions between asynchronous views.
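The core idea behind the self-/inter-attention pair described in the abstract can be illustrated with plain scaled dot-product attention: because attention weights are computed over every position of the other sequence, the two views need no step-by-step alignment. The following NumPy sketch is illustrative only, under assumed toy shapes and names; it is not the authors' AMANet implementation (which also includes the history attention memory and dynamic external memory).

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    """Scaled dot-product attention: softmax(QK^T / sqrt(d)) V."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    return softmax(scores, axis=-1) @ v

# Two asynchronous views of different lengths (hypothetical toy data):
# e.g. 5 diagnosis events vs. 3 medication events, embedding size 8.
rng = np.random.default_rng(0)
view_a = rng.standard_normal((5, 8))
view_b = rng.standard_normal((3, 8))

# Self-attention: each view attends to itself (intra-view interaction).
self_a = attention(view_a, view_a, view_a)
self_b = attention(view_b, view_b, view_b)

# Inter-attention: each view attends to the other view (inter-view
# interaction). No alignment between steps is required, since every
# query position weighs all positions of the other sequence.
inter_a = attention(view_a, view_b, view_b)  # shape (5, 8)
inter_b = attention(view_b, view_a, view_a)  # shape (3, 8)
```

Note that the outputs keep the query view's length while mixing in information from the other view, which is what lets the model fuse two unaligned sequences.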