Title |
Effective Approaches to Attention-based Neural Machine Translation
Related fields
Machine translation
Computer science
Artificial intelligence
Sentence
Task (project management)
BLEU
German
Natural language processing
Dropout (neural networks)
Artificial neural network
Translation (biology)
Word (group theory)
Mechanism (biology)
Machine learning
Deep learning
Recurrent neural network
Linguistics
Biochemistry
Management
Chemistry
Economics
Gene
Philosophy
Epistemology
Messenger RNA
|
URL |
Not provided by the requester
|
DOI |
Not yet provided; the processing time for this request will be extended.
|
Other | An attentional mechanism has lately been used to improve neural machine translation (NMT) by selectively focusing on parts of the source sentence during translation. However, there has been little work exploring useful architectures for attention-based NMT. This paper examines two simple and effective classes of attentional mechanism: a global approach which always attends to all source words and a local one that only looks at a subset of source words at a time. We demonstrate the effectiveness of both approaches over the WMT translation tasks between English and German in both directions. With local attention, we achieve a significant gain of 5.0 BLEU points over non-attentional systems which already incorporate known techniques such as dropout. |
Requester | |
Download | This request was completed more than 24 hours ago; the file has been automatically deleted from the server and can no longer be downloaded. |
|
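The global and local attention classes described in the abstract can be illustrated with a minimal NumPy sketch. This is an assumption-laden simplification, not the paper's implementation: it uses only the dot-product score, takes the aligned position `p_t` of local attention as given (the paper also predicts it and applies a Gaussian weighting), and all function and variable names are illustrative.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def global_attention(h_t, H_s):
    """Global attention: score the decoder state h_t (d,) against
    every source hidden state in H_s (S, d) with a dot-product score,
    softmax into alignment weights, and return the context vector."""
    scores = H_s @ h_t        # (S,) one score per source word
    align = softmax(scores)   # weights over ALL source positions
    context = align @ H_s     # (d,) weighted sum of source states
    return context, align

def local_attention(h_t, H_s, p_t, D=2):
    """Local attention: attend only to a window [p_t - D, p_t + D]
    around an aligned source position p_t (assumed given here)."""
    S = H_s.shape[0]
    lo, hi = max(0, p_t - D), min(S, p_t + D + 1)
    window = H_s[lo:hi]       # subset of source states
    align = softmax(window @ h_t)
    context = align @ window
    return context, align
```

The key contrast is the set of source positions scored: global attention normalizes over all `S` states, while local attention restricts scoring to a `2D + 1` window, which keeps the cost of each decoding step independent of source length.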