Authors
Lesly Miculicich,Dhananjay Ram,Nikolaos Pappas,James Henderson
Abstract
Neural Machine Translation (NMT) can be improved by including document-level contextual information. For this purpose, we propose a hierarchical attention model to capture the context in a structured and dynamic manner. The model is integrated in the original NMT architecture as another level of abstraction, conditioning on the NMT model's own previous hidden states. Experiments show that hierarchical attention significantly improves the BLEU score over a strong NMT baseline with the state-of-the-art in context-aware methods, and that both the encoder and decoder benefit from context in complementary ways.
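The hierarchical attention described above works in two stages: word-level attention summarizes each previous sentence into a single vector, then sentence-level attention combines those summaries into one document context vector conditioned on the current decoder state. The following is a minimal NumPy sketch of that two-level idea, not the paper's actual implementation; the function name, the use of a single dot-product query, and the omission of learned projection matrices are all simplifying assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def hierarchical_attention(query, context_sentences):
    """Two-level (hierarchical) attention sketch.

    query:             (d,) current hidden state acting as the attention query
                       (hypothetical simplification: one shared query, no
                       learned projections).
    context_sentences: list of (n_words_i, d) arrays, the hidden states of
                       each previous sentence in the document.
    Returns a (d,) document-level context vector.
    """
    # Level 1: word-level attention inside each previous sentence,
    # producing one summary vector per sentence.
    summaries = []
    for sent in context_sentences:
        word_weights = softmax(sent @ query)   # (n_words_i,)
        summaries.append(word_weights @ sent)  # (d,)
    summaries = np.stack(summaries)            # (n_sents, d)

    # Level 2: sentence-level attention over the summaries,
    # yielding a single document context vector.
    sent_weights = softmax(summaries @ query)  # (n_sents,)
    return sent_weights @ summaries            # (d,)
```

In the paper this context vector would then be merged with the NMT model's own representation (at the encoder and/or decoder); here it is simply returned, since the integration details are beyond a short sketch.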