Computer science
Leverage (statistics)
Entity linking
Coherence (philosophical gambling strategy)
Knowledge base
Context (archaeology)
Artificial intelligence
Information retrieval
Quantum mechanics
Biology
Physics
Paleontology
Authors
Jinliang Li,Liu Haoyu,Yulong Zhang,Li Zhang,Qiang Yang,Jianfeng Qu,Zhixu Li
Identifiers
DOI:10.1007/978-3-030-90888-1_23
Abstract
Entity linking aims at mapping the mentions in a document to their corresponding entities in a given knowledge base. It involves two consecutive steps: a local step, which models the semantic meaning of the context around each mention, and a global step, which optimizes the coherence of the referred entities across the document. Building on existing work on both steps, this paper enhances both local and global entity linking models with several attention mechanisms. In particular, we propose to leverage a self-attention mechanism and an LSTM-based attention mechanism to better capture the inter-dependencies between tokens in the mention context for local entity linking models. We also adopt a hierarchical attention network with a multi-head attention layer to better represent documents with one or multiple topics for global entity linking models, which helps alleviate the side effect of error accumulation. An extensive empirical study on standard benchmarks demonstrates the effectiveness of the proposed models.
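To make the local step concrete, the following is a minimal sketch (not the authors' code) of how self-attention and an LSTM can be combined to encode the mention context and score candidate entities: a BiLSTM encodes the context window, a self-attention layer models inter-dependencies between the context tokens, and each candidate entity embedding is scored against the pooled context representation. All class and variable names, dimensions, the mean pooling, and the dot-product scorer are illustrative assumptions, not the paper's actual architecture.

```python
# Illustrative sketch of a local attention-based entity linking scorer.
# Assumed names/dimensions; not the model proposed in the paper.
import torch
import torch.nn as nn

class LocalAttentionLinker(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128, num_heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # BiLSTM over the tokens surrounding the mention
        self.lstm = nn.LSTM(embed_dim, hidden_dim // 2,
                            batch_first=True, bidirectional=True)
        # Self-attention over LSTM states captures token inter-dependencies
        self.self_attn = nn.MultiheadAttention(hidden_dim, num_heads,
                                               batch_first=True)

    def forward(self, context_ids, candidate_embs):
        # context_ids:    (batch, seq_len) token ids of the mention context
        # candidate_embs: (batch, num_candidates, hidden_dim) entity embeddings
        h, _ = self.lstm(self.embed(context_ids))        # (B, L, H)
        attended, _ = self.self_attn(h, h, h)             # (B, L, H)
        context_repr = attended.mean(dim=1)               # (B, H) pooled context
        # Dot-product compatibility between the context and each candidate
        scores = torch.bmm(candidate_embs,
                           context_repr.unsqueeze(-1)).squeeze(-1)
        return scores                                      # (B, num_candidates)

# Usage with toy shapes
model = LocalAttentionLinker(vocab_size=10000)
ctx = torch.randint(0, 10000, (2, 20))
cands = torch.randn(2, 5, 128)
print(model(ctx, cands).shape)   # torch.Size([2, 5])
```

In a full system, the global step would then rerank these local scores by encouraging coherence among the entities chosen for all mentions in the document, e.g., with a hierarchical, multi-head attention representation of the document as described in the abstract.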