Computer science
Variety (cybernetics)
Data science
Mechanism (biology)
Deep learning
Artificial intelligence
Field (mathematics)
Sign
Machine learning
Cognitive science
Psychology
Mathematics
Arithmetic
Epistemology
Philosophy
Pure mathematics
Authors
Gianni Brauwers,Flavius Frăsincar
Identifier
DOI: 10.1109/tkde.2021.3126456
Abstract
Attention is an important mechanism that can be employed for a variety of deep learning models across many different domains and tasks. This survey provides an overview of the most important attention mechanisms proposed in the literature. The various attention mechanisms are explained by means of a framework consisting of a general attention model, uniform notation, and a comprehensive taxonomy of attention mechanisms. Furthermore, the various measures for evaluating attention models are reviewed, and methods to characterize the structure of attention models based on the proposed framework are discussed. Last, future work in the field of attention models is considered.
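The abstract refers to a general attention model built around queries, keys, and values combined through a score function and an alignment (normalization) step. As a rough illustration only, and not the survey's formal notation, the sketch below implements one common instance of that pattern, scaled dot-product attention, in NumPy; all function names, shapes, and variable names are assumptions made for this example.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax, used here as the alignment function.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def general_attention(query, keys, values):
    """Illustrative query-key-value attention (one possible instance).

    query:  (d_k,)    vector describing what to attend to
    keys:   (n, d_k)  one key per input feature vector
    values: (n, d_v)  one value per input feature vector
    Returns the context vector (d_v,) and the attention weights (n,).
    """
    # Score function: scaled dot product between the query and each key.
    scores = keys @ query / np.sqrt(query.shape[-1])
    # Alignment: normalize the scores into attention weights that sum to 1.
    weights = softmax(scores)
    # Context vector: attention-weighted average of the values.
    context = weights @ values
    return context, weights

# Usage: 5 input vectors with 4-dimensional keys and 3-dimensional values.
rng = np.random.default_rng(0)
keys = rng.normal(size=(5, 4))
values = rng.normal(size=(5, 3))
query = rng.normal(size=(4,))
context, weights = general_attention(query, keys, values)
print(weights.round(3), context.round(3))
```

Other attention variants surveyed in the paper differ mainly in the choice of score function, the alignment step, and how queries, keys, and values are constructed; the query-key-value skeleton above stays the same.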