Computer science
Mainstream
Artificial intelligence
Relation (database)
Task (project management)
Artificial neural network
Computational science and engineering
Deep learning
Deep neural network
Machine learning
Data science
Natural language processing
Data mining
Philosophy
Theology
Management
Economics
Authors
Hailin Wang, Ke Qin, Rufai Yusuf Zakari, Guoming Lu, Jingwei Yin
Identifier
DOI: 10.1007/s00521-021-06667-3
Abstract
Knowledge is a formal way of understanding the world, providing human-level cognition and intelligence for next-generation artificial intelligence (AI). Relation Extraction (RE), an effective way to acquire this knowledge automatically, plays a vital role in Natural Language Processing (NLP). To date, a large number of studies have addressed RE, among which techniques based on deep neural networks (DNNs) have become the mainstream direction of the research. In particular, supervised and distantly supervised methods based on DNNs are the most popular and reliable solutions for RE, and their evolving model structures and settings have shaped this task. Understanding these model structures and related settings gives researchers deep insight into RE, yet little work has examined them systematically. Hence, this paper starts from these two points and carries out an analysis of the mainstream research routes, supervised and distant supervision. We also classify all related works according to the evolution of model structure to facilitate the analysis. Finally, we discuss some challenges of RE and present our conclusions.
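To make the surveyed task concrete, below is a minimal sketch of supervised sentence-level relation extraction with a neural encoder: a sentence and two marked entity positions go in, a relation label comes out. The model (`RelationClassifier`), the label set `RELATIONS`, and all hyperparameters are illustrative assumptions for this sketch, not the architecture proposed or evaluated in the paper.

```python
# Minimal supervised relation-extraction sketch (assumed toy setup, not the paper's model).
import torch
import torch.nn as nn

RELATIONS = ["no_relation", "founded_by", "born_in"]  # hypothetical label set

class RelationClassifier(nn.Module):
    """Encode a tokenized sentence, then classify the relation between two marked entities."""
    def __init__(self, vocab_size=30522, dim=128, num_relations=len(RELATIONS)):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        # A BiLSTM stands in for whatever DNN encoder (CNN, RNN, Transformer) a real system uses.
        self.encoder = nn.LSTM(dim, dim, batch_first=True, bidirectional=True)
        # Concatenate the hidden states at the head- and tail-entity positions.
        self.classifier = nn.Linear(4 * dim, num_relations)

    def forward(self, token_ids, head_pos, tail_pos):
        hidden, _ = self.encoder(self.embed(token_ids))   # (batch, seq_len, 2*dim)
        batch = torch.arange(token_ids.size(0))
        pair = torch.cat([hidden[batch, head_pos], hidden[batch, tail_pos]], dim=-1)
        return self.classifier(pair)                       # (batch, num_relations)

# Toy usage: one already-tokenized sentence whose entities sit at positions 1 and 4.
model = RelationClassifier()
logits = model(torch.tensor([[101, 2054, 2003, 1996, 3007, 102]]),
               torch.tensor([1]), torch.tensor([4]))
print(RELATIONS[logits.argmax(-1).item()])
```

In the supervised setting the labels come from manual annotation; distant supervision instead aligns entity pairs with a knowledge base to label sentences automatically, keeping the same classification interface but introducing noisy labels.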