Computer science
Base (topology)
Knowledge management
Task (project management)
Joint (building)
Knowledge base
Information retrieval
Human–computer interaction
Artificial intelligence
World Wide Web
Process (computing)
Authors
An-Zi Yen, Hen-Hsen Huang, Hsin-Hsi Chen
Identifier
DOI:10.1016/j.ipm.2019.102148
Abstract
People often log their lives on social media platforms. In this paper, we aim to extract life events by leveraging both the visual and textual information shared on Twitter, and to construct personal knowledge bases of individuals. The issues to be tackled include: (1) not all text descriptions are related to life events; (2) life events in a text description can be expressed explicitly or implicitly; (3) the predicates in implicit life events are often absent; and (4) the mapping from natural language predicates to knowledge base relations may be ambiguous. A multimodal joint learning approach, trained on both the text and the images of social media posts shared on Twitter, is proposed to detect life events in tweets and extract event components, including subjects, predicates, objects, and time expressions. Finally, the extracted information is transformed into knowledge base facts. The evaluation is performed on a collection of lifelogs from 18 Twitter users. Experimental results show that our proposed system is effective in life event extraction, and the constructed personal knowledge bases are expected to be useful for memory recall applications.
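The pipeline described in the abstract ends by transforming extracted event components (subject, predicate, object, time) into knowledge base facts, with the predicate-to-relation mapping possibly ambiguous. A minimal sketch of that final step, assuming a hypothetical relation schema (`watchMovie`, `eatFood`, etc. are illustrative names, not the paper's actual schema):

```python
from typing import NamedTuple, Optional, Tuple

class LifeEvent(NamedTuple):
    """An extracted life event: subject, predicate, object, time expression."""
    subject: str
    predicate: str
    object: str
    time: str

# Hypothetical mapping from natural-language predicates to KB relations.
# A predicate may map to several candidate relations -- the ambiguity the
# paper tackles -- so each entry lists candidates in priority order.
PREDICATE_TO_RELATIONS = {
    "watch": ["watchMovie", "watchTV"],
    "eat": ["eatFood"],
}

def to_kb_fact(event: LifeEvent) -> Optional[Tuple[str, str, str, str]]:
    """Turn an extracted event into a (subject, relation, object, time) fact.

    Returns None when the predicate has no known KB relation. This sketch
    simply takes the first candidate; the paper's system disambiguates
    using context instead.
    """
    candidates = PREDICATE_TO_RELATIONS.get(event.predicate)
    if not candidates:
        return None
    return (event.subject, candidates[0], event.object, event.time)

fact = to_kb_fact(LifeEvent("I", "watch", "La La Land", "2016-12-25"))
# → ("I", "watchMovie", "La La Land", "2016-12-25")
```

Events whose predicates fall outside the mapping yield no fact, mirroring issue (1) above: not every extracted description contributes to the personal knowledge base.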