Keywords
Automatic summarization
Computer science
Sentence (linguistics)
Leverage (statistics)
Artificial intelligence
Multi-document summarization
Redundancy (engineering)
Natural language processing
Transformer
Question answering
Physics
Quantum mechanics
Voltage
Operating system
Authors
Chunlong Han, Jianzhou Feng, Haotian Qi
Identifier
DOI:10.1016/j.eswa.2023.121873
Abstract
Transformer-based summarization models rely solely on the attention mechanism for document encoding, which makes it difficult to accurately capture long-range dependencies in long documents because of attention redundancy. To address this issue, we propose TopicSum, an extractive summarization framework guided by a topic model. It uses a heterogeneous graph neural network to incorporate topic information as document-level features during sentence selection, thereby capturing long-range dependencies among sentences; the topic model's sentence-level features align with the basic unit of the extractive summarization task. In addition, a memory mechanism dynamically stores and updates a memory module, reducing the chance that repetitive information guides sentence selection. We evaluated the model on three long-document datasets, PubMed, arXiv, and GovReport, and achieved significantly higher ROUGE scores than previous work, including both extractive and abstractive models. Our experiments further demonstrate that recent, widely discussed large language models such as ChatGPT are not yet sufficient to handle the long-document summarization task directly. The proposed approach is competitive in both generation quality and deployment requirements.
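The abstract only outlines these mechanisms, so the following sketch (Python/NumPy, written for this page rather than taken from the paper) illustrates the general idea of topic-guided sentence selection with a redundancy memory. The function select_sentences, the cosine scoring, and the additive memory update are illustrative assumptions; the authors' model instead uses a heterogeneous graph neural network with a learned memory module.

import numpy as np

def select_sentences(sent_embs, topic_emb, k=3, redundancy_weight=0.5):
    """Greedy topic-guided extractive selection with a redundancy memory.

    Minimal sketch only: scores each sentence by cosine similarity to a
    document-level topic vector and penalizes similarity to a running
    memory of already-selected sentences. The paper's actual model builds
    a heterogeneous graph neural network and learns its memory updates;
    neither is reproduced here.
    """
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

    memory = np.zeros_like(topic_emb)  # accumulates selected content
    selected = []
    candidates = list(range(len(sent_embs)))
    for _ in range(min(k, len(sent_embs))):
        best, best_score = None, float("-inf")
        for i in candidates:
            relevance = cos(sent_embs[i], topic_emb)            # topic signal
            redundancy = cos(sent_embs[i], memory) if selected else 0.0
            score = relevance - redundancy_weight * redundancy  # penalize repeats
            if score > best_score:
                best, best_score = i, score
        selected.append(best)
        candidates.remove(best)
        memory = memory + sent_embs[best]  # simple additive memory update
    return sorted(selected)

# Toy usage with random embeddings:
rng = np.random.default_rng(0)
sentences = rng.normal(size=(10, 64))  # 10 sentences, 64-dim embeddings
topic = rng.normal(size=64)            # document-level topic vector
print(select_sentences(sentences, topic, k=3))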