Automatic Text Summarization
Authors
Xueyi Hao, Bohan Li, Ernan Li, Cunbin Li, Dalin Qin
Identifier
DOI:10.1109/ic-nidc59918.2023.10390845
Abstract
Due to information overload, ATS (Automatic Text Summarization) is becoming increasingly important. However, the DNNs (Deep Neural Networks) and pre-trained language models widely used in ATS today cannot fully mimic the operating mechanism of neurons in the human brain. In this paper, we introduce the SNN (Spiking Neural Network) into the ATS system because of the SNN's high biological plausibility, low power consumption, and high robustness. A new extractive summarization model, BERT (Bidirectional Encoder Representations from Transformers) + LSNN (Long Short-term memory Spiking Neural Network), is proposed, and a set of experiments on CNN/Daily Mail is conducted. Compared with the existing BERT + LSTM (Long Short-Term Memory) model, BERT + LSNN not only improves performance but also verifies the SNN's advantages of low power consumption and high robustness. This work is promising for expanding related research in text summarization.
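The abstract does not spell out the model's internals, but the general pipeline it names (sentence embeddings from a BERT encoder fed through a spiking recurrent layer, with the most salient sentences extracted as the summary) can be sketched abstractly. The code below is a hypothetical, minimal illustration using standard LIF (leaky integrate-and-fire) neuron dynamics, a common SNN building block; the weight names, the readout, and the top-k selection are all illustrative assumptions, not the paper's actual LSNN architecture, and random vectors stand in for real BERT embeddings.

```python
import numpy as np

def lif_layer(inputs, w_in, w_rec, threshold=1.0, decay=0.9):
    """Leaky integrate-and-fire recurrent layer (illustrative sketch).

    inputs: (T, d_in) sequence of sentence embeddings (stand-ins for BERT output).
    Returns a (T, d_hidden) binary spike train.
    """
    T, _ = inputs.shape
    d_hidden = w_rec.shape[0]
    v = np.zeros(d_hidden)               # membrane potentials
    s_prev = np.zeros(d_hidden)          # spikes from the previous step
    spikes = np.zeros((T, d_hidden))
    for t in range(T):
        # leaky integration of feedforward + recurrent input
        v = decay * v + inputs[t] @ w_in + s_prev @ w_rec
        s = (v >= threshold).astype(float)   # neurons fire on crossing threshold
        v = v * (1.0 - s)                    # hard reset for fired neurons
        spikes[t] = s
        s_prev = s
    return spikes

def extract_top_k(sent_embs, w_in, w_rec, w_out, k=3):
    """Score each sentence via a linear readout of its spike pattern,
    then pick the k highest-scoring sentences as the extractive summary."""
    spikes = lif_layer(sent_embs, w_in, w_rec)
    scores = spikes @ w_out                  # one salience score per sentence
    return sorted(int(i) for i in np.argsort(scores)[-k:])

# Toy usage: 10 "sentences" with 16-dim embeddings, 32 hidden spiking units.
rng = np.random.default_rng(0)
embs = rng.normal(size=(10, 16))
w_in = rng.normal(scale=0.5, size=(16, 32))
w_rec = rng.normal(scale=0.1, size=(32, 32))
w_out = rng.normal(size=32)
print(extract_top_k(embs, w_in, w_rec, w_out, k=3))
```

Because spikes are binary and sparse, the accumulation step replaces most dense multiply-accumulates with additions gated by spikes, which is the usual source of the low-power claim for SNN hardware.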