Steganalysis
Computer science
Position (finance)
Artificial intelligence
Convolution (computer science)
Variable (mathematics)
Steganography
Pattern recognition (psychology)
Algorithm
Data mining
Image (mathematics)
Mathematics
Finance
Artificial neural network
Mathematical analysis
Economics
Authors
Yihao Wang, Ru Zhang, Jianyi Liu
Identifier
DOI:10.1016/j.jisa.2023.103512
Abstract
Text length varies across social networks: IMDB reviews are long, tweets are short, and some corpora mix the two. For these complex situations, most existing text steganalysis methods rely on time-series models, convolution, or fine-tuned BERT. However, these methods do not achieve high detection accuracy and low training time simultaneously. To alleviate this dilemma, this paper proposes a novel text steganalysis method. First, the method maps words into a semantic space that encodes position information. Second, it introduces a variable-parameter attention layer whose size is scaled appropriately to the text length, so that the model avoids redundant parameters while still detecting effectively. Finally, a residual linear layer enhances the steganalysis features. Comparative experiments on long, short, and mixed text datasets show that the proposed method achieves higher detection accuracy with fewer parameters and shorter training time than existing methods; the advantage is most pronounced on long and mixed texts.
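The abstract outlines a three-stage pipeline: position-aware semantic embedding, an attention layer whose parameter count scales with text length, and a residual linear layer. The sketch below illustrates that shape of pipeline in numpy; the sinusoidal encoding, the `scaled_attention_dim` scaling rule, and all dimensions are illustrative assumptions, not the paper's actual design.

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    # Sinusoidal position information to be combined with word embeddings
    # (an assumed choice; the paper only states the space "contains position information").
    pos = np.arange(seq_len)[:, None]
    i = np.arange(d_model)[None, :]
    angle = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    return np.where(i % 2 == 0, np.sin(angle), np.cos(angle))

def scaled_attention_dim(seq_len, base_dim=64, max_dim=256):
    # Hypothetical "variable parameter" rule: widen attention with text length,
    # capped so parameters do not become redundant for very long texts.
    return int(min(max_dim, base_dim * np.ceil(seq_len / 64)))

def attention(x, d_k, seed=0):
    # Plain single-head scaled dot-product attention with random projections.
    d_model = x.shape[-1]
    rng = np.random.default_rng(seed)
    wq = rng.standard_normal((d_model, d_k)) / np.sqrt(d_model)
    wk = rng.standard_normal((d_model, d_k)) / np.sqrt(d_model)
    wv = rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(d_k)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

def residual_linear(x, w, b):
    # Residual linear layer: feature enhancement with a skip connection.
    return x + x @ w + b

seq_len, d_model = 100, 128
x = positional_encoding(seq_len, d_model)      # stand-in for embedded text
d_k = scaled_attention_dim(seq_len)            # attention width follows length
h = attention(x, d_k)
rng = np.random.default_rng(1)
w = rng.standard_normal((d_model, d_model)) * 0.01
out = residual_linear(h, w, np.zeros(d_model))
print(out.shape)
```

In a real model the projection matrices would be learned and a classification head would follow; the sketch only shows how the attention width can track input length while the residual path preserves the incoming features.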