Computer science
Artificial neural network
Entropy (arrow of time)
Artificial intelligence
Pruning
Deep learning
Random forest
Computation
Quantization (signal processing)
Machine learning
Cryptography
Recurrent neural network
Data mining
Pattern recognition (psychology)
Algorithm
Biology
Physics
Quantum mechanics
Agronomy
Authors
Haohao Li,Jianguo Zhang,Zhihu Li,Juan Liu,Yu Wang
Identifiers
DOI: 10.1109/tifs.2023.3240859
Abstract
In the field of information security, the unpredictability of random numbers plays a decisive role in the security of cryptographic systems. However, limited by their capability for pattern recognition and data mining, statistical methods for random number security assessment can only detect obvious statistical flaws in random sequences. In recent years, machine learning-based techniques, such as deep neural networks and prediction-based methods, have exhibited superior performance when applied to random number security. At the same time, the proposed deep learning models suffer from large numbers of parameters, high storage requirements, and complex computation. In this paper, to address the central challenge of random number security analysis, building high-performance predictive models, we propose an effective analysis method based on a pruned and quantized deep neural network. First, we train a temporal pattern attention-based long short-term memory (TPA-LSTM) model with a complex structure and good prediction performance. Second, through pruning and quantization, we reduce the complexity and storage requirements of the TPA-LSTM model. Finally, we retrain the network to find the best model and evaluate the effectiveness of this method on various simulated data sets with known min-entropy values. Compared with related work, the TPA-LSTM model provides more accurate estimates: the relative error is less than 0.43%. In addition, the model's weight parameters are reduced by more than 98% and quantized to 2 bits (over 175x compression) without accuracy loss.
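The compression pipeline described in the abstract can be illustrated with a minimal sketch (our own simplification with NumPy, not the authors' code): magnitude pruning zeroes the smallest weights of a layer, uniform quantization maps the remaining values onto 2^b levels, and, for the evaluation side, the standard prediction-based min-entropy estimate is -log2 of the predictor's best-guess probability. The function names and the uniform quantizer are illustrative assumptions.

```python
import numpy as np

def prune_weights(W, sparsity):
    """Magnitude pruning: zero out the given fraction of
    smallest-magnitude entries of weight matrix W."""
    k = int(W.size * sparsity)
    pruned = W.copy()
    if k == 0:
        return pruned
    thresh = np.sort(np.abs(W), axis=None)[k - 1]
    pruned[np.abs(pruned) <= thresh] = 0.0
    return pruned

def quantize_weights(W, bits):
    """Uniform quantization of W onto 2**bits discrete levels
    spanning [W.min(), W.max()]."""
    levels = 2 ** bits
    w_min, w_max = W.min(), W.max()
    scale = (w_max - w_min) / (levels - 1)
    if scale == 0:  # degenerate case: all weights identical
        return W.copy()
    q = np.round((W - w_min) / scale)  # integer level indices
    return q * scale + w_min           # de-quantized values

def min_entropy_from_accuracy(p_max):
    """Prediction-based min-entropy estimate (bits per sample)
    from the predictor's maximum success probability."""
    return -np.log2(p_max)
```

With 98% sparsity and 2-bit quantization, a layer's weights collapse to at most four distinct values, which is what enables the large storage compression the paper reports; a predictor that guesses the next bit with probability 0.5 yields the full 1 bit of min-entropy per sample.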