Keywords
Computer science
Pruning
Pipeline (software)
Artificial intelligence
Electroencephalography (EEG)
Reduction (mathematics)
Machine learning
Task (project management)
Convolution (computer science)
Deep learning
Data compression
Pattern recognition (psychology)
Speech recognition
Artificial neural network
Authors
Wufeng Rao, Sheng-hua Zhong
Identifier
DOI: 10.1109/ijcnn54540.2023.10192035
Abstract
With the development of deep learning on EEG-related tasks, the complexity of learning models has gradually increased. Unfortunately, the limited amount of available EEG data constrains the performance of complex models, so model compression becomes an option worth serious consideration. So far, although some EEG models have adopted lightweight components such as separable convolutions, no existing work has directly attempted to compress an EEG model. In this paper, we investigate state-of-the-art network pruning methods on commonly used EEG models for the emotion recognition task. We make several surprising observations that contradict common beliefs: training a pruned model from scratch outperforms fine-tuning the pruned model with inherited weights, which suggests that the pruned structure itself matters more than the inherited weights. We can therefore skip the entire pruning pipeline and train the network from scratch using the predefined pruned architecture. We substantially reduce the computational overhead of the model while maintaining accuracy; in the best case, we achieve a 62.3% reduction in model size and a 64.3% reduction in computing operations without accuracy loss.
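The abstract contrasts the usual pruning pipeline (train, prune, fine-tune with inherited weights) with simply training the pruned architecture from scratch. The sketch below illustrates that idea with standard L1-norm channel pruning, a common pruning criterion; it is not the authors' code, and the model (`SmallEEGNet`), the `keep_ratio`, and all other names are illustrative assumptions, not anything specified in the paper.

```python
# Minimal sketch (assumed, not the authors' implementation) of the
# comparison the abstract describes: rank conv filters by L1 norm to pick
# a pruned width, then train the slimmer architecture from scratch rather
# than fine-tuning it with inherited weights.
import torch
import torch.nn as nn

class SmallEEGNet(nn.Module):
    """Toy EEG classifier: one temporal conv block + linear classifier.
    Stands in for the 'commonly used EEG models' in the paper."""
    def __init__(self, n_channels=32, n_filters=16, n_classes=3):
        super().__init__()
        self.conv = nn.Conv2d(1, n_filters, kernel_size=(1, 64), padding=(0, 32))
        self.bn = nn.BatchNorm2d(n_filters)
        self.pool = nn.AdaptiveAvgPool2d((n_channels, 1))
        self.fc = nn.Linear(n_filters * n_channels, n_classes)

    def forward(self, x):                      # x: (batch, 1, channels, samples)
        h = self.pool(torch.relu(self.bn(self.conv(x))))
        return self.fc(h.flatten(1))

def pruned_width(model, keep_ratio=0.375):
    """Rank the conv layer's output filters by L1 norm and return how many
    to keep. Classic channel pruning would also copy the surviving filters'
    weights into the slim model; here only the *architecture* is reused,
    since the abstract reports that training the pruned structure from
    scratch outperforms fine-tuning with inherited weights."""
    w = model.conv.weight.detach()             # (n_filters, 1, kh, kw)
    l1 = w.abs().sum(dim=(1, 2, 3))            # one importance score per filter
    n_keep = max(1, int(keep_ratio * len(l1)))
    keep = torch.topk(l1, n_keep).indices      # strongest filters (unused weights)
    return len(keep)

big = SmallEEGNet(n_filters=16)
# ... train `big` on the EEG emotion dataset here ...
slim = SmallEEGNet(n_filters=pruned_width(big))  # fresh random init: "from scratch"
n_params = lambda m: sum(p.numel() for p in m.parameters())
print(f"params: {n_params(big)} -> {n_params(slim)}")
```

Because the pruned width is fixed up front, the slim model can be built and trained directly, with no train-prune-fine-tune loop; the reported size and FLOPs savings come from the narrower architecture itself.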