Research on Emotion Recognition Based on Parameter Transfer and Res2Net
Computer science
Transmission (computing)
Artificial intelligence
Speech recognition
Parallel computing
Authors
Yan Wu,Wei Liu,Qi Li
Identifier
DOI:10.1109/icftic59930.2023.10455873
Abstract
In the field of emotion recognition, physiological signals have become an active research focus because they objectively reflect real emotions. However, a single signal rarely describes emotions completely and accurately. Multi-modal fusion models exploit the consistency and complementarity of different physiological signals to build a unified classification model and improve recognition performance, but current multi-modal physiological signal emotion recognition still suffers from insufficient information exchange between modalities. To address these problems, this paper proposes a multi-modal physiological signal emotion classification model. Features are extracted from four signal modalities (EEG, EOG, EMG, and galvanic skin response) by parameter transfer, which saves network training time and improves learning performance. Res2Net is then used to mine emotional feature information more fully and fuse it effectively for multi-modal emotion recognition. The experimental results show that the proposed model achieves 95.27% and 94.71% accuracy on the binary arousal and valence classification tasks of the DEAP dataset, respectively.
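The Res2Net backbone mentioned above gains its multi-scale capacity by splitting each feature map into channel subsets linked by hierarchical residual connections: each subset is added to the previous subset's output before its own convolution, so later subsets see progressively larger receptive fields. The following is a minimal pure-Python sketch of that split-and-fuse pattern only; a fixed smoothing filter stands in for the learned 3x3 convolutions, and all names and shapes are illustrative assumptions, not the paper's implementation.

```python
def toy_conv(x):
    """Stand-in for a learned 3x3 convolution: a fixed [0.25, 0.5, 0.25]
    smoothing filter applied along the time axis of each channel."""
    out = []
    for row in x:
        n = len(row)
        smoothed = []
        for t in range(n):
            left = row[t - 1] if t > 0 else 0.0
            right = row[t + 1] if t < n - 1 else 0.0
            smoothed.append(0.25 * left + 0.5 * row[t] + 0.25 * right)
        out.append(smoothed)
    return out

def res2net_split(x, scales=4):
    """Res2Net-style hierarchical split over a (channels x time) list of
    lists. The channel count must be divisible by `scales`."""
    step = len(x) // scales
    groups = [x[i * step:(i + 1) * step] for i in range(scales)]
    out = list(groups[0])  # first subset passes through unchanged
    prev = None
    for g in groups[1:]:
        if prev is None:
            inp = g
        else:
            # hierarchical residual link: add the previous subset's output
            inp = [[a + b for a, b in zip(r1, r2)]
                   for r1, r2 in zip(g, prev)]
        prev = toy_conv(inp)
        out.extend(prev)
    return out  # same (channels x time) shape as the input

# Example: 8 feature channels of length 8, split into 4 scales.
feats = [[float((c * 7 + t) % 5) for t in range(8)] for c in range(8)]
fused = res2net_split(feats, scales=4)
```

In a real block the concatenated subsets would then pass through a 1x1 convolution; here the sketch stops at the concatenation, which is the part that distinguishes Res2Net from a plain residual block.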