Virtual reality
Computer science
Electroencephalography (EEG)
Valence (chemistry)
Artificial intelligence
Interactivity
Emotion recognition
Pattern recognition (psychology)
Multimedia
Psychology
Quantum mechanics
Psychiatry
Physics
Authors
Guanxiong Pei,Cunhang Fan,Taihao Li,Jia Jin,Rui Wang
Identifier
DOI:10.1145/3585542.3585544
Abstract
Virtual reality technology provides a strong sense of immersion and interactivity. It is widely used in the fields of anxiety relief, fear therapy, and depression regulation. However, objectively evaluating the emotional intervention effect of virtual reality technology remains a difficult problem. The main purpose of this paper is to explore the use of EEG signals to identify individual emotional states in virtual reality scenarios and to improve the computational efficiency and recognition accuracy of emotional valence decoding. To induce the target emotional states in participants, we established a relatively standard emotion-inducing virtual reality video library. EEG data were collected synchronously as the participants watched the virtual reality videos. The results show that the emotion recognition performance of combined features (energy spectrum, differential entropy, differential asymmetry, and rational asymmetry) is better than that of any single feature. The radial basis function neural network (RBFNN) performed better than the deep belief network (DBN), achieving the highest average classification accuracy of 91.1%. By combining F-test feature selection with the RBFNN, classification performance is largely maintained while computational efficiency improves. Furthermore, features extracted from the theta band outperform those extracted from other bands in emotional valence decoding. These results may contribute to the application of EEG-based affective computing technology in the field of psychological rehabilitation and assessment.
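The sketch below is not the authors' code; it is a minimal Python illustration of the kind of pipeline the abstract describes: band-wise differential entropy features from EEG epochs, F-test (ANOVA) feature selection, and a toy radial basis function network built from k-means centers with a linear readout. The sampling rate, band definitions, data shapes, and the `SimpleRBFNN` class are assumptions for illustration only; the energy spectrum, DASM, and RASM features are omitted for brevity.

```python
# Minimal sketch (assumed parameters, not the paper's implementation):
# differential entropy features per band -> F-test selection -> RBF network.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.cluster import KMeans
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression

FS = 250  # assumed EEG sampling rate (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def bandpass(x, lo, hi, fs=FS, order=4):
    # Zero-phase Butterworth band-pass filter along the time axis.
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x, axis=-1)

def differential_entropy(x):
    # For an (approximately) Gaussian signal, DE = 0.5 * ln(2*pi*e*sigma^2).
    return 0.5 * np.log(2 * np.pi * np.e * np.var(x, axis=-1) + 1e-12)

def de_features(epochs):
    # epochs: (n_trials, n_channels, n_samples) -> (n_trials, n_channels * n_bands)
    feats = [differential_entropy(bandpass(epochs, lo, hi)) for lo, hi in BANDS.values()]
    return np.concatenate(feats, axis=1)

class SimpleRBFNN:
    """Toy RBF network: k-means centers -> Gaussian activations -> linear readout."""
    def __init__(self, n_centers=10, gamma=0.05):
        self.n_centers, self.gamma = n_centers, gamma

    def _phi(self, X):
        # Gaussian RBF activations with respect to the learned centers.
        d2 = ((X[:, None, :] - self.centers_[None, :, :]) ** 2).sum(-1)
        return np.exp(-self.gamma * d2)

    def fit(self, X, y):
        self.centers_ = KMeans(self.n_centers, n_init=10).fit(X).cluster_centers_
        self.readout_ = LogisticRegression(max_iter=1000).fit(self._phi(X), y)
        return self

    def predict(self, X):
        return self.readout_.predict(self._phi(X))

# Usage with synthetic stand-ins for the VR-video EEG epochs and valence labels.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((60, 32, 4 * FS))   # 60 trials, 32 channels, 4 s
labels = rng.integers(0, 2, 60)                  # binary valence labels (hypothetical)

X = de_features(epochs)
selector = SelectKBest(f_classif, k=32).fit(X, labels)   # F-test feature selection
model = SimpleRBFNN().fit(selector.transform(X), labels)
print(model.predict(selector.transform(X))[:10])
```

In this sketch, pruning the 128 band-channel features down to the 32 highest F-scores before fitting the RBF network mirrors the abstract's point that F-test selection reduces computation while keeping the classifier's input informative.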