Emotion Dictionary Learning With Modality Attentions for Mixed Emotion Exploration

Keywords: emotion classification, modality, multi-modal, emotion recognition, affective computing, speech recognition, artificial intelligence, cognitive psychology
Authors
Fang Liu, Pei Yang, Yezhi Shu, Fei Yan, Guanhua Zhang, Yong-Jin Liu
Source
Journal: IEEE Transactions on Affective Computing [Institute of Electrical and Electronics Engineers]
Volume/Issue: 15 (3): 1289-1302 · Cited by: 1
Identifier
DOI: 10.1109/TAFFC.2023.3334520
Abstract

Multi-modal emotion analysis, an important direction in affective computing, has attracted increasing attention in recent years. Most existing multi-modal emotion recognition studies target a classification task that assigns a single emotion category to a combination of heterogeneous input data, including multimedia and physiological signals. In contrast to single-class emotion recognition, a growing body of recent psychological evidence suggests that different discrete emotions may co-exist at the same time, which has promoted the development of mixed-emotion recognition, i.e., identifying a mixture of basic emotions. While most current studies treat this as a multi-label classification task, in this work we focus on the challenging situation where positive and negative emotions are present simultaneously, and propose a multi-modal mixed-emotion recognition framework named EmotionDict. The key characteristics of EmotionDict are as follows. (1) Motivated by the psychological evidence that such a mixed state can be represented by a combination of basic emotions, we formulate mixed-emotion recognition as a label distribution learning task. An emotion dictionary is designed to disentangle a mixed-emotion representation into a weighted sum of a set of basic emotion elements in a shared latent space, together with their corresponding weights. (2) Whereas many existing emotion distribution studies rely on a single type of multimedia signal (such as text, image, audio, or video), we incorporate physiological and overt behavioral multi-modal signals, including electroencephalogram (EEG), peripheral physiological signals, and facial videos, which directly reflect subjective emotions. These modalities have diverse characteristics, as they relate to the central nervous system, the peripheral nervous system, and the motor cortex, respectively. (3) We further design auxiliary tasks that learn modality attentions for modality integration.
Experiments on two datasets show that our method outperforms existing state-of-the-art approaches on mixed-emotion recognition.
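The two core ideas of the abstract, attention-weighted fusion of heterogeneous modalities and decomposition of the fused representation over a dictionary of basic-emotion elements, can be illustrated with a minimal numerical sketch. This is not the authors' implementation: the dimensions, the random stand-ins for encoder outputs, and the dot-product-plus-softmax decomposition are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical sizes; the paper does not specify these values.
D = 16  # shared latent dimension
K = 6   # number of basic-emotion elements in the dictionary
MODALITIES = ["eeg", "peripheral", "face"]

# Emotion dictionary: K basic-emotion basis vectors in the shared latent space.
dictionary = rng.normal(size=(K, D))

# Per-modality features (random stand-ins for learned encoder outputs).
features = {m: rng.normal(size=D) for m in MODALITIES}

# Modality attention: one score per modality (here random; in the paper these
# are learned via auxiliary tasks), normalized into fusion weights.
attn = softmax(rng.normal(size=len(MODALITIES)))

# Attention-weighted fusion of the modality features.
fused = sum(w * features[m] for w, m in zip(attn, MODALITIES))

# Decompose the fused representation over the dictionary: similarity to each
# basis element, softmax-normalized, gives that basic emotion's weight.
weights = softmax(dictionary @ fused)

# The mixed-emotion representation is the weighted sum of basis elements;
# `weights` itself serves as the predicted emotion label distribution.
mixed = weights @ dictionary

print(np.round(weights, 3))
```

Because `weights` is a softmax output, it is a valid label distribution (non-negative, summing to one), which is what makes the label-distribution-learning formulation possible: co-occurring positive and negative emotions simply receive comparable mass on their respective basis elements.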