Sadness
Motion capture
Computer science
Gesture
Scripting language
Anger
Facial expression
Motion (physics)
Nonverbal communication
Human-computer interaction
Happiness
Natural language processing
Speech recognition
Artificial intelligence
Database
Psychology
Communication
Social psychology
Operating system
Authors
Carlos Busso, Murtaza Bulut, Chi-Chun Lee, Abe Kazemzadeh, Emily Mower, Samuel Kim, Jeannette N. Chang, Sungbok Lee, Shrikanth Narayanan
Source
Journal: Language Resources and Evaluation
Date: 2008-11-04
Volume/Issue: 42 (4): 335-359
Citations: 2892
Identifier
DOI: 10.1007/s10579-008-9076-6
Abstract
Since emotions are expressed through a combination of verbal and non-verbal channels, a joint analysis of speech and gestures is required to understand expressive human communication. To facilitate such investigations, this paper describes a new corpus named the “interactive emotional dyadic motion capture database” (IEMOCAP), collected by the Speech Analysis and Interpretation Laboratory (SAIL) at the University of Southern California (USC). This database was recorded from ten actors in dyadic sessions with markers on the face, head, and hands, which provide detailed information about their facial expressions and hand movements during scripted and spontaneous spoken communication scenarios. The actors performed selected emotional scripts and also improvised hypothetical scenarios designed to elicit specific types of emotions (happiness, anger, sadness, frustration and neutral state). The corpus contains approximately 12 h of data. The detailed motion capture information, the interactive setting to elicit authentic emotions, and the size of the database make this corpus a valuable addition to the existing databases in the community for the study and modeling of multimodal and expressive human communication.
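The abstract describes a corpus in which each recorded utterance carries a categorical emotion label (happiness, anger, sadness, frustration, or neutral), a scripted-versus-improvised condition, and synchronized speech and motion-capture streams. A minimal sketch of how one such record might be modeled in code follows; the `Utterance` class, its field names, and the example paths are illustrative assumptions, not the actual IEMOCAP distribution format.

```python
from dataclasses import dataclass
from enum import Enum


class Emotion(Enum):
    # The five target categories named in the abstract
    HAPPINESS = "happiness"
    ANGER = "anger"
    SADNESS = "sadness"
    FRUSTRATION = "frustration"
    NEUTRAL = "neutral"


@dataclass
class Utterance:
    """One turn from a dyadic session (hypothetical schema for illustration)."""
    session_id: str    # recording session identifier
    speaker_id: str    # one of the ten actors
    scripted: bool     # True for scripted plays, False for improvised scenarios
    transcript: str    # what was said
    emotion: Emotion   # annotated emotion category
    audio_path: str    # path to the speech recording (illustrative)
    mocap_path: str    # path to face/head/hand marker data (illustrative)


# Example record (all values invented for illustration)
u = Utterance(
    session_id="session01",
    speaker_id="actor_01",
    scripted=False,
    transcript="I can't believe this is happening.",
    emotion=Emotion.FRUSTRATION,
    audio_path="audio/session01_utt000.wav",
    mocap_path="mocap/session01_utt000.txt",
)
```

Separating the emotion label into an `Enum` keeps the five categories closed, which matches the fixed label set the collection protocol was designed to elicit.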