NNIME: The NTHU-NTUA Chinese interactive multimodal emotion corpus
Abstract
The increasing availability of large-scale emotion corpora, along with advances in emotion recognition algorithms, has enabled the emergence of next-generation human-machine interfaces. This paper describes a newly collected multimodal corpus, the NTHU-NTUA Chinese Interactive Emotion Corpus (NNIME). The database is the result of a collaboration between engineers and drama experts, and it includes recordings of 44 subjects engaged in spontaneous dyadic spoken interactions. The multimodal data comprise approximately 11 hours of audio, video, and electrocardiogram (ECG) recordings captured continuously and synchronously. The database is further complemented with a rich set of discrete and continuous-in-time emotion annotations produced by a total of 49 annotators. These annotations cover diverse perspectives: peer-report, director-report, self-report, and observer-report. This carefully engineered data collection and annotation process provides a valuable additional resource for quantifying and investigating various aspects of affective phenomena and human communication. To the best of our knowledge, the NNIME is one of the few large-scale Chinese affective dyadic interaction databases to have been systematically collected, organized, and publicly released to the research community.
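The continuous-in-time annotations mentioned above are typically collected as per-annotator traces (e.g., arousal over time) that must be resampled onto a common time axis before being fused with the synchronized audio, video, and ECG streams. The sketch below illustrates one common way to build a consensus trace; it is not the NNIME toolchain, and the synthetic traces, 25 Hz grid, and function name are illustrative assumptions.

    import numpy as np

    # Hedged illustration: fuse continuous-in-time annotations from several
    # annotators into one consensus trace. The traces here are synthetic
    # stand-ins, not actual NNIME annotations.

    def consensus_trace(traces, grid):
        """traces: list of (times, values) pairs, one per annotator.
        grid: common time axis (seconds) to resample every trace onto."""
        resampled = [np.interp(grid, t, v) for t, v in traces]
        return np.mean(resampled, axis=0)

    # Two synthetic annotator traces over a 10-second segment
    t1 = np.linspace(0.0, 10.0, 101); v1 = np.sin(t1 / 2.0)
    t2 = np.linspace(0.0, 10.0, 81);  v2 = np.sin(t2 / 2.0) + 0.1

    grid = np.arange(0.0, 10.0, 0.04)   # 25 Hz grid (an assumption)
    mean_arousal = consensus_trace([(t1, v1), (t2, v2)], grid)
    print(mean_arousal.shape)           # (250,)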
Figures
Snapshots of two different recording sessions taken from the stage-front video camcorder (left and right); the middle panel depicts the camera setup in relation to the stage.
An example of the collected raw ECG signal with the identified R peaks in one of our sessions.
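For readers unfamiliar with the R-peak extraction shown in the figure, the sketch below outlines a minimal peak-picking approach on a synthetic ECG-like signal. The sampling rate, thresholds, and signal are assumptions for illustration; the paper does not specify the detection method, and a real pipeline would typically band-pass filter the raw ECG first.

    import numpy as np
    from scipy.signal import find_peaks

    fs = 250                              # Hz, assumed ECG sampling rate
    t = np.arange(0, 10, 1 / fs)
    # Synthetic ECG-like trace: one sharp spike per second over low noise,
    # standing in for the raw recording (not actual NNIME data)
    ecg = 0.05 * np.random.randn(t.size)
    ecg[::fs] += 1.0

    # Require peaks to stand out (height) and be at least 0.4 s apart,
    # which caps the detected rate at ~150 bpm
    peaks, _ = find_peaks(ecg, height=0.5, distance=int(0.4 * fs))

    rr = np.diff(peaks) / fs              # R-R intervals in seconds
    print("mean heart rate (bpm):", 60.0 / rr.mean())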
Keywords
electrocardiography | emotion recognition | human computer interaction | interactive systems
Publication Date
2017/10/23
Conference
2017 Seventh International Conference on Affective Computing and Intelligent Interaction (ACII)
DOI
10.1109/acii.2017.8273615
Publisher
IEEE