Modeling mutual influence of interlocutor emotion states in dyadic spoken interactions
Abstract
In dyadic human interactions, mutual influence, that is, a person's influence on the interacting partner's behaviors, has been shown to be important and can be incorporated into the modeling framework when characterizing and automatically recognizing the participants' states. We propose a Dynamic Bayesian Network (DBN) to explicitly model the conditional dependency between two interacting partners' emotion states in a dialog, using data from the IEMOCAP corpus of expressive dyadic spoken interactions. We also focus on automatically computing the Valence-Activation emotion attributes to obtain a continuous characterization of the participants' emotion flow. The proposed DBN models the temporal dynamics of the emotion states as well as the mutual influence between speakers in a dialog. With speech-based features, the proposed network improves classification accuracy by 3.67% absolute (7.12% relative) over a Gaussian Mixture Model (GMM) baseline that performs isolated turn-by-turn emotion classification.
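The core modeling idea, a coupled state chain in which each speaker's emotion state at a turn depends on both their own and their partner's previous state, can be sketched as below. This is a minimal illustrative example, not the paper's implementation: the number of states, the transition tensor, and the emission scores are all hypothetical placeholders (in the paper, states come from k-means clusters over Valence-Activation values and emissions from GMMs over speech features).

```python
import numpy as np

N_STATES = 3  # hypothetical number of Valence-Activation clusters
rng = np.random.default_rng(0)

def normalize(a, axis=-1):
    """Normalize along an axis so entries form a probability distribution."""
    return a / a.sum(axis=axis, keepdims=True)

# Transition tensor: P(own state at t = k | own state at t-1 = i,
# partner state at t-1 = j). The dependence on j is the "mutual
# influence" link. Values here are random placeholders.
trans = normalize(rng.random((N_STATES, N_STATES, N_STATES)))

def forward(emissions_a, emissions_b, trans, prior):
    """Forward pass over the joint state (s_a, s_b) of both speakers.

    emissions_a/emissions_b: arrays of shape (T, N_STATES) holding
    per-turn state likelihoods (e.g., from per-state GMMs).
    Returns the posterior over the joint state at the final turn.
    """
    T = len(emissions_a)
    belief = np.outer(prior, prior) * np.outer(emissions_a[0], emissions_b[0])
    belief /= belief.sum()
    for t in range(1, T):
        new = np.zeros((N_STATES, N_STATES))
        for i in range(N_STATES):      # speaker A's previous state
            for j in range(N_STATES):  # speaker B's previous state
                # Given (i, j), each chain transitions independently,
                # but each conditions on BOTH previous states.
                new += belief[i, j] * np.outer(trans[i, j], trans[j, i])
        new *= np.outer(emissions_a[t], emissions_b[t])
        belief = new / new.sum()
    return belief

T = 5  # a short dialog of 5 turns with random placeholder likelihoods
em_a = rng.random((T, N_STATES))
em_b = rng.random((T, N_STATES))
prior = np.full(N_STATES, 1.0 / N_STATES)

posterior = forward(em_a, em_b, trans, prior)
print(posterior.shape)  # (3, 3): joint posterior over final (s_a, s_b)
```

Dropping the partner index (using `trans[i]` alone) reduces this to two independent chains, which is the comparison that isolates the contribution of the mutual-influence link.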
Figures
K-Means Clustering Output of Valence-Activation.
Keywords
emotion recognition | mutual influence | Dynamic Bayesian Network | dyadic interaction
Authors
Chi-Chun Lee
Publication Date
2009/09/06
Conference
Interspeech 2009
DOI
10.21437/Interspeech.2009-480
Publisher
ISCA