Computational Analyses of Thin-Sliced Behavior Segments in Session-Level Affect Perception
Abstract
The ability to accurately judge another person's emotional state from a short duration of observation is a unique human perceptual mechanism, termed thin-sliced judgment. In this work, we propose a computational framework based on mutual information to identify the thin-sliced, emotion-rich behavior segments within each session, and we use these segments to train session-level affect regressors. Our proposed thin-sliced framework obtains regression accuracies, measured in Spearman correlation, of 0.605, 0.633, and 0.672 on the session-level attributes of activation, dominance, and valence, respectively, outperforming a baseline framework trained on data from the entire session. The significant improvement in the regression correlations reinforces the thin-sliced nature of human emotion perception. By properly extracting these emotion-rich behavior segments, we not only improve overall accuracy but also gain additional insights. Specifically, our detailed analyses indicate that the thin-sliced nature of emotion perception is more evident for the attributes of activation and valence, and that within a session the emotion-salient behavior tends to be located toward the ending portion. Lastly, we observe that a certain set of behavior types indeed carries high emotion-related content, and this is especially apparent at the extreme emotion levels.
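As a rough illustration of the segment-selection idea only, and not the authors' exact procedure, the sketch below scores candidate within-session segments by an estimated mutual information between their features and the session-level affect rating, then keeps the top-scoring "thin slices." The feature pooling, the MI estimator (scikit-learn's mutual_info_regression), the MI-weighted per-segment score, and the top_k parameter are all assumptions made for illustration.

```python
# Hypothetical sketch of mutual-information-based thin-slice selection.
# Assumes each session is an array of segment feature vectors and carries
# one session-level affect rating (e.g., activation).
import numpy as np
from sklearn.feature_selection import mutual_info_regression

def select_thin_slices(sessions, ratings, top_k=3):
    """Return, for every session, the indices of its top_k segments whose
    features are most informative about the session-level rating.

    sessions : list of arrays, each of shape (n_segments_i, n_features)
    ratings  : array of shape (n_sessions,), session-level affect labels
    """
    # Pool all segments across sessions, tagging each with its session rating.
    seg_feats = np.vstack(sessions)                       # (total_segments, n_features)
    seg_labels = np.concatenate(
        [np.full(len(s), r) for s, r in zip(sessions, ratings)]
    )

    # Estimate MI between each feature dimension and the affect rating, then
    # use an MI-weighted feature magnitude as a proxy per-segment score
    # (an assumption for this sketch, not the paper's criterion).
    mi = mutual_info_regression(seg_feats, seg_labels, random_state=0)
    scores = np.abs(seg_feats) @ mi                       # (total_segments,)

    # Keep the top_k highest-scoring segments within each session.
    selected, offset = [], 0
    for s in sessions:
        sess_scores = scores[offset:offset + len(s)]
        k = min(top_k, len(s))
        selected.append(np.argsort(sess_scores)[::-1][:k])
        offset += len(s)
    return selected
```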
Figures
The complete workflow of our proposed global emotion regression framework: 1) identification of globally emotion-rich behavior segments, used as the thin-sliced behavior segments for every session, via mutual information; 2) session-level feature encoding of these emotion-salient behavior segments; and 3) support vector regression trained on these features to perform multimodal recognition of activation, valence, and dominance.
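To make the remaining two workflow stages concrete, here is a minimal, hypothetical sketch: each session is encoded by mean-pooling the features of its selected thin-sliced segments, a support vector regressor is fit on the training sessions, and performance is reported as the Spearman correlation between predicted and annotated session-level ratings. The pooling choice and the SVR hyperparameters are placeholders, not the paper's settings.

```python
# Hypothetical sketch of stages 2 and 3 of the workflow:
# session-level encoding of selected segments, SVR, Spearman evaluation.
import numpy as np
from scipy.stats import spearmanr
from sklearn.svm import SVR

def encode_session(segment_feats, selected_idx):
    """Mean-pool the features of the selected emotion-rich segments."""
    return segment_feats[selected_idx].mean(axis=0)

def train_and_evaluate(train_sessions, train_sel, train_y,
                       test_sessions, test_sel, test_y):
    X_train = np.array([encode_session(s, i)
                        for s, i in zip(train_sessions, train_sel)])
    X_test = np.array([encode_session(s, i)
                       for s, i in zip(test_sessions, test_sel)])

    # RBF-kernel SVR with placeholder hyperparameters.
    reg = SVR(kernel="rbf", C=1.0, epsilon=0.1)
    reg.fit(X_train, train_y)

    # Report the session-level Spearman correlation for one attribute
    # (e.g., activation); the same routine would be run per attribute.
    pred = reg.predict(X_test)
    rho, _ = spearmanr(pred, test_y)
    return rho
```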
Keywords
Emotion recognition | multimodal behaviors | thin-sliced affect perception | mutual information
Authors
Publication Date
2018/03/16
Journal
IEEE Transactions on Affective Computing
DOI
10.1109/TAFFC.2018.2816654
Publisher
IEEE