Automatic Classification of Married Couples' Behavior using Audio Features
Abstract
In this work, we analyzed a 96-hour corpus of married couples spontaneously interacting about a problem in their relationship. Each spouse was manually coded with relevant session-level perceptual observations (e.g., level of blame toward other spouse, global positive affect), and our goal was to classify the spouses' behavior using features derived from the audio signal. Based on automatic segmentation, we extracted prosodic/spectral features to capture global acoustic properties for each spouse. We then trained gender-specific classifiers to predict the behavior of each spouse for six codes. We compare performance for the various factors (across codes, gender, classifier type, and feature type) and discuss future work for this novel and challenging corpus.
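The pipeline described above (segment the audio, extract session-level prosodic statistics per spouse, then train gender-specific classifiers on a behavior code) can be illustrated with a minimal sketch. This is not the paper's implementation: the feature set, the synthetic pitch data, and the logistic-regression classifier are all illustrative assumptions.

```python
# Hypothetical sketch of the abstract's pipeline: session-level prosodic
# statistics per speaker, then gender-specific classifiers for one
# behavior code (e.g., "high blame"). Data here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

def prosodic_stats(f0_track):
    """Session-level statistics of a pitch (f0) contour: mean, std, range."""
    voiced = f0_track[f0_track > 0]          # keep voiced frames only
    return np.array([voiced.mean(), voiced.std(), np.ptp(voiced)])

rng = np.random.default_rng(0)

def make_data(base_f0, n=40):
    """Synthetic f0 tracks for n sessions; label 1 = code present."""
    X, y = [], []
    for i in range(n):
        label = i % 2
        # Assumed effect: higher, more variable pitch when the code is present.
        f0 = rng.normal(base_f0 + 30 * label, 20 + 10 * label, size=500)
        f0[rng.random(500) < 0.3] = 0.0      # mark ~30% of frames unvoiced
        X.append(prosodic_stats(f0))
        y.append(label)
    return np.array(X), np.array(y)

# Gender-specific classifiers, one model per spouse role.
models = {}
for gender, base_f0 in [("husband", 120.0), ("wife", 210.0)]:
    X, y = make_data(base_f0)
    models[gender] = LogisticRegression().fit(X, y)
```

In practice each of the six behavior codes would get its own classifier per gender, trained on acoustic features extracted from the automatically segmented speech of that spouse.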
Figures
Block diagram of the automatic alignment procedure
Keywords
behavioral signal processing | human behavior analysis | couples therapy | prosody | emotion recognition
Authors
Chi-Chun Lee
Publication Date
2010/09/26
Conference
Interspeech 2010
DOI
10.21437/Interspeech.2010-574
Publisher
ISCA