Dimensional Emotion Recognition from Spontaneous Head Gestures for Interaction with Sensitive Artificial Listeners



Gunes, Hatice and Pantic, Maja (2010) Dimensional Emotion Recognition from Spontaneous Head Gestures for Interaction with Sensitive Artificial Listeners. In: 10th International Conference on Intelligent Virtual Agents, IVA 2010, 20-22 September 2010, Philadelphia, PA, USA.

PDF (289 KB) — Restricted to UT campus only
Abstract:This paper focuses on dimensional prediction of emotions from spontaneous conversational head gestures. As there has been virtually no prior research on this topic, it maps the amount and direction of head motion, together with occurrences of head nods and shakes, onto the arousal, expectation, intensity, power and valence levels of the observed subject. Preliminary experiments show that it is possible to automatically predict emotions along these five dimensions from conversational head gestures. Dimensional and continuous emotion prediction from spontaneous head gestures has been integrated into the SEMAINE project [1], which aims to achieve sustained, emotionally-colored interaction between a human user and Sensitive Artificial Listeners.
Item Type:Conference or Workshop Item
Copyright:© 2010 Springer
Faculty:Electrical Engineering, Mathematics and Computer Science (EEMCS)
Link to this item:http://purl.utwente.nl/publications/75933
Official URL:http://dx.doi.org/10.1007/978-3-642-15892-6_39