Searching for Prototypical Facial Feedback Signals


Heylen, D.K.J. and Bevacqua, E. and Tellier, M. and Pelachaud, C. (2007) Searching for Prototypical Facial Feedback Signals. In: C. Pelachaud & J-C. Martin & E. André & G. Chollet & D. Pelé (Eds.), Intelligent Virtual Agents. Lecture Notes in Computer Science, 4722/2007 . Springer Verlag, Berlin, pp. 147-153. ISBN 9783540749967

Abstract: Embodied conversational agents should be able to provide feedback on what a human interlocutor is saying. We are compiling a list of facial feedback expressions that signal attention and interest, grounding, and attitude. As expressions need to serve many functions at the same time and most of the component signals are ambiguous, it is important to get a better idea of the many-to-many mappings between displays and functions. We asked people to label several dynamic expressions as a probe into this semantic space. We compare simple signals and combined signals in order to find out whether a combination of signals can have a meaning of its own, i.e. whether the meaning of single signals differs from the meaning attached to their combination. Results show that in some cases a combination of signals alters the perceived meaning of the backchannel.
Item Type: Book Section
Faculty: Electrical Engineering, Mathematics and Computer Science (EEMCS)



Metis ID: 245987