Multimodal integration of haptics, speech, and affect in an educational environment


Nijholt, Anton (2004) Multimodal integration of haptics, speech, and affect in an educational environment. In: International Conference on Computing, Communications and Control Technologies, CCCT, August 14-17, 2004, Austin, Texas.

PDF (347 kB): restricted to UT campus only
Abstract: In this paper we investigate the introduction of haptics in a multimodal tutoring environment. In this environment a haptic device is used to control a virtual injection needle, and speech input and output are provided to interact with a virtual tutor, available as a talking head, and with a virtual patient. We survey the agent-based architecture of the system and discuss the different interaction modalities. One of the agents, the virtual tutor, monitors the actions of the student, provides feedback, and is able to give demonstrations. It incorporates a simple emotion model that the tutor maintains and updates by considering the student's actions and progress. The model allows the tutor to display affective behavior toward the student.
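The abstract describes a tutor agent that keeps a simple emotion model updated from the student's actions and progress, and uses it to drive affective behavior. The paper does not give the model's details here, so the following is only a minimal illustrative sketch under assumed names (EmotionState, VirtualTutor, observe, affective_feedback are hypothetical, not taken from the paper): a valence/arousal state that decays toward neutral and is nudged by each observed student action.

```python
# Hypothetical sketch, not the paper's actual model: a tutor agent that
# maintains a simple emotion state and updates it from student actions.
from dataclasses import dataclass


@dataclass
class EmotionState:
    valence: float = 0.0   # negative..positive feeling, in [-1, 1]
    arousal: float = 0.0   # calm..excited, in [-1, 1]


class VirtualTutor:
    def __init__(self, decay: float = 0.9):
        self.emotion = EmotionState()
        self.decay = decay  # emotions drift back toward neutral over time

    def observe(self, action_correct: bool, progress_delta: float) -> None:
        """Update the emotion state from one observed student action.

        action_correct: whether the last action (e.g. a needle placement
                        with the haptic device) was judged correct.
        progress_delta: change in overall task progress, in [-1, 1].
        """
        # Let the previous emotion decay toward neutral.
        self.emotion.valence *= self.decay
        self.emotion.arousal *= self.decay
        # Correct actions and progress raise valence; mistakes lower it
        # and raise arousal (the tutor becomes more alert and concerned).
        if action_correct:
            self.emotion.valence += 0.2 + 0.3 * max(progress_delta, 0.0)
        else:
            self.emotion.valence -= 0.3
            self.emotion.arousal += 0.2
        # Clamp both dimensions to the valid range.
        self.emotion.valence = max(-1.0, min(1.0, self.emotion.valence))
        self.emotion.arousal = max(-1.0, min(1.0, self.emotion.arousal))

    def affective_feedback(self) -> str:
        """Map the emotion state to a coarse affective display for the talking head."""
        if self.emotion.valence > 0.3:
            return "smile_and_encourage"
        if self.emotion.valence < -0.3:
            return "concerned_and_correct"
        return "neutral_explain"


# Example: a correct step followed by a mistake.
tutor = VirtualTutor()
tutor.observe(action_correct=True, progress_delta=0.1)
tutor.observe(action_correct=False, progress_delta=0.0)
print(tutor.affective_feedback())
```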
Item Type: Conference or Workshop Item
Faculty: Electrical Engineering, Mathematics and Computer Science (EEMCS)
Link to this item: http://purl.utwente.nl/publications/63396

 


Metis ID: 221647