Chapter 6

Modeling Facial Expressions of Emotions

6.1. Expressive conversational agents

As users’ expectations of the friendliness of human–machine interfaces grow, the ubiquitous and affective computing domains are investigating the use of embodied conversational agents (ECAs) that can express emotion. ECAs are virtual agents that can autonomously communicate both verbally and non-verbally with a user. Research in this area is driven by the desire to improve human–machine interaction through the use of social and emotional signals. To express emotions, the agent needs access to a model that defines a method of communication understandable by humans, and it must also be endowed with non-verbal means of communication.

Studies have shown that humans communicate their emotions via a number of modalities and that the face is a primary channel [DAR 72, DUC 99, EKM 72, KAI 01]. In human–machine interaction, communicating with an emotionally expressive ECA creates a more satisfying interaction than communicating with an expressively neutral agent [WAL 94]. Beyond merely retaining the user’s interest, non-verbal expressions can also help clarify verbal content [ELL 97]. This is evident in situation-specific responses, where inappropriate expressions can lead to undesired consequences [BED 05, WAL 94]. Similarly, the same agent, depending on the quality of its facial expressions, may be perceived as more or less believable, which may in turn reduce ...
