Emotion Perception and Recognition
Human beings play a central role in the perception and categorization of emotions conveyed by audio, visual and audiovisual stimuli. When implementing an automatic emotion-recognition system, human judgments serve as a reference: they validate annotation schemes, identify characteristic emotion cues and provide a basis for evaluating automatic classifications. In recent years, parallel to the development of verbal communication technologies such as automatic speech recognition, interest in human perception has witnessed a revival, accompanied by renewed efforts to understand the underlying perceptual mechanisms. Human perception has also served as a benchmark against which the performance of automatic systems can be measured, as demonstrated by comparative studies of speech transcription [LIP 03, SCH 07] and language identification [MAD 02, MUT 94].
In the field of speech technology, affective computing [PIC 97] (or emotion-oriented computing [SCH 06a]) is a relatively recent development. The study of emotions in relation to speech technologies, however, predates it, as illustrated, for example, by early work on the difficulties of automatically transcribing emotional speech [ATH 05]. As the affective sciences have become a discipline in their own right over the past decade, emotional phenomena and their physical manifestations have also ...