
The resulting nine paired stimuli were then presented in a PowerPoint
presentation to 42 subjects, equally balanced for gender and aged
between 19 and 31 years (mean = 23.98, SD = 3.49). While reading the
sentences, the subjects were asked to judge the emotional quality of
the paired melody by assigning it one of the following labels:
happiness, sadness, anger, "I don’t know", or another emotional
label.
Results show that subjects assign congruent labels to congruent
pairs, whereas they are significantly less likely to do so when the
melody (M) does not match the text (T) (Table 1).
As T ...