Research

Audiovisual integration of emotional signals from music improvisation does not depend on temporal correspondence


Reference:

Petrini, K., McAleer, P. and Pollick, F., 2010. Audiovisual integration of emotional signals from music improvisation does not depend on temporal correspondence. Brain Research, 1323, pp. 139–148.


Official URL:

http://dx.doi.org/10.1016/j.brainres.2010.02.012

Abstract

In the present study we applied a paradigm often used in face-voice affect perception to solo music improvisation, to examine how the emotional valence of sound and gesture is integrated when perceiving an emotion. Three brief emotion-expressing excerpts produced by a drummer and three by a saxophonist were selected. From these bimodal congruent displays, the audio-only, visual-only, and audiovisually incongruent conditions (obtained by combining the two signals both within and between instruments) were derived. In Experiment 1, twenty musical novices judged the perceived emotion and rated the strength of each emotion. The results indicate that sound dominated the visual signal in the perception of affective expression, though this was more evident for the saxophone. In Experiment 2, a further sixteen musical novices were asked to attend either to the musicians' movements or to the sound when judging the perceived emotions. The results showed no effect of visual information when judging the sound. In contrast, when judging the emotional content of the visual information, performance worsened in the incongruent condition that combined different emotional auditory and visual information from the same instrument. The effect of emotionally discordant information thus became evident only when the auditory and visual signals belonged to the same categorical event, despite their temporal mismatch. This suggests that the integration of emotional information may be reinforced by its semantic attributes but might be independent of temporal features.

Details

Item Type: Article
Creators: Petrini, K., McAleer, P. and Pollick, F.
DOI: 10.1016/j.brainres.2010.02.012
Uncontrolled Keywords: acoustic stimulation, adult, analysis of variance, attention, auditory perception, emotions, female, humans, male, music, photic stimulation, time factors, visual perception
Departments: Faculty of Humanities & Social Sciences > Psychology
Refereed: Yes
Status: Published
ID Code: 41427
