Perceptual Science Series
Crossmodal Interactions between Corresponding Auditory and Visual Features
Dr. Karla Evans
Monday, December 04, 2006, 01:00pm - 02:00pm
Princeton University, Department of Psychology
Objects and events in the environment typically produce correlated input to several sensory modalities at once. It is therefore important to understand the conditions under which these separate sensory streams are integrated, and the mechanisms that underlie this integration.
In my work I have sought to establish the existence, nature, and mechanism of crossmodal integration between non-speech auditory and visual stimuli on the basis of their content correspondence. Using both behavioral and brain-imaging methods, I explored crossmodal integration based on feature correspondences between the pitch of a sound and the visual features of vertical location, size, spatial frequency, and contrast. In a series of psychophysical experiments I found natural mappings between pitch and the visual features of vertical location, size, and spatial frequency. The facilitatory effects observed during congruent bimodal stimulation point to the automatic nature of these crossmodal interactions, as well as their independence of attention.
The fMRI study showed that both the participating sensory cortices and multimodal sites (the superior temporal sulcus and superior temporal gyrus) play a role in these crossmodal interactions. The perceptual gains are realized through amplification of the neuronal response in the participating sensory cortices, an amplification that is modulated by the congruency between the two features.