Neural mechanisms of audio-visual temporal integration
Multisensory integration is the brain's remarkable ability to combine information from different senses into a unified percept, yielding a more complete and accurate representation of the world. In the complex, ever-changing environments of everyday life, however, the brain must make split-second decisions about which sensory inputs pertain to the same relevant event and which do not. One of the most important factors determining whether two or more sensory inputs are attributed to the same external source is their temporal proximity. My work focuses on understanding the neural mechanisms underpinning the temporal integration of audio-visual events. In the first part of my talk, I will present behavioural and EEG evidence supporting the idea that not one but multiple mechanisms may be implemented in the brain to evaluate audio-visual synchrony based on the relative timing of auditory and visual stimuli. The second part will feature an EEG and brain stimulation study pointing to oscillatory activity in the alpha band (8-12 Hz) as a potential mechanism for generating “temporal units” that promote audio-visual integration versus separation. Finally, I will present more recent evidence pointing to a role of pre-stimulus oscillatory power in determining the strength of audio-visual integration.
Cecere R. et al. (2015) Individual differences in alpha frequency drive crossmodal illusory perception.
Cecere R. et al. (2017) Being First Matters: Topographical Representational Similarity Analysis of ERP Signals Reveals Separate Networks for Audiovisual Temporal Binding Depending on the Leading Sense.