
Multisensory integration outside the language domain

Project coordinator: Salvador Soto-Faraco.

Project members: Gustavo Deco, Joan López-Moliner, Alexis Pérez, Mario Pannunzi, Antonia Najas, Elena Azañón, Karla Camacho, Philip Jaekl. 


We introduced measures of multisensory integration for non-linguistic aspects of perception and decision making. First, we developed several psychophysical tests of multisensory integration in the sensory domain in order to discover the basic principles of integration (Subproject 1). Second, we addressed perceptual decision-making in a multisensory context (Subproject 2). These paradigms should provide the basis for comparison between mono- and bilingual populations in non-linguistic domains.

SUBPROJECT 1:  Audio-visual enhancement in non-speech contexts

(a) A basic question, still to be conclusively resolved in the multisensory literature, is this: Can auditory signals influence low-level sensory processing of visual information? We addressed this issue in three psychophysical studies. First, we tested for super-additivity in visual contrast detection thresholds when visual targets were combined with near-threshold auditory stimuli. Second, we measured RTs to visual stimuli under similar conditions. The analysis, based on Piéron curves (see Fig. 1), revealed no sensory benefits of audiovisual combination in either study. Finally, in a new reaction time study addressing the role of sounds in reducing visual uncertainty, we found that an auditory signal can reduce sensitivity to the different visual channels, making visual perception less feature-specific when there is certainty about one feature.



Figure 1. Detection RTs for 3 different spatial frequencies fitted to Piéron curves. Illustrative examples of the stimuli for each spatial frequency condition and contrast range are shown on top. Insets plot the slope of the curve and the horizontal asymptote (motor time).
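A Piéron curve of the kind fitted in Figure 1 relates detection RT to stimulus intensity by a power law, RT = t0 + beta * I^(-alpha), where t0 is the horizontal asymptote (motor time). As a minimal sketch of the fitting procedure, with made-up contrast and RT values rather than the study's data:

```python
import numpy as np
from scipy.optimize import curve_fit

def pieron(intensity, t0, beta, alpha):
    """Pieron's law: RT decreases as a power function of stimulus
    intensity toward a horizontal asymptote t0 (the motor time)."""
    return t0 + beta * intensity ** (-alpha)

# Synthetic detection RTs (in seconds) at several contrast levels.
# Illustrative values only, not the study's data.
rng = np.random.default_rng(0)
contrast = np.array([0.02, 0.04, 0.08, 0.16, 0.32, 0.64])
rt = pieron(contrast, t0=0.25, beta=0.02, alpha=0.9)
rt += rng.normal(0, 0.005, size=rt.size)  # small measurement noise

# Fit the three Pieron parameters to the (contrast, RT) data.
params, _ = curve_fit(pieron, contrast, rt, p0=[0.2, 0.05, 1.0])
t0_hat, beta_hat, alpha_hat = params
print(f"motor time ~ {t0_hat:.3f} s, slope ~ {beta_hat:.3f}, "
      f"exponent ~ {alpha_hat:.2f}")
```

Comparing the fitted asymptote and slope across conditions (as in the insets of Fig. 1) is what allows sensory and motor contributions to the RT benefit to be separated.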


(b) Jaekl & Soto-Faraco (in press) developed a paradigm to test whether supra-threshold sounds enhance visual contrast thresholds. We used the ‘Steady- vs. Pulsed-pedestal’ paradigm to separate the spatial contrast signatures of the magnocellular (M) and parvocellular (P) pathways. We found audiovisual enhancement of visual contrast thresholds only under the conditions most favorable to the M-pathway. This dissociation raises interesting questions about which components are enhanced during audiovisual speech perception. To pursue these questions, a new study is currently underway that separates the M- and P-systems during audiovisual speech processing using point-light displays.
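Contrast thresholds of the kind compared across pedestal conditions are typically read off a fitted psychometric function. A minimal sketch, assuming a cumulative-Gaussian function in log contrast and hypothetical detection proportions (not the published data):

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def p_detect(log_contrast, mu, sigma):
    """Cumulative-Gaussian psychometric function in log contrast:
    mu is the 50%-detection point, sigma the spread (inverse slope)."""
    return norm.cdf(log_contrast, loc=mu, scale=sigma)

# Hypothetical proportions of 'detected' responses per pedestal contrast.
contrast = np.array([0.005, 0.01, 0.02, 0.04, 0.08])
p_seen = np.array([0.05, 0.20, 0.55, 0.90, 0.99])

# Fit in log-contrast space and invert to get the threshold.
(mu, sigma), _ = curve_fit(p_detect, np.log(contrast), p_seen,
                           p0=[np.log(0.02), 1.0])
threshold = np.exp(mu)  # contrast at the 50% point
print(f"detection threshold ~ {threshold:.3f} contrast")
```

Fitting one such function per condition (steady vs. pulsed pedestal, with vs. without sound) yields the threshold differences on which the M/P dissociation rests.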

SUBPROJECT 2:  Audio-tactile integration as a model for decision making

Soto-Faraco & Deco (2009) proposed a research programme based on the combination of psychophysical, physiological and computational approaches to address perceptual decision-making mechanisms in multisensory contexts. The auditory-tactile domain was chosen because there is a great deal of physiological knowledge about vibratory processing, and because vibration is naturally conveyed by both the skin and the ear from the lowest levels of processing.

(a) We psychophysically tested the discrimination of vibrations in the flutter range (~10-50 Hz), given that vibratory patterns within this range are represented in the firing rates of neurons in the sensory cortices for both somatosensory and auditory stimuli. If early audio-tactile integration takes place, we would expect a shift in the psychophysical curve corresponding to the detection of a vibrotactile stimulus as a function of frequency similarity. We devised a tactile detection threshold procedure (2IFC) for vibrotactile stimuli in the flutter range, presented either in isolation or accompanied by a concurrent 31 Hz flutter sound (Figure 2). Tactile vibration thresholds (in microns) were measured under these conditions for 13, 31, and 49 Hz stimuli. The results reveal both a general, frequency-unspecific effect of sound (common to noise and tones) and a small yet consistent frequency-specific effect.


Figure 2. (a) Schematic illustration of the sequence of a trial with sound (here, the 49 Hz tactile vibration is present in the first interval). (b) Results for one participant. Tactile thresholds improve with frequency (x axis) and are better with accompanying sounds (green and blue) than without (red).
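A 2IFC detection threshold of this kind is often estimated with an adaptive staircase. The sketch below simulates a hypothetical observer (Weibull psychometric function with an assumed 1-micron threshold) tracked by a standard 2-down/1-up staircase, which converges on the ~70.7%-correct amplitude; the procedure and all parameters are illustrative, not those of the actual experiment:

```python
import numpy as np

rng = np.random.default_rng(1)

def observer_correct(amplitude_um, threshold_um=1.0, slope=3.0):
    """Simulated 2IFC observer: P(correct) rises from 0.5 (guessing)
    to 1.0 as a Weibull function of vibration amplitude."""
    p = 1 - 0.5 * np.exp(-(amplitude_um / threshold_um) ** slope)
    return rng.random() < p

# 2-down / 1-up staircase: two correct responses in a row lower the
# amplitude, one error raises it; it homes in on ~70.7% correct.
amp, step = 4.0, 0.25            # start amplitude and step (microns)
consecutive, reversals, last_dir = 0, [], 0
while len(reversals) < 12:
    if observer_correct(amp):
        consecutive += 1
        if consecutive == 2:     # harder
            consecutive = 0
            if last_dir == +1:
                reversals.append(amp)   # direction flipped: a reversal
            amp, last_dir = max(amp - step, 0.05), -1
    else:                        # easier
        consecutive = 0
        if last_dir == -1:
            reversals.append(amp)
        amp, last_dir = amp + step, +1

# Average the reversal amplitudes, discarding the first (transient) ones.
threshold_estimate = np.mean(reversals[2:])
print(f"estimated threshold ~ {threshold_estimate:.2f} microns")
```

Running one such track per condition (tactile alone vs. tactile plus 31 Hz sound, at each test frequency) yields the threshold curves plotted in Figure 2b.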


(b) We are also modeling the behavioral data by means of biophysically realistic neurodynamical models (e.g., Deco and Rolls, 2006). We plan to implement a simulated network for each of the two alternative theoretical possibilities (early vs. late audio-tactile integration), with the constraint that each must reproduce the psychophysical results reported in our experiment. We expect the networks to develop different functions and properties depending on the level at which sensory integration takes place; in particular, we expect differences in output timing that will allow us to generate predictions to test in future experiments involving neurophysiological measures.
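As a toy illustration of how such a network turns graded sensory evidence into a categorical choice and a decision time, the sketch below implements a reduced firing-rate competition model with two mutually inhibiting pools. This is not the spiking network of Deco and Rolls (2006); the architecture and all parameter values are illustrative only:

```python
import numpy as np

def simulate_decision(input_a, input_b, seed=0, dt=1e-3, t_max=2.0):
    """Two mutually inhibiting firing-rate pools; the pool whose rate
    first crosses a threshold 'wins'. Returns (choice, decision time).
    A toy rate model with illustrative parameters."""
    rng = np.random.default_rng(seed)
    tau = 0.02        # membrane/population time constant (s)
    w_self = 0.6      # recurrent self-excitation
    w_inh = 1.0       # cross-inhibition between pools
    thresh = 15.0     # decision threshold (arbitrary rate units)
    r = np.zeros(2)
    inputs = np.array([input_a, input_b])
    for step in range(int(t_max / dt)):
        noise = rng.normal(0, 0.5, size=2)
        # Each pool's drive: its input, plus self-excitation,
        # minus inhibition from the other pool (r[::-1] swaps them).
        drive = inputs + w_self * r - w_inh * r[::-1] + noise
        r += dt / tau * (-r + np.maximum(drive, 0.0))
        if r.max() > thresh:
            return int(np.argmax(r)), (step + 1) * dt
    return -1, t_max  # no decision within t_max

# Stronger evidence for pool 0 -> it should usually win.
choice, rt = simulate_decision(input_a=12.0, input_b=8.0)
print(f"chose pool {choice} after {rt * 1000:.0f} ms")
```

The point of the competition dynamics is that the same network produces both a choice and a latency, so the timing of the winning pool's output is exactly the kind of observable on which early- and late-integration architectures could differ.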



  1. The role of attention in multisensory integration (Talsma et al.)
  2. The temporal and neural dynamics of tactile localization (Azañón, Camacho, et al.)