I am a postdoctoral researcher at the Basic & Applied NeuroDynamics Lab (BAND lab, Maastricht University). After receiving my PhD in Biomedicine from Pompeu Fabra University (Barcelona), I was awarded a Marie Curie Fellowship to join the BAND lab under the supervision of Sonja Kotz.
In normal conversation, listeners perceive speech prosody conveyed simultaneously in the visual (speaker's body) and auditory (verbalization) modalities. The main goal of my current work is to understand how listeners integrate two superimposed rhythmic streams, conveyed in different formats, to facilitate audiovisual (AV) speech segmentation. Using electrophysiological (ERP/EEG) and neuroimaging (fMRI) techniques, I examine in particular how a speaker's body movements, naturally aligned with verbal prosody, affect temporal coding. First, I measure how the low-frequency activity (i.e., delta-theta and beta oscillations) tracking the time-frequency architecture of continuous AV speech is modulated, and how these bands interact, depending on the accompanying visual information. Second, I test whether enhanced delta-theta/beta activity and their cross-frequency coupling reflect the engagement of a sensorimotor network, potentially increased by the processing of visual rhythms superimposed on equivalent rhythms conveyed in verbal prosody. Finally, I test whether the concurrent perception of visual rhythms provided by body information can compensate for well-established timing deficits in the auditory modality (e.g., in Parkinson's disease) by engaging different or additional neural correlates compared to healthy populations.
- Audiovisual speech perception
- Social interactions
- Auditory and visual prosody
- Temporal processing in AV speech
Fromont, L.A., Soto-Faraco, S., & Biau, E. (2017). Searching high and low: Prosodic breaks disambiguate relative clauses. Frontiers in Psychology, 8:96.
Biau, E., Fernandez, L.M., Holle, H., Avila, C., & Soto-Faraco, S. (2016). Hand gestures as visual prosody: BOLD responses to audio-visual alignment are modulated by the communicative nature of the stimuli. NeuroImage, 132, 129-137.
Biau, E., & Soto-Faraco, S. (2015). Synchronization by the hand: The sight of gestures modulates low-frequency activity in brain responses to continuous speech. Frontiers in Human Neuroscience, 9, 527.
Biau, E., Torralba, M., Fuentemilla, L., de Diego Balaguer, R., & Soto-Faraco, S. (2015). Speaker's hand gestures modulate speech perception through phase resetting of ongoing neural oscillations. Cortex, 68, 76-85.
Biau, E., & Soto-Faraco, S. (2013). Beat gestures modulate auditory integration in speech perception. Brain and Language, 124(2), 143-152.
2011 – 2015 PhD in Biomedicine at the Multisensory Research Group, Centre for Brain and Cognition (CBC), Pompeu Fabra University, Barcelona (Spain).
2008 – 2009 M.Sc. in Cognitive Neuroscience and Behavioural, Integrative Biology, Pierre and Marie Curie University, Paris (France).
2016 – 2018 Marie Curie Individual Fellowship (H2020-MSCA-IF-2015), at Maastricht University (Netherlands).
2011 – 2015 Pre-doctoral Fellowship awarded by the Spanish Government (FPI BES-2011-043870), at Pompeu Fabra University, Barcelona (Spain).
2013 – 2014 National Sub-Program for Mobility Grant, awarded by the Spanish Government (EEBB-I-13-06780), at the University of Hull (UK).
2010 – 2011 Research grant for postgraduate students, awarded by the Department of Information and Communication Technologies, Pompeu Fabra University, Barcelona (Spain).