New paper: ERP mismatch response to phonological and temporal regularities in speech


We are excited to share our new publication “ERP mismatch response to phonological and temporal regularities in speech”, fresh off the press at Scientific Reports! In collaboration with the M-BIC Brain and Language Lab, our PhD candidate Alex Emmendorfer uses a passive oddball paradigm to investigate how our brain makes use of regularities in the phonological and temporal structure of speech.

School workshop: Vragen stellen aan het brein

Asking questions to the brain (Vragen stellen aan het brein): a 60-minute interactive workshop about the brain and rhythm. In February and May 2019, Katerina Kandylaki of BAND-Lab brought this workshop to fifth-year pupils at two local schools (Porta Mosana, Maastricht and Sophianum, Gulpen). She explained the basic concepts of imaging and stimulation methods (EEG, s/fMRI, TMS), the function of rhythm in language, and the basic idea of her current project NERHYMUS. This project investigates speech rhythm perception in musicians and non-musicians and is funded by the European Commission.

We had an EEG demonstration and a rhythm activity, both of which were very well received by the teenagers. Even the “too-cool-for-school” ones were asking questions and joining the activities by the end of the workshop. A very rewarding and enjoyable experience for the researchers!

Many thanks to Kobus Lampe, a Master’s student in Neuropsychology, for his support on the workshop. We also want to thank Isabelle Grosch (Marketing and Communications of the Faculty of Psychology and Neuroscience), and Ellen Krijnen and Tanja Peters (Marketing and Communications of Maastricht University) for connecting researchers and schools and promoting interactions.

Musicians improve nonmusicians’ timing abilities!

We are pleased to announce the publication of our article: The roles of musical expertise and sensory feedback in beat keeping and joint action

The full article can be viewed here.


Auditory feedback of actions provides additional information about the timing of one’s own actions and those of others. However, little is known about how musicians and nonmusicians integrate auditory feedback from multiple sources to regulate their own timing or to (intentionally or unintentionally) coordinate with a partner. We examined how musical expertise modulates the role of auditory feedback in a two-person synchronization–continuation tapping task. Pairs of individuals were instructed to tap at a rate indicated by an initial metronome cue in all four auditory feedback conditions: no feedback, self-feedback (cannot hear their partner), other feedback (cannot hear themselves), or full feedback (both self and other). Participants within a pair were either both musically trained (musicians), both untrained (nonmusicians), or one musically trained and one untrained (mixed). Results demonstrated that all three pair types spontaneously synchronized with their partner when receiving other or full feedback. Moreover, all pair types were better at maintaining the metronome rate with self-feedback than with no feedback. Musician pairs better maintained the metronome rate when receiving other feedback than when receiving no feedback; in contrast, nonmusician pairs were worse when receiving other or full feedback compared to no feedback. Both members of mixed pairs maintained the metronome rate better in the other and full feedback conditions than in the no feedback condition, similar to musician pairs. Overall, nonmusicians benefited from musicians’ expertise without negatively influencing musicians’ ability to maintain the tapping rate. One implication is that nonmusicians may improve their beat-keeping abilities by performing tasks with musically skilled individuals.


Society for the Neurobiology of Language, Québec City, August 2018

This summer, our lab member Alex Emmendorfer presented the results of her first PhD experiment at the 2018 meeting of the Society for the Neurobiology of Language in Québec City, Canada.

In this EEG study, Alex used Dutch pseudowords varying in their phonotactic probability and syllable stress pattern to examine the processing of formal and temporal predictions in speech perception in an oddball paradigm. She showed that both formal and temporal predictability modulate the processing of speech stimuli. High predictability in both the formal and temporal domains facilitates processing compared to low predictability, indexed by greater MMN amplitudes for first-syllable stress and shorter latencies for high phonotactic probability.


Additionally, we are excited to announce that Lab Director Sonja Kotz will be joining the society’s board as Chair-Elect for the upcoming year! Congratulations Sonja!

Schultz MIDI Benchmarking Toolbox now available!

Ever wanted to test the timing of your MIDI percussion pads, sound modules, and instrument patches? Well, now you can! The Schultz MIDI Benchmarking Toolbox (SMIDIBT) is available and the scripts are free to download:
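For context, the core idea behind this kind of latency benchmarking is simple: send a trigger to the device at a known time, record its audio output, and measure the delay until the sound appears in the recording. The sketch below is not the toolbox’s actual code; it is a minimal, hypothetical illustration of threshold-based onset detection on an already-captured recording, assuming the signal is a mono NumPy array (the function names `detect_onset_ms` and `estimate_latency_ms` are our own, for illustration only).

```python
import numpy as np

def detect_onset_ms(signal, fs, threshold=0.1):
    """Return the time (ms) of the first sample whose absolute
    amplitude exceeds `threshold`, or None if it never does."""
    idx = np.flatnonzero(np.abs(signal) > threshold)
    if idx.size == 0:
        return None
    return 1000.0 * idx[0] / fs

def estimate_latency_ms(recording, fs, trigger_onset_ms, threshold=0.1):
    """Latency = audio onset time in the recording minus the known
    time at which the MIDI trigger was sent (both in ms)."""
    onset = detect_onset_ms(recording, fs, threshold)
    if onset is None:
        return None
    return onset - trigger_onset_ms

# Synthetic example: 100 ms of silence, then a "device response"
# starting at 50 ms, with the MIDI trigger assumed sent at 20 ms.
fs = 44100
recording = np.zeros(fs // 10)         # 100 ms of silence
recording[int(0.050 * fs):] = 0.5      # device output from 50 ms on
print(estimate_latency_ms(recording, fs, trigger_onset_ms=20.0))  # 30.0
```

In practice the toolbox handles the hardware side (triggering devices and capturing their output); the onset-detection step above is just the final arithmetic on the recorded signal.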

If you have any comments or questions, feel free to contact Ben Schultz:



Schultz, B. G. (2018). The Schultz MIDI Benchmarking Toolbox for MIDI interfaces, percussion pads, and sound cards. Behavior Research Methods.

Beat Gestures and Syntactic Parsing: An ERP Study

The second part of a project investigating the relationship between prosody and speakers’ gestures, in collaboration with Lauren Fromont (Montreal) and Salvador Soto-Faraco (Barcelona), has now been published. If you want to read the whole story, check this out:

Fromont L.A., Soto-Faraco S., and Biau E. (2017) Searching High and Low: Prosodic Breaks Disambiguate Relative Clauses. Front. Psychol. 8:96.

Biau, E., Fromont, L. A. and Soto-Faraco, S. (2017), Beat Gestures and Syntactic Parsing: An ERP Study. Language Learning. doi:10.1111/lang.12257

Blind people learn metrical and nonmetrical rhythms differently than the sighted

The article can be read here for free (for a limited time):

When learning rhythms, sighted people tend to learn better when the rhythms induce a sense of beat (i.e., metrical rhythms) than when they do not (i.e., nonmetrical rhythms). This experiment showed that blind people demonstrated the reverse trend: they learned nonmetrical rhythms better than metrical rhythms. These results suggest that the blind might be more sensitive to irregular rhythms, perhaps as a survival mechanism for detecting changes in the environment that signal danger.

Breaking Research: How the vocal tract changes when speaking and singing

Benjamin Schultz, Joao Correia, and Michel Belyk have been examining changes in the vocal tract while speaking, whistling, and singing. Here, one of our singers aims to sing with as clear a tone as possible. Notice how the throat is nice and open:

Here, the same singer aims to mimic Louis Armstrong’s raspy vocal style. The throat pathway is considerably narrower and more tense:

And here is an example of whistling. Notice the difference in the muscular control of the throat and tongue:

More videos will arrive soon – watch this space!