Musicians improve nonmusicians’ timing abilities!

We are pleased to announce the publication of our article: The roles of musical expertise and sensory feedback in beat keeping and joint action

The full article can be viewed here.

Abstract

Auditory feedback of actions provides additional information about the timing of one’s own actions and those of others. However, little is known about how musicians and nonmusicians integrate auditory feedback from multiple sources to regulate their own timing or to (intentionally or unintentionally) coordinate with a partner. We examined how musical expertise modulates the role of auditory feedback in a two-person synchronization–continuation tapping task. Pairs of individuals were instructed to tap at a rate indicated by an initial metronome cue in all four auditory feedback conditions: no feedback, self-feedback (cannot hear their partner), other feedback (cannot hear themselves), or full feedback (both self and other). Participants within a pair were either both musically trained (musicians), both untrained (nonmusicians), or one musically trained and one untrained (mixed). Results demonstrated that all three pair types spontaneously synchronized with their partner when receiving other or full feedback. Moreover, all pair types were better at maintaining the metronome rate with self-feedback than with no feedback. Musician pairs better maintained the metronome rate when receiving other feedback than when receiving no feedback; in contrast, nonmusician pairs were worse when receiving other or full feedback compared to no feedback. Both members of mixed pairs maintained the metronome rate better in the other and full feedback conditions than in the no feedback condition, similar to musician pairs. Overall, nonmusicians benefited from musicians’ expertise without negatively influencing musicians’ ability to maintain the tapping rate. One implication is that nonmusicians may improve their beat-keeping abilities by performing tasks with musically skilled individuals.
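(For the curious: the paper's analyses are of course more detailed, but here is a minimal, made-up sketch of the kind of measures behind phrases like "maintaining the metronome rate" and "spontaneously synchronized": tempo drift of the inter-tap intervals relative to the cued interval, and the asynchrony between partners' taps. The 600-ms cue and the tap times below are invented for illustration and are not from the study.)

```python
import numpy as np

# Hypothetical data: tap times (s) for the two members of a pair during the
# continuation phase, after a 600-ms metronome cue. All values are made up.
cued_interval = 0.600
taps_a = np.array([0.00, 0.61, 1.23, 1.86, 2.50, 3.15])
taps_b = np.array([0.02, 0.62, 1.25, 1.87, 2.52, 3.16])

# How well does each tapper maintain the metronome rate? One simple index:
# the mean inter-tap interval (ITI) relative to the cued interval.
iti_a = np.diff(taps_a)
iti_b = np.diff(taps_b)
drift_a = (iti_a.mean() - cued_interval) / cued_interval * 100  # % tempo drift
drift_b = (iti_b.mean() - cued_interval) / cued_interval * 100

# Spontaneous synchronization with the partner: mean absolute asynchrony
# between corresponding taps (smaller = tighter coupling).
asynchrony_ms = np.abs(taps_a - taps_b).mean() * 1000

print(f"Tempo drift A: {drift_a:+.1f}%  B: {drift_b:+.1f}%")
print(f"Mean |asynchrony| between partners: {asynchrony_ms:.0f} ms")
```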

 

Society for the Neurobiology of Language, Québec City, August 2018

This summer, our lab member Alex Emmendorfer presented the results of her first PhD experiment at the 2018 meeting of the Society for the Neurobiology of Language in Québec City, Canada.

In this EEG study, Alex used Dutch pseudowords varying in phonotactic probability and syllable stress pattern to examine formal and temporal prediction in speech perception with an oddball paradigm. She showed that both formal and temporal predictability modulate the processing of speech stimuli: high predictability in both the formal and temporal domains facilitates processing compared to low predictability, indexed by larger MMN amplitudes for first-syllable stress and shorter MMN latencies for high phonotactic probability.
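(A side note for readers less familiar with the mismatch negativity: the amplitudes and latencies reported above come from a difference wave, deviant minus standard. The sketch below uses simulated single-channel waveforms, not Alex's data, purely to show how that readout is typically computed.)

```python
import numpy as np

# Simulated single-channel ERPs (µV) for standards and deviants in an oddball
# paradigm; the waveforms are invented purely to illustrate the MMN readout.
fs = 500                                    # sampling rate (Hz)
t = np.arange(-0.1, 0.5, 1 / fs)            # epoch from -100 to 500 ms
erp_standard = -0.5 * np.exp(-((t - 0.10) ** 2) / 0.002)
erp_deviant = -2.0 * np.exp(-((t - 0.16) ** 2) / 0.002)

# MMN difference wave: deviant minus standard.
mmn = erp_deviant - erp_standard

# Peak amplitude and latency within a typical MMN window (100-250 ms).
window = (t >= 0.10) & (t <= 0.25)
peak_idx = np.argmin(mmn[window])           # the MMN is a negativity
peak_amplitude = mmn[window][peak_idx]      # µV
peak_latency_ms = t[window][peak_idx] * 1000

print(f"MMN peak: {peak_amplitude:.2f} µV at {peak_latency_ms:.0f} ms")
```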

 

Additionally, we are excited to announce that Lab Director Sonja Kotz will be joining the society’s board as Chair-Elect for the upcoming year! Congratulations Sonja!

Schultz MIDI Benchmarking Toolbox now available!

Ever wanted to test the timing of your MIDI percussion pads, sound modules, and instrument patches? Well, now you can! The SMIDIBT is available and the scripts are free to download:

https://rdcu.be/LQJQ

If you have any comments or questions, feel free to contact Ben Schultz: ben.schultz@maastrichtuniversity.nl
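(The SMIDIBT scripts at the link above are the tool to use for proper benchmarking. Purely to illustrate the round-trip latency idea, here is a rough Python sketch using the third-party mido library; it is not part of the toolbox, the port names are placeholders, and it assumes your MIDI interface is wired as a loopback so that sent messages come straight back.)

```python
import time

import mido  # third-party MIDI library; not part of the SMIDIBT

# Placeholder port names: replace with the names reported by
# mido.get_output_names() and mido.get_input_names() on your system.
OUT_PORT = "My MIDI Interface Out"
IN_PORT = "My MIDI Interface In"

def roundtrip_latencies_ms(n_trials=50):
    """Send note-on messages through a loopback and time how long each one
    takes to come back. A crude round-trip estimate, nothing more."""
    latencies = []
    with mido.open_output(OUT_PORT) as out, mido.open_input(IN_PORT) as inp:
        for _ in range(n_trials):
            t0 = time.perf_counter()
            out.send(mido.Message("note_on", note=60, velocity=64))
            inp.receive()                     # block until the echo arrives
            latencies.append((time.perf_counter() - t0) * 1000.0)
            out.send(mido.Message("note_off", note=60))
            inp.receive()                     # consume the note_off echo
            time.sleep(0.05)                  # brief gap between trials
    return latencies

if __name__ == "__main__":
    lat = sorted(roundtrip_latencies_ms())
    print(f"median round-trip: {lat[len(lat) // 2]:.2f} ms")
```

In practice you would look at the whole distribution across many trials, since jitter matters as much as the median.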

 

Reference

Schultz, B. G. (2018). The Schultz MIDI Benchmarking Toolbox for MIDI interfaces, percussion pads, and sound cards. Behavior Research Methods.

Beat Gestures and Syntactic Parsing: An ERP Study

The second part of a project investigating the relationship between prosody and speakers' gestures, in collaboration with Lauren Fromont (Montreal) and Salvador Soto-Faraco (Barcelona), has now been published. If you want to read the whole story, check these out:

Fromont, L. A., Soto-Faraco, S., and Biau, E. (2017). Searching High and Low: Prosodic Breaks Disambiguate Relative Clauses. Front. Psychol. 8:96. doi: 10.3389/fpsyg.2017.00096

Biau, E., Fromont, L. A., and Soto-Faraco, S. (2017). Beat Gestures and Syntactic Parsing: An ERP Study. Language Learning. doi: 10.1111/lang.12257

Blind people learn metrical and nonmetrical rhythms differently than the sighted

The article can be read here for free (for a limited time): http://rdcu.be/vTrT

Sighted people tend to learn rhythms better when the rhythms induce a sense of beat (i.e., metrical rhythms) than when they do not (i.e., nonmetrical rhythms). This experiment shows that blind people demonstrated the reverse pattern: they learned nonmetrical rhythms better than metrical rhythms. These results suggest that blind people may be more sensitive to irregular rhythms, perhaps as a survival mechanism for detecting changes in the environment that signal danger.
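(What counts as "metrical" here? Roughly, a rhythm whose inter-onset intervals line up with a regular underlying beat. The toy check below, with made-up rhythms and a simplified integer-multiple criterion rather than the stimuli used in the study, is only meant to make that distinction concrete.)

```python
import numpy as np

def is_metrical(iois_ms, beat_ms, tol=1e-6):
    """Toy criterion: call a rhythm 'metrical' if every inter-onset interval
    is an integer multiple of a common beat period."""
    ratios = np.asarray(iois_ms, dtype=float) / beat_ms
    return bool(np.all(np.abs(ratios - np.round(ratios)) < tol))

# Made-up examples (ms): the first fits a 250-ms beat exactly; the second
# uses similar but slightly irregular intervals, so no common beat fits.
metrical_rhythm = [250, 500, 250, 250, 750, 500]
nonmetrical_rhythm = [230, 510, 270, 240, 760, 490]

print(is_metrical(metrical_rhythm, beat_ms=250))     # True
print(is_metrical(nonmetrical_rhythm, beat_ms=250))  # False
```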

Breaking Research: How the vocal tract changes when speaking and singing

Benjamin Schultz, Joao Correia, and Michel Belyk have been examining changes in the vocal tract while speaking, whistling, and singing. Here, one of our singers aims to sing with as clear a tone as possible. Notice how the throat is nice and open:

 

Here, the same singer aims to mimic Louis Armstrong's raspy vocal style. The throat pathway is considerably narrower and more tense:

 

 

More videos will arrive soon – watch this space!


Searching high and low: Prosodic breaks disambiguate relative clauses

Newly published article:

In this paper, we investigated how modulations of the speaker's voice can affect how listeners interpret ambiguous sentences. We showed that simply placing prosodic breaks (i.e., short silences) at different key points in spoken sentences was enough to flip listeners' preference toward one interpretation or the other. So be careful: silences speak your mind!

Fromont, L. A., Soto-Faraco, S., and Biau, E. (2017). Searching High and Low: Prosodic Breaks Disambiguate Relative Clauses. Front. Psychol. 8:96. doi: 10.3389/fpsyg.2017.00096

Check this out: http://journal.frontiersin.org/article/10.3389/fpsyg.2017.00096/full