New paper alert! The authors found that a mismatch between the sublexical and lexical affective values of words elicited an increased N400 response. Curious? Read about this new paper, co-authored by Sonja Kotz, here.
Uncovering hidden resting state dynamics: A new perspective on auditory verbal hallucinations
Find this new and interesting paper by our PhD student Hanna Honcamp here. She proposes that (semi-)hidden Markov models can be used to investigate the temporal brain state dynamics that may relate to auditory verbal hallucinations.
Do you hear what I hear?
The Pint of Science festival takes place from 9 to 11 May. You will find interesting talks at different locations. If you find it difficult to decide which event to attend, we recommend the first session on Monday 9 May at The Student Hotel Maastricht. During the session ‘Do you hear what I hear?’, Xan Duggirala, Pia Brinkmann and Jana Devos will present interesting facts about hearing, auditory hallucinations and tinnitus.
The link for the event is here.
The link to the session ‘Do you hear what I hear?’ is here.
Prediction in the aging brain: Merging cognitive, neurological, and evolutionary perspectives
A new paper by former BAND lab member Rachel Brown and lab director Sonja Kotz investigates predictive mechanisms in the aging brain. It hypothesizes that, as we age, subcortical-cortical communication decreases while default-executive coupling increases. Curious to read more? Go here.
Identifying a brain network for musical rhythm: A functional neuroimaging meta-analysis and systematic review
Curious to read more about musical rhythm and how it engages a bilateral cortico-subcortical network that involves auditory and motor regions? Check out this new paper here.
Cortical thickness in default mode network hubs correlates with clinical features of dissociative seizures
Curious to learn how illness duration of dissociative seizures (DS), which are paroxysmal episodes of altered awareness and motor control that can resemble epilepsy, correlates with cortical thickness in hubs of the default mode network (DMN)? Check out the new paper here.
New paper: Overt Oculomotor Behavior Reveals Covert Temporal Predictions
Do you want to find out how patients with Parkinson’s disease adapt their blinking behavior to tone sequences with different target probabilities? Read the new paper by Alessandro Tavano and Sonja Kotz!
EMOSEX – Emotion prevails over sex during implicit judgement of faces.
Do we associate anger more with a male face and happiness more with a female face? Does this association between emotion and gender also extend to the auditory domain (e.g., voices)?
Faces and voices are more likely to be judged as male when they are angry, and as female when they are happy, new research has revealed. The study found that how we understand the emotional expression of a face or voice is heavily influenced by perceived sex, and vice versa. As one of the authors put it: “This study shows how important it is not to rely too much on your first impressions, as they can easily be wrong. Next time you find yourself attributing happiness or sadness to a woman, be aware of your bias and possible misinterpretation.”
If these questions pique your interest, please read more here and find our article here.
Left motor delta oscillations reflect asynchrony detection in multisensory speech perception (New paper by former BAND lab members E. Biau & B. Schultz)
Read how Biau and colleagues manipulated audio-visual asynchrony and what they found:
‘Results confirm (i) that participants accurately detected audio-visual asynchrony, and (ii) increased delta power in the left motor cortex in response to audio-visual asynchrony. The difference of delta power between asynchronous and synchronous conditions predicted behavioural performance, and (iii) decreased delta-beta coupling in the left motor cortex when listeners could not accurately map visual and auditory prosodies. Finally, both behavioural and neurophysiological evidence was altered when a speaker’s face was degraded by a visual mask.’
These results suggest that audio-visual asynchrony detection in speech is supported by left motor delta oscillations!
Read more here: https://doi.org/10.1523/JNEUROSCI.2965-20.2022