Christina Vanden Bosch der Nederlanden

Postdoc
Western

Infants neurally track the rhythms of speech and song

Christina Vanden Bosch der Nederlanden, Marc Joanisse, Laurel Trainor, & Jessica Grahn

 

Hello! My name is Christina Vanden Bosch der Nederlanden and I'm a postdoc at Western, transitioning to an Assistant Professor position at the University of Toronto Mississauga this January (2022).

The poster I'm presenting today looks at how infants track the syllable rhythms of infant-directed and monotone spoken and sung utterances, using EEG. Four-month-olds neurally tracked all stimuli above chance, and actually showed greater coherence to syllable rhythms overall than adults did. We didn't find any differences in neural tracking between infant-directed and monotone utterances, but we did find that both adults and infants showed greater neural tracking (in the delta-theta bands) of song than speech in the monotone condition. In the theta-alpha band, adults showed the opposite pattern, with greater coherence to speech than song for monotone utterances.

Our results provide good evidence for neural tracking of the low-frequency information in speech and song in infancy and characterize differences in neural entrainment to song and speech for monotone utterances.
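For anyone curious what "tracking the syllable rhythm" means in practice, here is a minimal, illustrative sketch (not our analysis pipeline) of how the slow amplitude envelope that carries an utterance's syllable rhythm can be extracted in Python; the filename, 10 Hz cutoff, and 250 Hz target rate are assumptions chosen for illustration.

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, filtfilt, hilbert

# Load a mono utterance (the filename is illustrative)
fs, waveform = wavfile.read("utterance.wav")
waveform = waveform.astype(float)

# Broadband amplitude envelope via the Hilbert transform
envelope = np.abs(hilbert(waveform))

# Keep only the slow fluctuations (< 10 Hz) that follow syllable onsets
b, a = butter(4, 10.0 / (fs / 2), btype="low")
syllable_envelope = filtfilt(b, a, envelope)

# Downsample to a typical EEG sampling rate before comparing with the EEG
target_fs = 250
syllable_envelope = syllable_envelope[:: int(fs / target_fs)]
```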

Join me via Zoom to discuss our findings.

If you're an undergraduate looking to apply to graduate programs for Fall 2022, please get in touch if you're interested in music, language, and the brain!
Check out my website to learn more about my research program.

 

Abstract

The musical structure of songs may help listeners neurally track syllable onsets (Vanden Bosch der Nederlanden et al., 2020), which could underlie better word learning from music than from speech (Ma et al., 2021). The musical features of infant-directed (ID) utterances confer similar word-learning benefits (Thiessen et al., 2005), suggesting that ID utterances may likewise boost neural tracking compared to monotone utterances.

Our study used cerebro-acoustic phase coherence as an index of neural tracking of syllable rhythms to determine whether infants and adults are better at tracking ID than monotone spoken and sung utterances.
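To make the measure concrete, here is a rough sketch of cerebro-acoustic phase coherence under simplifying assumptions (a single EEG channel, with trials already epoched and aligned to stimulus onset): at each frequency, it quantifies how consistent the phase lag between the EEG and the stimulus envelope is across trials. The trial count, trial duration, and 1-8 Hz delta-theta band edges below are illustrative, and the simulated inputs are random noise rather than real data.

```python
import numpy as np

def cerebro_acoustic_phase_coherence(eeg, envelope, fs):
    """Phase coherence between EEG and the acoustic envelope across trials.

    eeg, envelope: arrays of shape (n_trials, n_samples) at sampling rate fs.
    Returns the frequency axis and a coherence value (0-1) per frequency.
    """
    n_samples = eeg.shape[1]
    freqs = np.fft.rfftfreq(n_samples, d=1.0 / fs)

    # Phase difference between brain response and stimulus, per trial
    phase_diff = (np.angle(np.fft.rfft(eeg, axis=1))
                  - np.angle(np.fft.rfft(envelope, axis=1)))

    # Length of the mean unit phasor across trials: 1 = perfectly consistent
    # brain-stimulus phase lag, 0 = random phase lags
    coherence = np.abs(np.mean(np.exp(1j * phase_diff), axis=0))
    return freqs, coherence

# Toy example with simulated (random) data -- values are not real results
fs = 250
rng = np.random.default_rng(0)
eeg = rng.standard_normal((30, fs * 4))        # 30 trials, 4 s each
envelope = rng.standard_normal((30, fs * 4))

freqs, coherence = cerebro_acoustic_phase_coherence(eeg, envelope, fs)
delta_theta = coherence[(freqs >= 1) & (freqs <= 8)].mean()
print(f"Mean delta-theta (1-8 Hz) coherence: {delta_theta:.3f}")
```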