Veronica Vuong

Graduate Student
Univ of Toronto

Neural Correlates of Autobiographically Salient Music Listening in Healthy Older Adults

Veronica Vuong, Michael Thaut, Claude Alain

Veronica is a PhD candidate in Medical Science at the University of Toronto. As a pianist and an alumna of the Faculty of Music, University of Toronto, she previously completed her MA in Music and Health Sciences and BMus in Music Education with Honours.

Currently, she conducts her research at Baycrest Health Sciences, where she studies the neural correlates of musical memory in older adults with mild cognitive impairment using electroencephalography (EEG). Veronica is also the Research Coordinator at the Music and Health Science Research Collaboratory, Faculty of Music, University of Toronto.

Abstract

Music, such as melodies, is often retained in long-term memory. However, little is known about the time course of recollection processes for recognizing familiar (FAM) or autobiographically salient (ABS) music, the latter distinguished by its link to one's personal past (i.e., people, locations, events). Eighteen older adults (mean age = 69 years, range 61 to 79, 8 males) took part in two studies.

In Study 1, we measured reaction time (RT) while older adults listened to ABS, FAM, and unfamiliar (UFAM) music. Participants were most accurate in recognizing ABS music (92%) and least accurate for FAM music (84%). Importantly, they were fastest at identifying ABS music (M = 2.38 s, SD = 0.05 s), intermediate for FAM music (M = 3.05 s, SD = 0.09 s), and slowest for UFAM music (M = 3.47 s, SD = 1.10 s). These results suggest that older adults recognize ABS music quickly, i.e., within 2.4 s of listening. ABS music thus holds an advantage in eliciting rapid and accurate behavioural responses.

In Study 2, we recorded high-density scalp event-related potentials (ERPs) while participants were presented with five-second clips of ABS, FAM, and UFAM music. Participants were asked to make a judgment at the end of each music segment to minimize contamination of recollection processes by response-related activity. All three music conditions generated transient evoked responses at the onset of the music piece, including the N1 and P2 waves, followed by a sustained potential that lasted several seconds.

Cluster-based statistics were used to identify differences in ERP amplitude. Significant differences among the three conditions were found over the left parietal scalp area, where the waveform showed greater positivity in the ABS condition from 627 to 1218 ms (peak latency = 753 ms) compared with FAM and UFAM music. This modulation over left parietal scalp areas is analogous to the Late Positive Complex (LPC), which has previously been reported in auditory recognition memory tasks over centro-parietal electrodes, and may reflect episodic memory recollection processes. We also observed differences over the right fronto-central scalp area, with greater positivity in the ABS condition relative to FAM and UFAM music from 2410 to 3503 ms (peak latency = 3171 ms). The latter could index activity in reward, memory, and speech areas associated with reminiscence.

Together, the behavioural and electrophysiological findings indicate that the time course of recollection for ABS music is distinct from that for FAM music. The results are consistent with an early retrieval process, which may be followed by the integration of memories and associations, reflection, or emotional processing, resulting in extended cognitive engagement. These results can inform methodological decisions about the length of music stimuli, particularly for temporally based analysis techniques.
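For readers unfamiliar with cluster-based permutation statistics, the general idea can be sketched in a minimal, self-contained form. The following is an illustrative one-dimensional (single-channel time course) version on simulated data, not the authors' analysis pipeline; in practice, toolboxes such as MNE-Python or FieldTrip implement the full spatio-temporal variant across electrodes. All variable names and parameter values here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def t_values(diff):
    """One-sample t-statistic across subjects at each timepoint.

    diff: array of shape (n_subjects, n_timepoints), e.g. the
    per-subject ERP difference between two conditions (ABS - FAM).
    """
    n = diff.shape[0]
    return diff.mean(axis=0) / (diff.std(axis=0, ddof=1) / np.sqrt(n))

def find_clusters(mask):
    """Return (start, stop) index pairs of contiguous True runs."""
    out, start = [], None
    for i, m in enumerate(mask):
        if m and start is None:
            start = i
        elif not m and start is not None:
            out.append((start, i))
            start = None
    if start is not None:
        out.append((start, len(mask)))
    return out

def cluster_perm_test(diff, thresh=2.0, n_perm=1000):
    """Cluster-mass permutation test (sign-flipping null) on 1D data."""
    t = t_values(diff)
    obs_clusters = find_clusters(np.abs(t) > thresh)
    masses = [np.abs(t[a:b]).sum() for a, b in obs_clusters]
    # Null distribution: randomly flip each subject's difference sign,
    # recompute t, and record the largest cluster mass each time.
    null_max = np.zeros(n_perm)
    for p in range(n_perm):
        flips = rng.choice([-1.0, 1.0], size=diff.shape[0])[:, None]
        t_perm = t_values(diff * flips)
        runs = find_clusters(np.abs(t_perm) > thresh)
        null_max[p] = max(
            (np.abs(t_perm[a:b]).sum() for a, b in runs), default=0.0
        )
    pvals = [float((null_max >= m).mean()) for m in masses]
    return obs_clusters, masses, pvals

# Simulated example: 18 subjects, 500 timepoints, with a genuine
# condition difference injected between timepoints 150 and 300.
n_sub, n_time = 18, 500
diff = rng.normal(0.0, 1.0, size=(n_sub, n_time))
diff[:, 150:300] += 1.0

clusters, masses, pvals = cluster_perm_test(diff)
```

Testing cluster mass against a max-statistic permutation null, rather than each timepoint separately, is what controls the family-wise error rate over the many correlated time samples in an ERP waveform.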