Laboratoire Psychologie de la Perception · Institut Neurosciences Cognition · Université Paris Descartes · Centre National de la Recherche Scientifique
Decoding temporal structure in music and speech relies on shared brain resources but elicits different fine-scale spatial patterns

Music and speech are complex sound streams with hierarchical rules of temporal organization that become elaborated over time. Here, we use functional magnetic resonance imaging to measure brain activity patterns in 20 right-handed nonmusicians as they listened to natural and temporally reordered musical and speech stimuli matched for familiarity, emotion, and valence. Heart rate variability and mean respiration rates were simultaneously measured and were found not to differ between musical and speech stimuli. Although the same manipulation of temporal structure elicited brain activation level differences of similar magnitude for both music and speech stimuli, multivariate classification analysis revealed distinct spatial patterns of brain responses in the 2 domains. Distributed neuronal populations that included the inferior frontal cortex, the posterior and anterior superior and middle temporal gyri, and the auditory brainstem classified temporal structure manipulations in music and speech with significant levels of accuracy. While agreeing with previous findings that music and speech processing share neural substrates, this work shows that temporal structure in the 2 domains is encoded differently, highlighting a fundamental dissimilarity in how the same neural resources are deployed.
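The multivariate classification analysis mentioned above can be illustrated with a minimal sketch. This is not the authors' pipeline: the data below are synthetic, and a simple leave-one-out nearest-centroid classifier stands in for the linear classifiers typically used in fMRI multivariate pattern analysis. The idea it demonstrates is the same: two conditions can be decoded from distributed fine-scale voxel patterns even when overall activation levels are comparable.

```python
# Hypothetical MVPA sketch with synthetic data (not the study's actual
# analysis). Two conditions share mean activation but differ in their
# fine-scale spatial pattern across voxels.
import numpy as np

rng = np.random.default_rng(0)

n_trials, n_voxels = 40, 100
# Distinct spatial "fingerprints" for the two conditions (zero-mean,
# so average activation level is matched), plus trial-by-trial noise.
pattern_a = rng.normal(0, 1, n_voxels)
pattern_b = rng.normal(0, 1, n_voxels)
X = np.vstack([
    pattern_a + rng.normal(0, 1.5, (n_trials, n_voxels)),
    pattern_b + rng.normal(0, 1.5, (n_trials, n_voxels)),
])
y = np.array([0] * n_trials + [1] * n_trials)

# Leave-one-out cross-validation with a nearest-centroid decision rule:
# hold out one trial, compute each condition's mean pattern from the
# rest, and assign the held-out trial to the closer centroid.
correct = 0
for i in range(len(y)):
    mask = np.arange(len(y)) != i
    c0 = X[mask & (y == 0)].mean(axis=0)
    c1 = X[mask & (y == 1)].mean(axis=0)
    pred = 0 if np.linalg.norm(X[i] - c0) < np.linalg.norm(X[i] - c1) else 1
    correct += pred == y[i]

accuracy = correct / len(y)
print(f"decoding accuracy: {accuracy:.2f}")  # well above the 0.5 chance level
```

In the study itself, accuracy significantly above chance in a region (e.g., inferior frontal cortex, superior and middle temporal gyri) is taken as evidence that the region's spatial response pattern carries information about the temporal-structure manipulation.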