Laboratoire Psychologie de la Perception, Institut Neurosciences Cognition, Université Paris Descartes, Centre National de la Recherche Scientifique
A psychophysical imaging method evidencing auditory cues extraction during speech perception: A group analysis of auditory classification images

Although there is broad consensus regarding the involvement of specific acoustic cues in speech perception, the precise mechanisms underlying the transformation of continuous acoustic properties into discrete perceptual units remain undetermined. This gap in knowledge is partly due to the lack of a turnkey solution for isolating critical speech cues from natural stimuli. In this paper, we describe a psychoacoustic imaging method, the Auditory Classification Image technique, that allows experimenters to estimate the relative importance of time-frequency regions in the categorization of natural speech utterances in noise. Importantly, this technique enables hypotheses about participants' listening strategies to be tested at the group level. We exemplify this approach by identifying the acoustic cues involved in da/ga categorization in two phonetic contexts, Al- or Ar-. Applying Auditory Classification Images to our group of 16 participants revealed significant critical regions on the F2 and F3 onsets, as predicted by the literature, as well as an unexpected temporal cue on F1. Finally, using a cluster-based nonparametric test, we show that the method is sensitive enough to detect fine differences in classification strategies between different utterances of the same phonemes.
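The core idea behind a classification image can be illustrated with a toy reverse-correlation simulation. The sketch below is not the authors' pipeline (which uses a generalized linear model and a group-level statistical test); it only shows the basic principle under simplified assumptions: a simulated listener whose binary categorization is driven by the noise energy inside one hypothetical time-frequency region, and a classification image estimated as the difference between the mean noise fields associated with each response. All names and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_freq, n_time = 2000, 16, 20

# Hypothetical "critical region": the only time-frequency bins that
# actually influence the simulated listener's categorization.
template = np.zeros((n_freq, n_time))
template[4:7, 8:12] = 1.0

# Gaussian noise fields added to the (implicit) speech stimulus on each trial.
noises = rng.normal(size=(n_trials, n_freq, n_time))

# Decision variable: noise energy inside the critical region plus internal noise.
drive = (noises * template).sum(axis=(1, 2))
responses = (drive + rng.normal(scale=2.0, size=n_trials)) > 0

# Classification image: mean noise field preceding one response category
# minus the mean noise field preceding the other.
ci = noises[responses].mean(axis=0) - noises[~responses].mean(axis=0)

# With enough trials, the largest weights recover the critical region.
peak = np.unravel_index(np.abs(ci).argmax(), ci.shape)
```

In practice, the trial-by-trial noise fields would come from the spectrotemporal representation of the noise actually added to the utterances, and the significance of each region would be assessed with a dedicated statistical procedure (such as the cluster-based nonparametric test mentioned above) rather than by simply locating the peak.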