Developmental change in infants' detection of visual faces that match auditory vowels

Infants demonstrate robust audiovisual (AV) perception, detecting, for example, which visual face matches auditory speech in many paradigms. For simple phonetic segments, like vowels, previous work has assumed developmental stability in AV matching. This study shows dramatic differences in matching performance for different vowels across the first year of life: 3-, 6-, and 9-month-olds were familiarized for 40 sec with a visual face articulating a vowel in synchrony with auditory presentations of that vowel, but crucially, the mouth of the face was occluded. At test, infants were shown two still photos of the same face without occlusion for 1 min in silence. One face had a static articulatory configuration matching the previously heard vowel, while the other face had a static configuration matching a different vowel. Three auditory vowels were used: /a/, /i/, and /u/. Results suggest that AV matching performance varies according to age and to the familiarized vowel. Interestingly, results are not linked to the frequency of vowels in auditory input, but may instead be related to infants’ ability to produce the target vowel. A speculative hypothesis is that vowel production in infancy modulates AV vowel matching.