The Role of Auditory and Visual Speech in Word Learning at 18 Months and in Adulthood
Published online: January 26, 2017
Abstract
Visual information influences speech perception in both infants and adults, yet it remains unknown whether lexical representations themselves are multisensory. To address this question, we exposed 18‐month‐old infants (n = 32) and adults (n = 32) to new word–object pairings: Participants either heard the acoustic form of the words or saw the talking face in silence. They were then tested on recognition in the same or the other modality. Both 18‐month‐old infants and adults learned the lexical mappings when the words were presented auditorily and recognized the mappings at test in either modality, but only adults learned new words from visual‐only presentation. These results suggest developmental changes in the sensory format of lexical representations.