Preschoolers' attribution of affect to music: A comparison between vocal and instrumental performance
Published online on August 01, 2016
Abstract
Research has shown inconsistent results concerning young children's ability to identify emotion in music. This study explores the influence of the type of musical performance (vocal vs. instrumental) on children's identification of affect. Using an independent-groups design, novel child-directed music was presented in three conditions — instrumental, vocal-only, and song (instrumental plus vocals) — to 3- to 6-year-olds previously screened for language development (N = 76). A forced-choice task was used in which children chose a face expressing the emotion that matched each musical track. All performance conditions comprised "happy" (major mode/fast tempo) and "sad" (minor mode/slow tempo) tracks. Nonsense syllables rather than words were used in the vocals in order to avoid any influence of lyrics on children's decisions. The results showed that even the younger children were able to identify the intended emotion in the music correctly, although "happy" music was more readily recognised and recognition appeared to be facilitated in the instrumental condition. Performance condition also interacted with gender.