
A Role for Chunk Formation in Statistical Learning of Second Language Syntax

Language Learning


Abstract

Humans are remarkably sensitive to the statistical structure of language, yet different mechanisms have been proposed to account for this statistical sensitivity. The present study examined adult learning of second language syntax and compared the ability of two models of statistical learning to simulate human performance: Simple Recurrent Networks (SRNs), which learn through predictive computation, and PARSER, which forms chunks as a byproduct of general principles of associative learning and memory. In the first stage, a semiartificial language paradigm was used to gather human data; in the second stage, a simulation paradigm was used to compare the patterns of performance of the SRN and PARSER. Both the human adults and the computational models were trained on sentences from the semiartificial language, which featured probabilistic syntax, and their learning outcomes were then compared. Neither model fully reproduced the human data, which may indicate less robust statistical learning effects in adults; however, PARSER simulated more of the adult learning data than the SRN, suggesting a possible role for chunk formation in the early phases of adult learning of second language syntax.