
All Together Now: Concurrent Learning of Multiple Structures in an Artificial Language


Cognitive Science


Abstract

Natural languages contain many layers of sequential structure, from the distribution of phonemes within words to the distribution of phrases within utterances. However, most research modeling language acquisition using artificial languages has focused on only one type of distributional structure at a time. In two experiments, we investigated adult learning of an artificial language that contains dependencies between both adjacent and non-adjacent words. We found that learners rapidly acquired both types of regularities and that the strength of the adjacent statistics influenced learning of both adjacent and non-adjacent dependencies. Additionally, though accuracy was similar for both types of structure, participants' knowledge of the deterministic non-adjacent dependencies was more explicit than their knowledge of the probabilistic adjacent dependencies. The results are discussed in the context of current theories of statistical learning and language acquisition.