A Model of Language Processing as Hierarchic Sequential Prediction
Published online: June 13, 2013
Abstract
Computational models of memory are often expressed as hierarchic sequence models, but the hierarchies in these models are typically fairly shallow, reflecting the tendency for memories of superordinate sequence states to become increasingly conflated. This article describes a broad-coverage probabilistic sentence processing model that uses a variant of a left-corner parsing strategy to flatten syntactic parsing operations into a similarly shallow hierarchy of learned sequences. The main result of this article is that a broad-coverage model with constraints on hierarchy depth can process large newspaper corpora with the same accuracy as a state-of-the-art parser not defined in terms of sequential working memory operations.
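To make the depth claim concrete, the sketch below (illustrative only, not the article's implementation; all class and function names are invented here) measures the maximum stack depth a recognizer needs on a known binary tree under two strategies: a top-down predictive parser, and an eager variant of left-corner recognition in which a fragment awaiting category B composes with the projection B/C of B's left corner to yield a single fragment awaiting C. The exact depths depend on the chosen transition inventory and may differ by a small constant from the article's own accounting.

```python
# Hypothetical illustration: stack depth of top-down vs. eager left-corner
# recognition on gold binary trees. Names and structure are this sketch's own.

class Node:
    def __init__(self, left=None, right=None, label=""):
        self.left, self.right, self.label = left, right, label

    @property
    def is_leaf(self):
        return self.left is None


def topdown_max_depth(root):
    """Max stack depth of a predictive (top-down) recognizer on a gold tree."""
    stack, maxd = [root], 1
    while stack:
        maxd = max(maxd, len(stack))
        n = stack.pop()
        if not n.is_leaf:
            stack.append(n.right)   # right sibling waits on the stack
            stack.append(n.left)    # recognize the left child first
    return maxd


def leftcorner_max_depth(root):
    """Max stack depth of an eager left-corner recognizer on a gold tree.

    Each stack entry is an incomplete fragment, recorded by the node it
    awaits; `top_done` marks a completed fragment momentarily on top.
    """
    awaiting, state = [], {"top_done": False, "maxd": 0}

    def note():
        depth = len(awaiting) + (1 if state["top_done"] else 0)
        state["maxd"] = max(state["maxd"], depth)

    def recognize(node):
        # walk down the left spine to the leftmost leaf
        spine, n = [], node
        while not n.is_leaf:
            spine.append(n)
            n = n.left
        # SHIFT the leaf; if it is exactly the awaited constituent, ATTACH
        if awaiting and awaiting[-1] is n:
            awaiting.pop()             # leaf completes the awaiting fragment
        state["top_done"] = True
        note()
        # climb back up the spine, PROJECTing each ancestor
        for anc in reversed(spine):
            if awaiting and awaiting[-1] is anc:
                awaiting[-1] = anc.right   # COMPOSE: A/anc + anc/B => A/B
            else:
                awaiting.append(anc.right) # completed child becomes anc/B
            state["top_done"] = False
            note()
            recognize(anc.right)           # returns with fragment completed

    recognize(root)
    return state["maxd"]


def right_branching(n):        # (w1 (w2 (... wn)))
    t = Node(label="w%d" % n)
    for i in range(n - 1, 0, -1):
        t = Node(Node(label="w%d" % i), t)
    return t


def left_branching(n):         # (((w1 w2) w3) ...)
    t = Node(label="w1")
    for i in range(2, n + 1):
        t = Node(t, Node(label="w%d" % i))
    return t


def center_embedded(depth):    # a_d [ ... a_1 [ c1 c2 ] b_1 ... ] b_d
    t = Node(Node(label="c1"), Node(label="c2"))
    for i in range(1, depth + 1):
        t = Node(Node(label="a%d" % i), Node(t, Node(label="b%d" % i)))
    return t


for name, tree in [("right-branching(20)", right_branching(20)),
                   ("left-branching(20)", left_branching(20)),
                   ("center-embedded(5)", center_embedded(5))]:
    print(name, "top-down:", topdown_max_depth(tree),
          "left-corner:", leftcorner_max_depth(tree))
```

Under this accounting, the left-corner depths stay at 1 or 2 for arbitrarily long left- or right-branching trees and grow only with center embedding, whereas the top-down depth grows linearly with left-branching length; this asymmetry is what allows a left-corner strategy to flatten most naturally occurring sentences into a shallow hierarchy of sequences.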