Two Models of Minimalist, Incremental Syntactic Analysis

Topics in Cognitive Science

Abstract

Minimalist grammars (MGs) and multiple context-free grammars (MCFGs) are weakly equivalent in the sense that they define the same languages, a large mildly context-sensitive class that properly includes the context-free languages. In addition, for each MG there is an MCFG that is strongly equivalent, in the sense that it defines the same language with isomorphic derivations. However, the structure-building rules of MGs, unlike those of MCFGs, are defined in a way that generalizes across categories. Consequently, MGs can be exponentially more succinct than their MCFG equivalents, and this difference shows up in parsing models too. An incremental, top-down beam parser for MGs is defined here, sound and complete for all MGs, and hence also capable of parsing all MCFG languages. But since the parser represents its grammar transparently, the relative succinctness of MGs is again evident. Although the determinants of MG structure are narrowly and discretely defined, probabilistic information from a much broader domain can influence even the earliest analytic steps, allowing frequency and context effects to come early and from almost anywhere, as expected in incremental models.
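The succinctness claim above rests on the fact that MG structure building is a single schema over feature lists rather than a separate rule per category pair. A minimal, hypothetical sketch (the lexicon and feature names here are illustrative, not from the paper): one `merge` function covers every selector/category combination, where an MCFG encoding would spell out one rule for each.

```python
# A lexical item pairs a phonetic form with a feature list.
# '=x' selects a phrase of category 'x'; a bare 'x' is the item's own category.
lexicon = {
    "the":   ["=n", "d"],   # determiner: selects an N, yields a D
    "cat":   ["n"],         # noun
    "likes": ["=d", "v"],   # verb: selects a D, yields a V
}

def merge(selector, selected):
    """One rule schema for all categories:
    (form1, ['=x', ...]) + (form2, ['x', ...]) -> (form1 form2, [...])."""
    (form1, feats1), (form2, feats2) = selector, selected
    if feats1 and feats2 and feats1[0] == "=" + feats2[0]:
        # Selector and category features cancel; forms concatenate.
        return (form1 + " " + form2, feats1[1:])
    return None  # features do not match; merge is undefined

dp = merge(("the", lexicon["the"]), ("cat", lexicon["cat"]))
print(dp)   # ('the cat', ['d'])
vp = merge(("likes", lexicon["likes"]), dp)
print(vp)   # ('likes the cat', ['v'])
```

Because `merge` never mentions particular categories, adding a new category to the lexicon adds no new rules; a strongly equivalent MCFG would instead need category-specific productions, which is the source of the exponential size gap the abstract describes.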