A Model of Language Processing as Hierarchic Sequential Prediction

Marten van Schijndel, Andy Exley, William Schuler

Research output: Contribution to journal › Article › peer-review

Abstract

Computational models of memory are often expressed as hierarchic sequence models, but the hierarchies in these models are typically fairly shallow, reflecting the tendency for memories of superordinate sequence states to become increasingly conflated. This article describes a broad-coverage probabilistic sentence processing model that uses a variant of a left-corner parsing strategy to flatten parsing operations into a similarly shallow hierarchy of learned sequences. The main result is that a broad-coverage model with constraints on hierarchy depth can process large newspaper corpora with the same accuracy as a state-of-the-art parser that is not defined in terms of sequential working memory operations.
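
To make the depth-bounded left-corner idea concrete, the sketch below implements a plain nondeterministic left-corner recognizer over a toy binary grammar, with a hard cap on stack depth standing in for the working-memory constraint. It is only an illustration of the general parsing strategy the abstract names: the grammar, lexicon, category names, `recognize` function, and the `max_depth=4` value are all assumptions for this example, and the code is symbolic rather than the paper's probabilistic sequence model.

```python
from collections import deque

# Toy binary grammar indexed by left corner: an entry B -> [(A, C), ...]
# encodes rules A -> B C.  All names here are illustrative assumptions,
# not categories or rules from the paper.
RULES = {
    "Det": [("NP", "N")],   # NP -> Det N
    "NP":  [("S", "VP")],   # S  -> NP VP
    "V":   [("VP", "NP")],  # VP -> V NP
}
LEXICON = {"the": "Det", "dog": "N", "cat": "N", "chased": "V"}


def recognize(words, start="S", max_depth=4):
    """Nondeterministic left-corner recognition with a cap on stack depth.

    Stack items are either a complete category "A" (a string) or an
    incomplete category ("A", "B"): an A still awaiting a B on its right.
    The depth cap stands in for a bound on how many such items working
    memory can hold; the value 4 is an assumption for this toy grammar.
    """
    start_state = ((), 0)                 # (stack, index of next word)
    agenda, seen = deque([start_state]), {start_state}
    while agenda:
        stack, i = agenda.popleft()
        if i == len(words) and stack == (start,):
            return True                   # all input consumed, one complete start symbol
        succs = []
        # SHIFT: read the next word as its lexical category.
        if i < len(words) and words[i] in LEXICON:
            succs.append((stack + (LEXICON[words[i]],), i + 1))
        if stack and isinstance(stack[-1], str):
            top = stack[-1]
            # PROJECT: a rule A -> top C turns the complete top into incomplete A/C.
            for lhs, needed in RULES.get(top, []):
                succs.append((stack[:-1] + ((lhs, needed),), i))
            # COMPLETE: a complete top fills the incomplete item just below it.
            if len(stack) >= 2 and isinstance(stack[-2], tuple) and stack[-2][1] == top:
                succs.append((stack[:-2] + (stack[-2][0],), i))
        for state in succs:
            if len(state[0]) <= max_depth and state not in seen:
                seen.add(state)
                agenda.append(state)
    return False


if __name__ == "__main__":
    print(recognize("the dog chased the cat".split()))  # True
    print(recognize("the dog the cat chased".split()))  # False
```

In this sketch, lowering `max_depth` to 3 rejects even the grammatical example, since the final shift needs four stack items; the cap acts as a hard working-memory constraint on the search. The paper's model instead enforces its depth constraint within a broad-coverage probabilistic sequence model rather than by pruning a symbolic search like this one.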

Original language: English (US)
Pages (from-to): 522-540
Number of pages: 19
Journal: Topics in Cognitive Science
Volume: 5
Issue number: 3
DOIs
State: Published - Jul 2013

Keywords

  • Computational linguistics
  • Memory models
  • Parsing
  • Sequence models
  • Working memory
