Information-Theoretic Neural Decoding Reproduces Several Laws of Human Behavior

S. Thomas Christie, Hayden R. Johnson, Paul R. Schrater

Research output: Contribution to journal › Article › peer-review

Abstract

Human response times conform to several regularities including the Hick-Hyman law, the power law of practice, speed-accuracy trade-offs, and the Stroop effect. Each of these has been thoroughly modeled in isolation, but no account describes these phenomena as predictions of a unified framework. We provide such a framework and show that the phenomena arise as decoding times in a simple neural rate code with an entropy stopping threshold. Whereas traditional information-theoretic encoding systems exploit task statistics to optimize encoding strategies, we move this optimization to the decoder, treating it as a Bayesian ideal observer that can track transmission statistics as prior information during decoding. Our approach allays prominent concerns that applying information-theoretic perspectives to modeling brain and behavior requires complex encoding schemes that are incommensurate with neural encoding.
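The decoding mechanism described in the abstract can be illustrated with a minimal simulation. The sketch below is an assumption-laden toy model, not the authors' implementation: each of N symbols drives its own spiking channel (Bernoulli spikes per timestep, a discrete-time stand-in for a Poisson rate code), a Bayesian decoder updates a posterior over symbols from the spike pattern, and decoding stops once posterior entropy drops below a threshold. Under these assumptions, decoding time grows with the number of alternatives, in the spirit of the Hick-Hyman law.

```python
import math
import random


def decode_time(n_symbols, p_high=0.6, p_low=0.2, h_stop=0.1,
                seed=0, max_steps=10_000):
    """Bayesian rate-code decoding with an entropy stopping rule (toy model).

    The true symbol's channel spikes with probability p_high per step;
    all other channels spike with probability p_low. The decoder starts
    from a uniform prior, updates the posterior over symbols after each
    spike pattern, and stops when posterior entropy (in bits) falls
    below h_stop. Returns the number of steps taken (decoding time).
    """
    rng = random.Random(seed)
    true = rng.randrange(n_symbols)
    post = [1.0 / n_symbols] * n_symbols  # uniform prior over symbols

    for t in range(1, max_steps + 1):
        # Sample one timestep of spikes from every channel.
        spikes = [rng.random() < (p_high if i == true else p_low)
                  for i in range(n_symbols)]
        # Bayes update: likelihood of the observed pattern under each
        # hypothesis about which symbol is being transmitted.
        for h in range(n_symbols):
            lik = 1.0
            for i, s in enumerate(spikes):
                p = p_high if i == h else p_low
                lik *= p if s else (1.0 - p)
            post[h] *= lik
        z = sum(post)
        post = [p / z for p in post]
        # Stop as soon as the posterior is sufficiently certain.
        entropy = -sum(p * math.log2(p) for p in post if p > 0)
        if entropy < h_stop:
            return t
    return max_steps
```

Averaging `decode_time` over seeds for increasing `n_symbols` shows mean decoding time rising with set size; the channel probabilities and threshold here are arbitrary illustrative values.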

Original language: English (US)
Pages (from-to): 675-690
Number of pages: 16
Journal: Open Mind
Volume: 7
DOIs
State: Published - 2023

Bibliographical note

Publisher Copyright:
© 2023 Massachusetts Institute of Technology.

Keywords

  • Hick-Hyman law
  • information theory
  • power law of practice
  • rate coding
  • response times

PubMed: MeSH publication types

  • Journal Article
