Abstract
Human response times conform to several regularities including the Hick-Hyman law, the power law of practice, speed-accuracy trade-offs, and the Stroop effect. Each of these has been thoroughly modeled in isolation, but no account describes these phenomena as predictions of a unified framework. We provide such a framework and show that the phenomena arise as decoding times in a simple neural rate code with an entropy stopping threshold. Whereas traditional information-theoretic encoding systems exploit task statistics to optimize encoding strategies, we move this optimization to the decoder, treating it as a Bayesian ideal observer that can track transmission statistics as prior information during decoding. Our approach allays prominent concerns that applying information-theoretic perspectives to modeling brain and behavior requires complex encoding schemes that are incommensurate with neural encoding.
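Below is a minimal sketch, not the paper's implementation, of the mechanism the abstract describes: a Bayesian ideal-observer decoder accumulates Poisson spike counts from a rate code and stops once the posterior entropy over candidate signals falls below a threshold, with decoding time measured as the number of accumulation steps. The firing rates, entropy threshold, and trial counts are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def decoding_time(rates, true_idx, prior, entropy_threshold=0.5):
    """Accumulate spike counts until posterior entropy (in bits) < threshold."""
    log_post = np.log(prior)                 # decoder starts from its prior
    for t in range(1, 10_000):
        k = rng.poisson(rates[true_idx])     # one time-step's spike count
        # Poisson log-likelihood of k under each candidate rate
        # (the log k! term is constant across candidates and drops out)
        log_post += k * np.log(rates) - rates
        log_post -= log_post.max()           # rescale for numerical stability
        post = np.exp(log_post)
        post /= post.sum()
        if -np.sum(post * np.log2(post + 1e-12)) < entropy_threshold:
            return t, int(post.argmax())
    return t, int(post.argmax())

# Hick-Hyman-style prediction: mean decoding time grows with the entropy
# of the signal set, roughly log2(N) under a uniform prior.
for n in (2, 4, 8):
    rates = 1.0 + 2.0 * np.arange(n)         # one distinct firing rate per signal
    prior = np.full(n, 1.0 / n)              # uniform prior over signals
    times = [decoding_time(rates, rng.integers(n), prior)[0]
             for _ in range(200)]
    print(f"N={n}: mean decoding time = {np.mean(times):.1f} steps")
```

In this sketch, skewing the prior toward frequently transmitted signals shortens their decoding times, which illustrates the decoder-side optimization the abstract describes: the ideal observer exploits transmission statistics as prior information rather than requiring an optimized encoding scheme.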
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 675-690 |
| Number of pages | 16 |
| Journal | Open Mind |
| Volume | 7 |
| DOIs | |
| State | Published - 2023 |
Bibliographical note
Publisher Copyright: © 2023 Massachusetts Institute of Technology.
Keywords
- Hick-Hyman law
- information theory
- power law of practice
- rate coding
- response times
PubMed: MeSH publication types
- Journal Article