Human generalization of internal representations through prototype learning with goal-directed attention

Warren Woodrich Pettine, Dhruva Venkita Raman, A. David Redish, John D. Murray

Research output: Contribution to journal › Article › peer-review


Abstract

The world is overabundant with feature-rich information obscuring the latent causes of experience. How do people approximate the complexities of the external world with simplified internal representations that generalize to novel examples or situations? Theories suggest that internal representations could be determined by decision boundaries that discriminate between alternatives, or by distance measurements against prototypes and individual exemplars. Each provides advantages and drawbacks for generalization. We therefore developed theoretical models that leverage both discriminative and distance components to form internal representations via action-reward feedback. We then developed three latent-state learning tasks to test how humans use goal-oriented discriminative attention and prototype/exemplar representations. The majority of participants attended to both goal-relevant discriminative features and the covariance of features within a prototype. A minority of participants relied only on the discriminative feature. Behaviour of all participants could be captured by parameterizing a model combining prototype representations with goal-oriented discriminative attention.
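As a rough illustration of the kind of model the abstract describes, a prototype representation with goal-directed attention can be sketched as an attention-weighted distance computation followed by a similarity-based choice rule. This is a minimal hypothetical sketch, not the authors' parameterization: the function name, the city-block distance, the exponential similarity, and the example attention weights are all assumptions for illustration.

```python
import numpy as np

def prototype_choice_probs(stimulus, prototypes, attention, sensitivity=1.0):
    """Hypothetical attention-weighted prototype model (illustrative only).

    stimulus:   feature vector for the current item
    prototypes: list of learned prototype feature vectors, one per category
    attention:  per-feature attention weights (goal-directed attention would
                concentrate weight on goal-relevant discriminative features)
    sensitivity: scales how sharply similarity falls off with distance
    """
    # Attention-weighted city-block distance to each prototype
    dists = np.array([np.dot(attention, np.abs(stimulus - p)) for p in prototypes])
    # Exponential-decay similarity as a function of distance
    sims = np.exp(-sensitivity * dists)
    # Normalize similarities into choice probabilities (Luce choice rule)
    return sims / sims.sum()

# Example: attention concentrated on the first (discriminative) feature
stim = np.array([1.0, 0.0, 1.0])
protos = [np.array([1.0, 1.0, 1.0]), np.array([0.0, 0.0, 0.0])]
attn = np.array([0.7, 0.1, 0.2])
print(prototype_choice_probs(stim, protos, attn))
```

Under this sketch, shifting attention weight onto the discriminative feature reproduces the qualitative contrast the abstract draws: a purely discriminative strategy corresponds to near-zero weight on all other features, while the majority strategy spreads some weight across the covarying features of the prototype.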

Original language: English (US)
Pages (from-to): 442-463
Number of pages: 22
Journal: Nature Human Behaviour
Volume: 7
Issue number: 3
DOIs
State: Published - Mar 2023

Bibliographical note

Publisher Copyright:
© 2023, The Author(s), under exclusive licence to Springer Nature Limited.

PubMed: MeSH publication types

  • Journal Article
  • Research Support, Non-U.S. Gov't
  • Research Support, N.I.H., Extramural

