Neurocomputational mechanisms contributing to auditory perception

Yale E. Cohen, Taku Banno, Jaejin Lee, Francisco Rodriguez-Campos, Matthew Schaff, Lalitta Suriya-Arunroj, Joji Tsunada

Research output: Contribution to journal › Article › peer-review

Abstract

A fundamental scientific question in auditory neuroscience is identifying the mechanisms required by the brain to transform an unlabeled mixture of auditory stimuli into distinct and coherent perceptual representations. This transformation is often called "auditory-scene analysis". Auditory-scene analysis consists of a complex interaction of multiple neurocomputational processes, including Gestalt grouping mechanisms, attention, and perceptual decision-making. Despite a great deal of scientific energy devoted to understanding these aspects of hearing, we still do not understand (1) how sound perception arises from neural activity and (2) the causal relationship between neural activity and sound perception. Several lines of evidence indicate that the ventral auditory pathway plays a prominent role in auditory perception and decision-making. Here, we review the contribution of the ventral pathway to auditory perception and put forth challenges to the field to further our understanding of the relationship between neural activity in the ventral pathway and perception.

Original language: English (US)
Pages (from-to): 870-873
Number of pages: 4
Journal: Acta Acustica united with Acustica
Volume: 104
Issue number: 5
DOIs:
State: Published - Sep 1 2018
Externally published: Yes

Bibliographical note

Funding Information:
This work was supported by grants from the NIH, ARO, and ONR.

Publisher Copyright:
© 2018 The Author(s).
