Modality-independent coding of scene categories in prefrontal cortex

Yaelan Jung, Bart Larsen, Dirk B. Walther

Research output: Contribution to journal › Article › peer-review


Abstract

Natural environments convey information through multiple sensory modalities, all of which contribute to people’s percepts. Although it has been shown that visual or auditory content of scene categories can be decoded from brain activity, it remains unclear how humans represent scene information beyond a specific sensory modality domain. To address this question, we investigated how categories of scene images and sounds are represented in several brain regions. A group of healthy human subjects (both sexes) participated in the present study, where their brain activity was measured with fMRI while viewing images or listening to sounds of different real-world environments. We found that both visual and auditory scene categories can be decoded not only from modality-specific areas, but also from several brain regions in the temporal, parietal, and prefrontal cortex (PFC). Intriguingly, only in the PFC, but not in any other regions, categories of scene images and sounds appear to be represented in similar activation patterns, suggesting that scene representations in PFC are modality-independent. Furthermore, the error patterns of neural decoders indicate that category-specific neural activity patterns in the middle and superior frontal gyri are tightly linked to categorization behavior. Our findings demonstrate that complex scene information is represented at an abstract level in the PFC, regardless of the sensory modality of the stimulus.

Original language: English (US)
Pages (from-to): 5969-5981
Number of pages: 13
Journal: Journal of Neuroscience
Volume: 38
Issue number: 26
DOI: 10.1523/JNEUROSCI.0272-18.2018
State: Published - Jun 27, 2018
Externally published: Yes

Bibliographical note

Funding Information:
Received Jan. 26, 2018; revised May 3, 2018; accepted May 26, 2018. Author contributions: Y.J. wrote the first draft of the paper; Y.J. and D.B.W. edited the paper; D.B.W. designed research; Y.J., B.L., and D.B.W. performed research; Y.J. and D.B.W. analyzed data; Y.J. and D.B.W. wrote the paper. This work was supported by Natural Sciences and Engineering Research Council Discovery Grant 498390 and Canadian Foundation for Innovation Grant 32896. We thank Michael Mack and Heeyoung Choo for helpful comments on an earlier version of this manuscript. The authors declare no competing financial interests. Correspondence should be addressed to Dr. Yaelan Jung, 100 St. George Street, Toronto, Ontario M5S 3G3, Canada. E-mail: yaelan.jung@mail.utoronto.ca. DOI: 10.1523/JNEUROSCI.0272-18.2018. Copyright © 2018 the authors 0270-6474/18/385969-13$15.00/0.

Publisher Copyright:
© 2018 the authors.

Keywords

  • Cross-modal
  • fMRI
  • Modality-independent representation
  • Multivoxel pattern analysis
  • PFC
  • Scene perception
