An Accumulating Neural Signal Underlying Binocular Rivalry Dynamics

Shaozhi Nie, Sucharit Katyal, Stephen A. Engel

Research output: Contribution to journal › Article › peer-review

Abstract

During binocular rivalry, conflicting images are presented one to each eye and perception alternates stochastically between them. Although percepts are stable between alternations, modeling suggests that the neural signals representing the two images change gradually, and that the durations of stable percepts are determined by the time required for these signals to reach a threshold that triggers an alternation. However, direct physiological evidence for such signals has been lacking. Here, we identify a neural signal in human visual cortex that shows these predicted properties. We measured steady-state visual evoked potentials (SSVEPs) in 84 human participants (62 females, 22 males) who were presented with orthogonal gratings, one to each eye, flickering at different frequencies. Participants indicated their percept while EEG data were collected. The time courses of the SSVEP amplitudes at the two frequencies were then compared across different percept durations, within participants. For all durations, the amplitude of signals corresponding to the suppressed stimulus increased, and the amplitude corresponding to the dominant stimulus decreased, throughout the percept. Critically, longer percepts were characterized by more gradual increases in the suppressed signal and more gradual decreases in the dominant signal. Changes in the signals were similar and rapid at the end of all percepts, presumably reflecting perceptual transitions. These features of the SSVEP time courses are well predicted by a model in which perceptual transitions are produced by the accumulation of noisy signals. Identification of this signal underlying binocular rivalry should allow strong tests of neural models of rivalry, bistable perception, and neural suppression.
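The accumulation-to-threshold account in the abstract can be illustrated with a short simulation. The Python/NumPy sketch below is not the authors' model, and the drift, noise, and threshold parameters are illustrative assumptions rather than fitted values. A noisy signal favoring the suppressed stimulus rises until it crosses a threshold that triggers an alternation; the resulting percept durations are stochastic, and percepts that turn out long show a more gradual early rise of the suppressed signal, qualitatively matching the SSVEP pattern described above.

    import numpy as np

    rng = np.random.default_rng(1)

    dt = 0.001        # simulation time step (s)
    drift = 0.8       # assumed mean rise rate of the suppressed signal (a.u./s)
    noise = 0.5       # assumed noise amplitude (diffusion strength)
    threshold = 1.0   # level at which a perceptual alternation is triggered

    def one_percept(max_t=20.0):
        """Accumulate a noisy signal until it crosses threshold; return the
        trajectory and the resulting percept duration."""
        x, traj = 0.0, [0.0]
        for _ in range(int(max_t / dt)):
            x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
            x = max(x, 0.0)            # keep the signal non-negative (assumption)
            traj.append(x)
            if x >= threshold:
                break
        return np.array(traj), (len(traj) - 1) * dt

    trajs, durs = zip(*(one_percept() for _ in range(2000)))
    durs = np.array(durs)

    # Compare the early rise of the suppressed signal for short vs. long
    # percepts: conditioned on a long duration, the accumulation is shallower.
    short = durs < np.median(durs)
    i_probe = int(0.5 / dt)            # sample the signal 0.5 s into the percept
    early = np.array([tr[min(i_probe, len(tr) - 1)] for tr in trajs])
    print(f"mean duration   short: {durs[short].mean():.2f} s  long: {durs[~short].mean():.2f} s")
    print(f"signal at 0.5 s short: {early[short].mean():.2f}    long: {early[~short].mean():.2f}")

Because the drift is constant, sorting percepts by duration selects for noise histories: short percepts are those the noise pushed up quickly, long percepts those it held down, so the early signal amplitude is lower for long percepts even though all trajectories end with a similarly rapid final rise to threshold.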

Original language: English (US)
Journal: Journal of Neuroscience
Volume: 43
Issue number: 50
DOIs
State: Published - Dec 13 2023

Bibliographical note

Publisher Copyright:
Copyright © 2023 the authors.

PubMed: MeSH publication types

  • Journal Article
