Cortex-wide neural interfacing via transparent polymer skulls

Leila Ghanbari, Russell E. Carter, Matthew L. Rynes, Judith Dominguez, Gang Chen, Anant Naik, Jia Hu, Md Abdul Kader Sagar, Lenora Haltom, Nahom Mossazghi, Madelyn M. Gray, Sarah L. West, Kevin W. Eliceiri, Timothy J. Ebner, Suhasa B. Kodandaramaiah

Research output: Contribution to journal › Article › peer-review


Neural computations occurring simultaneously in multiple cerebral cortical regions are critical for mediating cognition, perception and sensorimotor behaviors. Enormous progress has been made in understanding how neural activity in specific cortical regions contributes to behavior. However, tools that allow simultaneous monitoring and perturbation of neural activity across multiple cortical regions are lacking. To fill this need, we have engineered “See-Shells” – digitally designed, morphologically realistic, transparent polymer skulls that allow long-term (>200 days) optical access to 45 mm² of the dorsal cerebral cortex in the mouse. We demonstrate the ability to perform mesoscopic imaging, as well as cellular- and subcellular-resolution two-photon imaging of neural structures up to 600 µm deep through the See-Shells. See-Shells implanted on transgenic mice expressing genetically encoded calcium (Ca²⁺) indicators allow tracking of neural activity across multiple, non-contiguous regions spread over millimeters of the cortex. Further, neural probes can access the brain through perforated See-Shells, either to perturb or to record neural activity from localized brain regions simultaneously with whole-cortex imaging. As See-Shells can be constructed using readily available desktop fabrication tools and modified to fit a range of skull geometries, they provide a powerful tool for investigating brain structure and function.

Original language: English (US)
Journal: Unknown Journal
State: Published - Aug 7, 2018
