Prefrontal cortex supports speech perception in listeners with cochlear implants

  1. Arefeh Sherafati
  2. Noel Dwyer
  3. Aahana Bajracharya
  4. Mahlega Samira Hassanpour
  5. Adam T Eggebrecht
  6. Jill B Firszt
  7. Joseph P Culver
  8. Jonathan Erik Peelle  Is a corresponding author
  1. Washington University in St. Louis, United States
  2. University of Utah, United States

Abstract

Cochlear implants are neuroprosthetic devices that can restore hearing in people with severe to profound hearing loss by electrically stimulating the auditory nerve. Because of physical limitations on the precision of this stimulation, a cochlear implant does not convey the same level of acoustic detail as normal hearing. As a result, speech understanding in listeners with cochlear implants is typically poorer and more effortful than in listeners with normal hearing. The brain networks supporting speech understanding in listeners with cochlear implants are not well understood, partly due to difficulties obtaining functional neuroimaging data in this population. In the current study, we assessed the brain regions supporting spoken word understanding in adult listeners with right unilateral cochlear implants (n=20) and matched controls (n=18) using high-density diffuse optical tomography (HD-DOT), a quiet and non-invasive imaging modality with spatial resolution comparable to that of functional MRI. We found that while listening to spoken words in quiet, listeners with cochlear implants showed greater activity in the left prefrontal cortex than listeners with normal hearing, specifically in a region engaged in a separate spatial working memory task. These results suggest that listeners with cochlear implants require greater cognitive processing during speech understanding than listeners with normal hearing, supported by compensatory recruitment of the left prefrontal cortex.

Data availability

Stimuli, data, and analysis scripts are available from https://osf.io/nkb5v/.


Article and author information

Author details

  1. Arefeh Sherafati

    Department of Radiology, Washington University in St. Louis, St. Louis, United States
    Competing interests
    No competing interests declared.
    ORCID: 0000-0003-2543-0851
  2. Noel Dwyer

    Department of Otolaryngology, Washington University in St. Louis, St. Louis, United States
    Competing interests
    No competing interests declared.
  3. Aahana Bajracharya

    Department of Otolaryngology, Washington University in St. Louis, St. Louis, United States
    Competing interests
    No competing interests declared.
  4. Mahlega Samira Hassanpour

    Moran Eye Center, University of Utah, Salt Lake City, United States
    Competing interests
    No competing interests declared.
  5. Adam T Eggebrecht

    Department of Radiology, Washington University in St. Louis, St. Louis, United States
    Competing interests
    No competing interests declared.
  6. Jill B Firszt

    Department of Otolaryngology, Washington University in St. Louis, St. Louis, United States
    Competing interests
    No competing interests declared.
  7. Joseph P Culver

    Department of Radiology, Washington University in St. Louis, St. Louis, United States
    Competing interests
    No competing interests declared.
  8. Jonathan Erik Peelle

    Department of Otolaryngology, Washington University in St. Louis, St. Louis, United States
    For correspondence
    jpeelle@wustl.edu
    Competing interests
    Jonathan Erik Peelle, Reviewing editor, eLife.
    ORCID: 0000-0001-9194-854X

Funding

National Institutes of Health (R21DC015884)

  • Jonathan Erik Peelle

National Institutes of Health (R21DC016086)

  • Jonathan Erik Peelle

National Institutes of Health (K01MH103594)

  • Adam T Eggebrecht

National Institutes of Health (R21MH109775)

  • Adam T Eggebrecht

National Institutes of Health (R01NS090874)

  • Joseph P Culver

National Institutes of Health (R01NS109487)

  • Joseph P Culver

National Institutes of Health (R21DC015884)

  • Joseph P Culver

National Institutes of Health (R21DC016086)

  • Joseph P Culver

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Ethics

Human subjects: All subjects were native speakers of English with no self-reported history of neurological or psychiatric disorders. All aspects of these studies were approved by the Human Research Protection Office (HRPO) of the Washington University School of Medicine. Subjects were recruited from the Washington University campus and the surrounding community (IRB 201101896, IRB 201709126). All subjects gave informed consent and were compensated for their participation in accordance with institutional and national guidelines.

Copyright

© 2022, Sherafati et al.

This article is distributed under the terms of the Creative Commons Attribution License permitting unrestricted use and redistribution provided that the original author and source are credited.

Metrics

  • 1,690
    views
  • 353
    downloads
  • 16
    citations

Views, downloads and citations are aggregated across all versions of this paper published by eLife.

Cite this article

  1. Arefeh Sherafati
  2. Noel Dwyer
  3. Aahana Bajracharya
  4. Mahlega Samira Hassanpour
  5. Adam T Eggebrecht
  6. Jill B Firszt
  7. Joseph P Culver
  8. Jonathan Erik Peelle
(2022)
Prefrontal cortex supports speech perception in listeners with cochlear implants
eLife 11:e75323.
https://doi.org/10.7554/eLife.75323

