Neural signatures of auditory hypersensitivity following acoustic trauma

Abstract

Neurons in sensory cortex exhibit a remarkable capacity to maintain stable firing rates despite large fluctuations in afferent activity levels. However, sudden peripheral deafferentation in adulthood can trigger an excessive, non-homeostatic cortical compensatory response that may underlie perceptual disorders including sensory hypersensitivity, phantom limb pain, and tinnitus. Here, we show that mice with noise-induced damage of the high-frequency cochlear base were behaviorally hypersensitive to spared mid-frequency tones and to direct optogenetic stimulation of auditory thalamocortical neurons. Chronic two-photon calcium imaging of auditory cortex (ACtx) pyramidal neurons (PyrNs) revealed an initial stage of spatially diffuse hyperactivity, hyper-correlation, and auditory hyperresponsivity that consolidated around deafferented map regions three or more days after acoustic trauma. Deafferented PyrN ensembles also displayed hypersensitive decoding of spared mid-frequency tones that mirrored behavioral hypersensitivity, suggesting that non-homeostatic regulation of cortical sound intensity coding following sensorineural loss may be an underlying source of auditory hypersensitivity. Excess cortical response gain after acoustic trauma was expressed heterogeneously among individual PyrNs, yet 40% of this variability could be accounted for by each cell's baseline response properties prior to acoustic trauma. PyrNs with initially high spontaneous activity and gradual monotonic intensity growth functions were more likely to exhibit non-homeostatic excess gain after acoustic trauma. These findings suggest that while cortical gain changes are triggered by reduced bottom-up afferent input, their subsequent stabilization is also shaped by the local circuit milieu, where indicators of reduced inhibition can presage pathological hyperactivity following sensorineural hearing loss.

Data availability

All figure code and data will be available on the Harvard Dataverse: doi:10.7910/DVN/JLIKOZ
Article and author information

Author details

  1. Matthew McGill

    Division of Medical Sciences, Harvard Medical School, Boston, United States
    For correspondence
    mmcgill@g.harvard.edu
    Competing interests
    The authors declare that no competing interests exist.
ORCID iD: 0000-0003-2322-9580
  2. Ariel E Hight

    Eaton-Peabody Laboratories, Massachusetts Eye and Ear Infirmary, Boston, United States
    Competing interests
    The authors declare that no competing interests exist.
  3. Yurika L Watanabe

    Eaton-Peabody Laboratories, Massachusetts Eye and Ear Infirmary, Boston, United States
    Competing interests
    The authors declare that no competing interests exist.
  4. Aravindakshan Parthasarathy

    Eaton-Peabody Laboratories, Massachusetts Eye and Ear Infirmary, Boston, United States
    Competing interests
    The authors declare that no competing interests exist.
  5. Dongqin Cai

    Eaton-Peabody Laboratories, Massachusetts Eye and Ear Infirmary, Boston, United States
    Competing interests
    The authors declare that no competing interests exist.
  6. Kameron Clayton

    Eaton-Peabody Laboratories, Massachusetts Eye and Ear Infirmary, Boston, United States
    Competing interests
    The authors declare that no competing interests exist.
  7. Kenneth E Hancock

Eaton-Peabody Laboratories, Massachusetts Eye and Ear Infirmary, Boston, United States
    Competing interests
    The authors declare that no competing interests exist.
  8. Anne Takesian

    Eaton-Peabody Laboratories, Massachusetts Eye and Ear Infirmary, Boston, United States
    Competing interests
    The authors declare that no competing interests exist.
  9. Sharon G Kujawa

    Department of Otolaryngology, Harvard Medical School, Boston, United States
    Competing interests
    The authors declare that no competing interests exist.
  10. Daniel B Polley

    Eaton-Peabody Laboratories, Massachusetts Eye and Ear Infirmary, Boston, United States
    Competing interests
    The authors declare that no competing interests exist.
ORCID iD: 0000-0002-5120-2409

Funding

National Institute on Deafness and Other Communication Disorders (DC018974-02)

  • Matthew McGill

National Institute on Deafness and Other Communication Disorders (DC014871)

  • Ariel E Hight

Nancy Lurie Marks Family Foundation

  • Anne Takesian
  • Daniel B Polley

National Institute on Deafness and Other Communication Disorders (DC009836)

  • Daniel B Polley

National Institute on Deafness and Other Communication Disorders (DC015857)

  • Sharon G Kujawa
  • Daniel B Polley

National Institute on Deafness and Other Communication Disorders (DC018353)

  • Anne Takesian

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Ethics

Animal experimentation: All procedures were approved by the Massachusetts Eye and Ear Animal Care and Use Committee and followed the guidelines established by the National Institutes of Health for the care and use of laboratory animals.

Human subjects: The study was approved by the human subjects Institutional Review Board at Mass General Brigham and Massachusetts Eye and Ear. Data analysis was performed on de-identified data, in accordance with the relevant guidelines and regulations.

Copyright

© 2022, McGill et al.

This article is distributed under the terms of the Creative Commons Attribution License, permitting unrestricted use and redistribution provided that the original author and source are credited.

Metrics

  • 2,259 views
  • 352 downloads
  • 27 citations

Views, downloads and citations are aggregated across all versions of this paper published by eLife.

Cite this article

Matthew McGill, Ariel E Hight, Yurika L Watanabe, Aravindakshan Parthasarathy, Dongqin Cai, Kameron Clayton, Kenneth E Hancock, Anne Takesian, Sharon G Kujawa, Daniel B Polley (2022) Neural signatures of auditory hypersensitivity following acoustic trauma. eLife 11:e80015. https://doi.org/10.7554/eLife.80015

