Deep learning based feature extraction for prediction and interpretation of sharp-wave ripples in the rodent hippocampus

  1. Andrea Navas-Olive
  2. Rodrigo Amaducci
  3. Maria-Teresa Jurado-Parras
  4. Enrique R Sebastian
  5. Liset M de la Prida (corresponding author)
  1. Instituto Cajal, Spain
  2. Universidad Autónoma de Madrid, Spain

Abstract

Local field potential (LFP) deflections and oscillations define hippocampal sharp-wave ripples (SWR), one of the most synchronous events in the brain. SWR reflect firing and synaptic current sequences emerging from cognitively relevant neuronal ensembles. While spectral analysis has permitted advances, the surge of ultra-dense recordings now calls for new automatic detection strategies. Here, we show how one-dimensional convolutional networks operating over high-density hippocampal LFP recordings allowed for automatic identification of SWR from the rodent hippocampus. When applied without retraining to new datasets and ultra-dense hippocampus-wide recordings, we discovered physiologically relevant processes associated with the emergence of SWR, prompting novel classification criteria. To gain interpretability, we developed a method to interrogate the operation of the artificial network. We found that it relied on feature-based specialization, which permits identification of spatially segregated oscillations and deflections, as well as the synchronous population firing typical of replay. Thus, deep learning-based approaches may change the current heuristic toward a better mechanistic interpretation of these relevant neurophysiological events.
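As a concrete illustration of the approach, the sketch below shows what a one-dimensional convolutional detector of this kind might look like in Python (Keras), applied to sliding windows of multi-channel LFP. The channel count, window length, layer sizes, and detection threshold are illustrative assumptions only and do not reproduce the authors' exact architecture; the trained model itself is available from the repositories listed under Data availability.

    # Minimal sketch of a 1D convolutional SWR detector over multi-channel LFP.
    # Layer sizes, window length, channel count and threshold are assumptions,
    # not the published architecture.
    import numpy as np
    import tensorflow as tf
    from tensorflow.keras import layers, models

    N_CHANNELS = 8        # assumed: 8 contiguous high-density probe channels
    WINDOW_SAMPLES = 40   # assumed: ~32 ms window at 1250 Hz

    def build_detector():
        """1D CNN mapping an LFP window to a ripple probability."""
        model = models.Sequential([
            layers.Input(shape=(WINDOW_SAMPLES, N_CHANNELS)),
            layers.Conv1D(32, kernel_size=5, padding="same", activation="relu"),
            layers.MaxPooling1D(2),
            layers.Conv1D(64, kernel_size=3, padding="same", activation="relu"),
            layers.GlobalMaxPooling1D(),
            layers.Dense(32, activation="relu"),
            layers.Dense(1, activation="sigmoid"),  # P(SWR present in window)
        ])
        model.compile(optimizer="adam", loss="binary_crossentropy",
                      metrics=[tf.keras.metrics.AUC(name="auc")])
        return model

    # Sliding-window inference over a continuous recording (synthetic data here).
    model = build_detector()
    lfp = np.random.randn(12500, N_CHANNELS).astype("float32")   # 10 s at 1250 Hz
    starts = np.arange(0, lfp.shape[0] - WINDOW_SAMPLES, WINDOW_SAMPLES // 2)
    windows = np.stack([lfp[s:s + WINDOW_SAMPLES] for s in starts])
    probs = model.predict(windows, verbose=0).ravel()
    candidate_onsets = starts[probs > 0.7]   # sample indices of candidate SWR windows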

Data availability

Data are deposited in the Figshare repository: https://figshare.com/projects/cnn-ripple-data/117897. The trained model is accessible in the GitHub repositories for both Python (https://github.com/PridaLab/cnn-ripple) and Matlab (https://github.com/PridaLab/cnn-matlab). Code for visualization and detection is demonstrated in an interactive notebook: https://colab.research.google.com/github/PridaLab/cnn-ripple/blob/main/src/notebooks/cnn-example.ipynb. The online-detection Open Ephys plugin is accessible at the GitHub repository: https://github.com/PridaLab/CNNRippleDetectorOEPlugin
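For offline use of a trained detector of this type, a hypothetical workflow might look as follows; the model file name, data file, sampling rate, normalization, and threshold are placeholders introduced here for illustration, not the repository's documented interface, which is described in the notebook linked above.

    # Hypothetical offline-detection sketch; file names, sampling rate,
    # preprocessing and threshold are placeholders, not the repository API.
    import numpy as np
    import tensorflow as tf

    model = tf.keras.models.load_model("cnn-ripple-model.h5")   # placeholder path
    lfp = np.load("lfp_8ch_1250Hz.npy").astype("float32")       # shape: (samples, channels)
    lfp = (lfp - lfp.mean(axis=0)) / lfp.std(axis=0)            # per-channel z-score

    win = model.input_shape[1]                                  # window length expected by the net
    starts = np.arange(0, lfp.shape[0] - win, win // 2)
    batch = np.stack([lfp[s:s + win] for s in starts])
    probs = model.predict(batch, verbose=0).ravel()
    candidate_onsets_s = starts[probs > 0.7] / 1250.0           # candidate SWR onsets (s)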

Article and author information

Author details

  1. Andrea Navas-Olive

    Functional and Systems Neuroscience, Instituto Cajal, Madrid, Spain
    Competing interests
    No competing interests declared.
  2. Rodrigo Amaducci

    Grupo de Neurocomputación Biológica, Universidad Autónoma de Madrid, Madrid, Spain
    Competing interests
    No competing interests declared.
  3. Maria-Teresa Jurado-Parras

    Functional and Systems Neuroscience, Instituto Cajal, Madrid, Spain
    Competing interests
    No competing interests declared.
  4. Enrique R Sebastian

    Functional and Systems Neuroscience, Instituto Cajal, Madrid, Spain
    Competing interests
    No competing interests declared.
  5. Liset M de la Prida

    Functional and Systems Neuroscience, Instituto Cajal, Madrid, Spain
    For correspondence
    lmprida@cajal.csic.es
    Competing interests
    Liset M de la Prida is a Reviewing Editor at eLife.
    ORCID iD: 0000-0002-0160-6472

Funding

Fundación La Caixa (LCF/PR/HR21/52410030)

  • Liset M de la Prida

Ministerio de Educación (FPU17/03268)

  • Andrea Navas-Olive

Universidad Autónoma de Madrid (FPI-UAM-2017)

  • Rodrigo Amaducci

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Ethics

Animal experimentation: All protocols and procedures were performed according to the Spanish legislation (R.D. 1201/2005 and L.32/2007) and the European Communities Council Directive 2003 (2003/65/CE). Experiments and procedures were approved by the Ethics Committee of the Instituto Cajal and the Spanish Research Council (PROEX131-16 and PROEX161-19). All surgical procedures were performed under isoflurane anesthesia and every effort was made to minimize suffering.

Copyright

© 2022, Navas-Olive et al.

This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.

Cite this article

Andrea Navas-Olive, Rodrigo Amaducci, Maria-Teresa Jurado-Parras, Enrique R Sebastian, Liset M de la Prida (2022) Deep learning based feature extraction for prediction and interpretation of sharp-wave ripples in the rodent hippocampus. eLife 11:e77772. https://doi.org/10.7554/eLife.77772
