Hierarchical temporal prediction captures motion processing along the visual pathway

  1. Yosef Singer (corresponding author)
  2. Luke CL Taylor
  3. Ben DB Willmore
  4. Andrew J King (corresponding author)
  5. Nicol S Harper (corresponding author)
  1. University of Oxford, United Kingdom

Abstract

Visual neurons respond selectively to features that become increasingly complex from the eyes to the cortex. Retinal neurons prefer flashing spots of light, primary visual cortical (V1) neurons prefer moving bars, and those in higher cortical areas favor complex features like moving textures. Previously, we showed that V1 simple cell tuning can be accounted for by a basic model implementing temporal prediction - representing features that predict future sensory input from past input (Singer et al., 2018). Here we show that hierarchical application of temporal prediction can capture how tuning properties change across at least two levels of the visual system. This suggests that the brain does not efficiently represent all incoming information; instead, it selectively represents sensory inputs that help in predicting the future. When applied hierarchically, temporal prediction extracts time-varying features that depend on increasingly high-level statistics of the sensory input.
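
A minimal, illustrative sketch of this hierarchical scheme is given below. Each layer learns hidden features of a short window of past inputs that are useful for predicting the next input, and the hidden activity of one layer serves as the input sequence to the layer above. This is a sketch under stated assumptions only: the use of PyTorch, the fully connected encoder, the layer sizes, the sparsity penalty, and the training loop are illustrative choices, not the authors' released implementation (see Data availability below).

```python
# Illustrative sketch of hierarchical temporal prediction (not the authors' exact code).
# Each layer learns hidden features of a short window of past inputs that help predict
# the input at the next time step; the hidden activity of one layer then serves as the
# input sequence to the layer above. Layer sizes, nonlinearity, and the L1 activity
# penalty are illustrative assumptions.
import torch
import torch.nn as nn

class TemporalPredictionLayer(nn.Module):
    def __init__(self, n_input, n_hidden, n_past):
        super().__init__()
        self.n_past = n_past
        self.encode = nn.Linear(n_input * n_past, n_hidden)   # past window -> hidden features
        self.predict = nn.Linear(n_hidden, n_input)           # hidden features -> next input

    def forward(self, past):                                   # past: (batch, n_past, n_input)
        h = torch.relu(self.encode(past.flatten(1)))
        return self.predict(h), h

def train_layer(layer, sequence, n_epochs=5, l1=1e-4, lr=1e-3):
    """Fit one layer to predict sequence[t] from sequence[t - n_past : t]."""
    opt = torch.optim.Adam(layer.parameters(), lr=lr)
    n_past = layer.n_past
    for _ in range(n_epochs):
        for t in range(n_past, sequence.shape[0]):
            past = sequence[t - n_past:t].unsqueeze(0)          # (1, n_past, n_input)
            pred, h = layer(past)
            loss = ((pred - sequence[t]) ** 2).mean() + l1 * h.abs().mean()
            opt.zero_grad(); loss.backward(); opt.step()

def hidden_sequence(layer, sequence):
    """Run a trained layer over the sequence to get its hidden-unit time course."""
    n_past = layer.n_past
    with torch.no_grad():
        hs = [layer(sequence[t - n_past:t].unsqueeze(0))[1].squeeze(0)
              for t in range(n_past, sequence.shape[0])]
    return torch.stack(hs)

# Stack two layers: layer 2 predicts the future of layer 1's hidden activity.
frames = torch.randn(200, 20 * 20)            # toy "movie": 200 flattened 20x20 patches
layer1 = TemporalPredictionLayer(400, 100, n_past=5)
train_layer(layer1, frames)
h1 = hidden_sequence(layer1, frames)
layer2 = TemporalPredictionLayer(100, 50, n_past=5)
train_layer(layer2, h1)
```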

Data availability

All custom code used in this study was implemented in Python. The code for the models and analyses shown in Figures 1-8 and associated sections can be found at https://bitbucket.org/ox-ang/hierarchical_temporal_prediction/src/master/. The V1 neural response data (Ringach et al., 2002) used for comparison with the temporal prediction model in Figure 6 came from http://ringachlab.net/ ("Data & Code", "Orientation tuning in Macaque V1"). The V1 image response data used to test the models included in Figure 9 were downloaded with permission from https://github.com/sacadena/Cadena2019PlosCB (Cadena et al., 2019). The V1 movie response data used to test these models were collected in the Laboratory of Dario Ringach at UCLA and downloaded from https://crcns.org/data-sets/vc/pvc-1 (Nauhaus and Ringach, 2007; Ringach and Nauhaus, 2009). The code for the models and analyses shown in Figure 9 and the associated section can be found at https://github.com/webstorms/StackTP and https://github.com/webstorms/NeuralPred. The movies used for training the models in Figure 9 are available at https://figshare.com/articles/dataset/Natural_movies/24265498.
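
For convenience, a minimal sketch of fetching the released code repositories is given below. The clone URLs are inferred from the browse URLs above by appending the standard `.git` suffix, and the target directory names are arbitrary choices; git must be installed.

```python
# Minimal sketch for fetching the released code repositories listed above.
# Assumes git is installed; clone URLs are inferred from the browse URLs
# (standard .git form) and target directory names are arbitrary.
import subprocess

repos = {
    "hierarchical_temporal_prediction": "https://bitbucket.org/ox-ang/hierarchical_temporal_prediction.git",
    "StackTP": "https://github.com/webstorms/StackTP.git",
    "NeuralPred": "https://github.com/webstorms/NeuralPred.git",
}

for name, url in repos.items():
    subprocess.run(["git", "clone", url, name], check=True)
```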

Article and author information

Author details

  1. Yosef Singer

    Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, United Kingdom
    For correspondence
    yosef.singer@stcatz.ox.ac.uk
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-4480-0574
  2. Luke CL Taylor

    Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, United Kingdom
    Competing interests
    The authors declare that no competing interests exist.
  3. Ben DB Willmore

    Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, United Kingdom
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-2969-7572
  4. Andrew J King

    Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, United Kingdom
    For correspondence
    andrew.king@dpag.ox.ac.uk
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0001-5180-7179
  5. Nicol S Harper

    Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, United Kingdom
    For correspondence
    nicol.harper@dpag.ox.ac.uk
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-7851-4840

Funding

Wellcome Trust (WT108369/Z/2015/Z)

  • Ben DB Willmore
  • Andrew J King
  • Nicol S Harper

University of Oxford Clarendon Fund

  • Yosef Singer
  • Luke CL Taylor

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Copyright

© 2023, Singer et al.

This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.

Cite this article

Yosef Singer, Luke CL Taylor, Ben DB Willmore, Andrew J King, Nicol S Harper (2023) Hierarchical temporal prediction captures motion processing along the visual pathway. eLife 12:e52599. https://doi.org/10.7554/eLife.52599


Further reading

    1. Neuroscience
    Cameron T Ellis, Tristan S Yates ... Nicholas Turk-Browne
    Research Article

    Studying infant minds with movies is a promising way to increase engagement relative to traditional tasks. However, the spatial specificity and functional significance of movie-evoked activity in infants remain unclear. Here, we investigated what movies can reveal about the organization of the infant visual system. We collected fMRI data from 15 awake infants and toddlers aged 5–23 months who attentively watched a movie. The activity evoked by the movie reflected the functional profile of visual areas. Namely, homotopic areas from the two hemispheres responded similarly to the movie, whereas distinct areas responded dissimilarly, especially across dorsal and ventral visual cortex. Moreover, visual maps that typically require time-intensive and complicated retinotopic mapping could be predicted, albeit imprecisely, from movie-evoked activity, both through data-driven analyses (i.e., independent component analysis) at the individual level and by using functional alignment into a common low-dimensional embedding to generalize across participants. These results suggest that the infant visual system is already structured to process dynamic, naturalistic information and that fine-grained cortical organization can be discovered from movie data.

    1. Neuroscience
    Diellor Basha, Amirmohammad Azarmehri ... Igor Timofeev
    Research Article

    Memory consolidation during sleep depends on the interregional coupling of slow waves, spindles, and sharp wave-ripples (SWRs) across the cortex, thalamus, and hippocampus. The reuniens nucleus of the thalamus, linking the medial prefrontal cortex (mPFC) and the hippocampus, may facilitate interregional coupling during sleep. To test this hypothesis, we used intracellular, extracellular unit, and local field potential recordings in anesthetized and head-restrained non-anesthetized cats, as well as computational modelling. Electrical stimulation of the reuniens evoked both antidromic and orthodromic intracellular mPFC responses, consistent with bidirectional functional connectivity between the mPFC, reuniens, and hippocampus in the anesthetized state. The major finding from behaving animals is that, at least during NREM sleep, the hippocampus, reuniens, and mPFC form a functional loop: SWRs facilitate the triggering of thalamic spindles, which later reach the neocortex, and in return, transitions to mPFC UP states increase the probability of hippocampal SWRs and later modulate spindle amplitude. During REM sleep, hippocampal theta activity provides periodic locking of reuniens neuronal firing and strong cross-correlation at the LFP level, but reuniens-mPFC cross-correlation values were relatively low and theta power in the mPFC was low. A neural mass model of this network demonstrates that the strength of bidirectional hippocampo-thalamic connections determines the coupling of oscillations, suggesting a mechanistic link between synaptic weights and the propensity for interregional synchrony. Our results demonstrate functional connectivity in the hippocampo-thalamo-cortical network, but show that the efficacy of this connectivity is modulated by behavioral state.