Flexible and efficient simulation-based inference for models of decision-making

  1. Jan Boelts (corresponding author)
  2. Jan-Matthis Lueckmann
  3. Richard Gao
  4. Jakob H Macke
  1. University of Tübingen, Germany

Abstract

Inferring the parameters of computational models that capture experimental data is a central task in cognitive neuroscience. Bayesian statistical inference methods usually require the ability to evaluate the likelihood of the model; however, for many models of interest in cognitive neuroscience, the associated likelihoods cannot be computed efficiently. Simulation-based inference (SBI) offers a solution to this problem by requiring only access to simulations produced by the model. Previously, Fengler et al., 2021 introduced Likelihood Approximation Networks (LANs), which make it possible to apply SBI to models of decision-making but require billions of simulations for training. Here, we provide a new SBI method that is substantially more simulation-efficient. Our approach, Mixed Neural Likelihood Estimation (MNLE), trains neural density estimators on model simulations to emulate the simulator and is designed to capture both the continuous (e.g., reaction times) and discrete (choices) data of decision-making models. The likelihoods of the emulator can then be used to perform Bayesian parameter inference on experimental data with standard approximate inference methods such as Markov chain Monte Carlo sampling. We demonstrate MNLE on two variants of the drift-diffusion model (DDM) and show that it is substantially more efficient than LANs: MNLE achieves similar likelihood accuracy with six orders of magnitude fewer training simulations and is significantly more accurate than LANs when both are trained with the same budget. This enables researchers to perform SBI on custom-tailored models of decision-making, leading to fast iteration of model design for scientific discovery.
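To make the setting concrete: the DDM mentioned in the abstract is a simulator that maps parameters to mixed data, a continuous reaction time and a discrete choice, which is exactly the output structure MNLE is designed to emulate. The following is a minimal illustrative sketch of such a simulator (an Euler-Maruyama discretization of a basic two-boundary DDM), not the authors' implementation; the function and parameter names are hypothetical.

```python
import math
import random

def simulate_ddm(drift, boundary, start=0.0, ndt=0.3, dt=1e-3, sigma=1.0, seed=None):
    """Simulate one trial of a simple two-boundary drift-diffusion model.

    Returns (choice, reaction_time): choice is 1 if the evidence first hits
    the upper boundary (+boundary), 0 if it hits the lower one (-boundary);
    the reaction time is the decision time plus the non-decision time `ndt`.
    """
    rng = random.Random(seed)
    x = start  # accumulated evidence
    t = 0.0
    while abs(x) < boundary:
        # Euler-Maruyama step of the diffusion dx = drift*dt + sigma*dW
        x += drift * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        t += dt
    choice = 1 if x >= boundary else 0
    return choice, ndt + t
```

Each call yields one (discrete, continuous) pair; a training set for an emulator would consist of many such simulations across parameter settings drawn from the prior.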

Data availability

We implemented MNLE as part of the open source package for SBI, sbi, available at https://github.com/mackelab/sbi. Code for reproducing the results presented here, and tutorials on how to apply MNLE to other simulators using sbi, can be found at https://github.com/mackelab/mnle-for-ddms.
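Once an emulator provides tractable likelihoods, the abstract's final inference step is standard: combine the (approximate) likelihood with a prior and sample the posterior with MCMC. The sketch below shows the idea with a plain random-walk Metropolis-Hastings sampler over a toy one-parameter Gaussian likelihood; it stands in for the emulator's learned likelihood and is not the samplers or models used in the paper.

```python
import math
import random

def metropolis_hastings(log_likelihood, log_prior, theta0, n_samples=2000, step=0.5, seed=0):
    """Random-walk Metropolis-Hastings over a scalar parameter theta."""
    rng = random.Random(seed)
    theta = theta0
    logp = log_likelihood(theta) + log_prior(theta)
    samples = []
    for _ in range(n_samples):
        proposal = theta + rng.gauss(0.0, step)
        logp_prop = log_likelihood(proposal) + log_prior(proposal)
        # Accept with probability min(1, p(proposal)/p(theta)).
        if math.log(rng.random()) < logp_prop - logp:
            theta, logp = proposal, logp_prop
        samples.append(theta)
    return samples

# Toy stand-in for an emulator likelihood: Gaussian around an observed
# summary value, with a flat (improper) prior.
obs = 1.5
log_lik = lambda th: -0.5 * (th - obs) ** 2
log_prior = lambda th: 0.0
draws = metropolis_hastings(log_lik, log_prior, theta0=0.0)
```

In the actual MNLE workflow, `log_lik` would be replaced by the trained neural density estimator's log-likelihood of the observed choices and reaction times, evaluated as a function of the model parameters.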

Article and author information

Author details

  1. Jan Boelts

    University of Tübingen, Tübingen, Germany
    For correspondence
    jan.boelts@uni-tuebingen.de
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0003-4979-7092
  2. Jan-Matthis Lueckmann

    University of Tübingen, Tübingen, Germany
    Competing interests
    The authors declare that no competing interests exist.
  3. Richard Gao

    University of Tübingen, Tübingen, Germany
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0001-5916-6433
  4. Jakob H Macke

    University of Tübingen, Tübingen, Germany
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0001-5154-8912

Funding

Deutsche Forschungsgemeinschaft (SFB 1233)

  • Jan-Matthis Lueckmann
  • Jakob H Macke

Deutsche Forschungsgemeinschaft (SPP 2041)

  • Jan Boelts
  • Jakob H Macke

Deutsche Forschungsgemeinschaft (Germany's Excellence Strategy MLCoE)

  • Jan Boelts
  • Jan-Matthis Lueckmann
  • Richard Gao
  • Jakob H Macke

Bundesministerium für Bildung und Forschung (ADIMEM, FKZ 01IS18052 A-D)

  • Jan-Matthis Lueckmann
  • Jakob H Macke

HORIZON EUROPE Marie Sklodowska-Curie Actions (101030918)

  • Richard Gao

Bundesministerium für Bildung und Forschung (Tübingen AI Center, FKZ 01IS18039A)

  • Jan Boelts
  • Jakob H Macke

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Copyright

© 2022, Boelts et al.

This article is distributed under the terms of the Creative Commons Attribution License permitting unrestricted use and redistribution provided that the original author and source are credited.

Metrics

  • 3,781 views
  • 727 downloads
  • 28 citations

Views, downloads and citations are aggregated across all versions of this paper published by eLife.

Cite this article

Jan Boelts, Jan-Matthis Lueckmann, Richard Gao, Jakob H Macke (2022) Flexible and efficient simulation-based inference for models of decision-making. eLife 11:e77220. https://doi.org/10.7554/eLife.77220

