Scratch-AID: a deep-learning based system for automatic detection of mouse scratching behavior with high accuracy

  1. Huasheng Yu (corresponding author)
  2. Jingwei Xiong
  3. Adam Yongxin Ye
  4. Suna Li Cranfill
  5. Tariq Cannonier
  6. Mayank Gautam
  7. Marina Zhang
  8. Rayan Bilal
  9. Jong-Eun Park
  10. Yuji Xue
  11. Vidhur Polam
  12. Zora Vujovic
  13. Daniel Dai
  14. William Ong
  15. Jasper Ip
  16. Amanda Hsieh
  17. Nour Mimouni
  18. Alejandra Lozada
  19. Medhini Sosale
  20. Alex Ahn
  21. Minghong Ma
  22. Long Ding
  23. Javier Arsuaga
  24. Wenqin Luo (corresponding author)

Affiliations

  1. University of Pennsylvania, United States
  2. University of California, Davis, United States
  3. Howard Hughes Medical Institute, Harvard Medical School, United States
  4. Massachusetts Institute of Technology, United States

Abstract

Mice are the most commonly used model animals for itch research and for the development of anti-itch drugs. Most labs manually quantify mouse scratching behavior to assess itch intensity. This process is labor-intensive and limits large-scale genetic or drug screenings. In this study, we developed a new system, Scratch-AID (Automatic Itch Detection), which can automatically identify and quantify mouse scratching behavior with high accuracy. Our system comprises a custom-designed videotaping box, which ensures high-quality and replicable recording of mouse behavior, and a convolutional recurrent neural network (CRNN) trained on frame-labeled videos of mouse scratching behavior induced by nape injection of chloroquine (CQ). The best trained network achieved 97.6% recall and 96.9% precision on previously unseen test videos. Remarkably, Scratch-AID reliably identified scratching behavior in other major mouse itch models, including the acute cheek model, the histaminergic model, and a chronic itch model. Moreover, our system detected significant differences in scratching behavior between control mice and mice treated with an anti-itch drug. Taken together, we have established a novel deep learning-based system that is ready to replace manual quantification of mouse scratching behavior in different itch models and in drug screens.
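For readers unfamiliar with this class of model, the sketch below shows the general shape of a convolutional recurrent network for per-frame video classification: a CNN encodes each frame, a recurrent layer aggregates frame embeddings over time, and a classifier labels every frame as scratching or not. This is a minimal PyTorch illustration only; the layer sizes, the GRU choice, and the 128×128 frame shape are placeholder assumptions, not the published Scratch-AID architecture (see the paper and the GitHub repository for the actual design).

```python
# Minimal CRNN sketch for per-frame scratching classification.
# Layer sizes and the GRU are illustrative assumptions, not the
# published Scratch-AID network.
import torch
import torch.nn as nn

class ScratchCRNN(nn.Module):
    def __init__(self, hidden_size=128, num_classes=2):
        super().__init__()
        # CNN encoder applied to each grayscale frame independently
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d((4, 4)),
            nn.Flatten(),                   # -> 32 * 4 * 4 = 512 features
        )
        # Recurrent layer aggregates the frame embeddings over time
        self.rnn = nn.GRU(512, hidden_size, batch_first=True)
        # Per-frame classifier: scratching vs. non-scratching
        self.classifier = nn.Linear(hidden_size, num_classes)

    def forward(self, frames):
        # frames: (batch, time, 1, height, width)
        b, t = frames.shape[:2]
        feats = self.encoder(frames.flatten(0, 1))   # (b*t, 512)
        feats = feats.view(b, t, -1)                 # (b, t, 512)
        out, _ = self.rnn(feats)                     # (b, t, hidden)
        return self.classifier(out)                  # (b, t, num_classes)

model = ScratchCRNN()
clip = torch.randn(2, 30, 1, 128, 128)   # two 30-frame grayscale clips
print(model(clip).shape)                 # torch.Size([2, 30, 2])
```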

Data availability

The training and test videos generated during the current study can be downloaded from DRYAD (https://doi.org/10.5061/dryad.mw6m9060s). The code for model training and testing can be downloaded from GitHub (https://github.com/taimeimiaole/Scratch-AID).
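For reference, the recall and precision figures quoted in the abstract are the standard detection metrics over binary per-frame labels. The snippet below is a hypothetical illustration of that computation, not code from the Scratch-AID repository.

```python
# Hypothetical illustration of the recall/precision metrics quoted in
# the abstract, computed over binary per-frame labels (1 = scratching).
# Not taken from the Scratch-AID repository.
def recall_precision(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    recall = tp / (tp + fn) if tp + fn else 0.0     # true scratching frames recovered
    precision = tp / (tp + fp) if tp + fp else 0.0  # predicted scratching frames that are real
    return recall, precision

# Toy example: 6 annotated frames vs. model predictions
print(recall_precision([1, 1, 0, 0, 1, 0], [1, 1, 0, 1, 1, 0]))  # (1.0, 0.75)
```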


Article and author information

Author details

  1. Huasheng Yu

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    For correspondence
    huasheng.yu@pennmedicine.upenn.edu
    Competing interests
    The authors declare that no competing interests exist.
  2. Jingwei Xiong

    Graduate Group in Biostatistics, University of California, Davis, Davis, United States
    Competing interests
    The authors declare that no competing interests exist.
  3. Adam Yongxin Ye

    Program in Cellular and Molecular Medicine, Howard Hughes Medical Institute, Harvard Medical School, Boston, United States
    Competing interests
    The authors declare that no competing interests exist.
  4. Suna Li Cranfill

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-3431-0061
  5. Tariq Cannonier

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
  6. Mayank Gautam

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-7257-5837
  7. Marina Zhang

    Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology, Cambridge, United States
    Competing interests
    The authors declare that no competing interests exist.
  8. Rayan Bilal

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
  9. Jong-Eun Park

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
  10. Yuji Xue

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
  11. Vidhur Polam

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
  12. Zora Vujovic

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
  13. Daniel Dai

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
  14. William Ong

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
  15. Jasper Ip

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0001-9773-1544
  16. Amanda Hsieh

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
  17. Nour Mimouni

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
  18. Alejandra Lozada

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
  19. Medhini Sosale

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
  20. Alex Ahn

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
  21. Minghong Ma

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
  22. Long Ding

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-1716-3848
  23. Javier Arsuaga

    Graduate Group in Biostatistics, University of California, Davis, Davis, United States
    Competing interests
    The authors declare that no competing interests exist.
  24. Wenqin Luo

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    For correspondence
    luow@pennmedicine.upenn.edu
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-2486-807X

Funding

National Science Foundation (DMS-1854770)

  • Javier Arsuaga

National Institutes of Health (R01 NS083702)

  • Wenqin Luo

National Institutes of Health (R34 NS118411)

  • Long Ding
  • Wenqin Luo

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Ethics

Animal experimentation: Mice were housed in the John Morgan animal facility at the University of Pennsylvania. All animal treatments were conducted in accordance with protocols approved by the Institutional Animal Care and Use Committee and the guidelines of the National Institutes of Health (Protocol No. 804886).

Copyright

© 2022, Yu et al.

This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.

Metrics

  • 3,747 views
  • 398 downloads
  • 5 citations

Views, downloads and citations are aggregated across all versions of this paper published by eLife.

Cite this article

Huasheng Yu, Jingwei Xiong, Adam Yongxin Ye, Suna Li Cranfill, Tariq Cannonier, Mayank Gautam, Marina Zhang, Rayan Bilal, Jong-Eun Park, Yuji Xue, Vidhur Polam, Zora Vujovic, Daniel Dai, William Ong, Jasper Ip, Amanda Hsieh, Nour Mimouni, Alejandra Lozada, Medhini Sosale, Alex Ahn, Minghong Ma, Long Ding, Javier Arsuaga, Wenqin Luo (2022) Scratch-AID: a deep-learning based system for automatic detection of mouse scratching behavior with high accuracy. eLife 11:e84042. https://doi.org/10.7554/eLife.84042

