Title: Automated gesture tracking in head-fixed mice
Authors: A Giovannucci, E Pnevmatikakis, B Deverett, T Pereira, J Fondriest, M Brady, S Wang, W Abbas, P Parés, D Masip
Publication Year: 2018
Type: Journal Article

Abstract:
BACKGROUND: The preparation consisting of a head-fixed mouse on a spherical or cylindrical treadmill offers unique advantages in a variety of experimental contexts. Head fixation provides the mechanical stability necessary for optical and electrophysiological recordings and stimulation. Additionally, it can be combined with virtual environments such as T-mazes, enabling these types of recording during diverse behaviors.
NEW METHOD: In this paper we present a low-cost, easy-to-build acquisition system, along with scalable computational methods to quantitatively measure behavior (locomotion and the motion patterns of paws, whiskers, and tail) in head-fixed mice locomoting on cylindrical or spherical treadmills.
EXISTING METHODS: Several custom supervised and unsupervised methods have been developed for measuring behavior in mice. However, to date there is no low-cost, turn-key, general-purpose, and scalable system for acquiring and quantifying behavior in mice.
RESULTS: We benchmark our algorithms against ground-truth data generated either by manual labeling or by simpler methods of feature extraction. We demonstrate that our algorithms achieve good performance in both supervised and unsupervised settings.
CONCLUSIONS: We present a low-cost suite of tools for behavioral quantification, which serve as valuable complements to the recording and stimulation technologies being developed for the head-fixed mouse preparation.

Keywords: Animals; Mice; Locomotion; Mice, Inbred C57BL; Male; Behavior, Animal; Head; Image Interpretation, Computer-Assisted; Supervised Machine Learning; Behavioral Research; Gestures
Journal: J Neurosci Methods
Volume: 300
Pages: 184-195
Date Published: 2018 Apr 15
ISSN: 1872-678X
DOI: 10.1016/j.jneumeth.2017.07.014
Alternate Journal: J Neurosci Methods
PMCID: PMC7957302
PMID: 28728948