DeepPoseKit, a software toolkit for fast and robust animal pose estimation using deep learning.

Journal: eLife
PMID:

Abstract

Quantitative behavioral measurements are important for answering questions across scientific disciplines, from neuroscience to ecology. State-of-the-art deep-learning methods offer major advances in data quality and detail by allowing researchers to automatically estimate locations of an animal's body parts directly from images or videos. However, currently available animal pose estimation methods have limitations in speed and robustness. Here, we introduce a new easy-to-use software toolkit, DeepPoseKit, that addresses these problems using an efficient multi-scale deep-learning model, called Stacked DenseNet, and a fast GPU-based peak-detection algorithm for estimating keypoint locations with subpixel precision. These advances improve processing speed >2x with no loss in accuracy compared to currently available methods. We demonstrate the versatility of our methods with multiple challenging animal pose estimation tasks in laboratory and field settings, including groups of interacting individuals. Our work reduces barriers to using advanced tools for measuring behavior and has broad applicability across the behavioral sciences.
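Illustrative sketch (not from the paper): the "peak-detection algorithm" in the abstract refers to localizing each keypoint at the maximum of a predicted confidence map. One common way to reach subpixel precision, shown below in plain NumPy as an assumption-laden sketch rather than DeepPoseKit's actual GPU implementation, is to take the integer argmax and refine each coordinate by parabolic interpolation over the neighboring confidence values; the function name subpixel_peak and the interpolation scheme are assumptions for illustration.

    import numpy as np

    def subpixel_peak(confmap):
        """Locate the maximum of a 2D confidence map with subpixel precision.

        The integer argmax is refined independently along each axis by
        fitting a parabola through the peak pixel and its two neighbors;
        the parabola's vertex gives an offset in (-0.5, 0.5) per axis.
        """
        h, w = confmap.shape
        y, x = np.unravel_index(np.argmax(confmap), confmap.shape)

        def vertex_offset(lo, mid, hi):
            # Vertex of the parabola through (-1, lo), (0, mid), (1, hi).
            denom = lo - 2.0 * mid + hi
            return 0.0 if denom == 0 else 0.5 * (lo - hi) / denom

        # Skip refinement when the peak lies on the map border.
        dy = vertex_offset(confmap[y - 1, x], confmap[y, x], confmap[y + 1, x]) if 0 < y < h - 1 else 0.0
        dx = vertex_offset(confmap[y, x - 1], confmap[y, x], confmap[y, x + 1]) if 0 < x < w - 1 else 0.0
        return x + dx, y + dy, float(confmap[y, x])

    # Example: a synthetic Gaussian bump peaked between pixel centers.
    yy, xx = np.mgrid[0:64, 0:64]
    confmap = np.exp(-((xx - 30.3) ** 2 + (yy - 41.7) ** 2) / (2 * 2.0 ** 2))
    print(subpixel_peak(confmap))  # approximately (30.3, 41.7, ~1.0)

In a toolkit like DeepPoseKit, such a refinement would presumably run batched on the GPU over one confidence map per body part, which is where the reported speed advantage over CPU-based post-processing would come from.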

Authors

  • Jacob M Graving
    Department of Collective Behaviour, Max Planck Institute of Animal Behavior, Konstanz, Germany.
  • Daniel Chae
    Department of Computer Science, Princeton University, Princeton, United States.
  • Hemal Naik
    Department of Collective Behaviour, Max Planck Institute of Animal Behavior, Konstanz, Germany.
  • Liang Li
Department of Collective Behaviour, Max Planck Institute of Animal Behavior, Konstanz, Germany.
  • Benjamin Koger
    Department of Collective Behaviour, Max Planck Institute of Animal Behavior, Konstanz, Germany.
  • Blair R Costelloe
    Department of Collective Behaviour, Max Planck Institute of Animal Behavior, Konstanz, Germany.
  • Iain D Couzin
Department of Collective Behaviour, Max Planck Institute of Animal Behavior, Konstanz, Germany.