Real-time, low-latency closed-loop feedback using markerless posture tracking

Elife. 2020 Dec 8:9:e61909. doi: 10.7554/eLife.61909.

Abstract

The ability to control a behavioral task or stimulate neural activity based on animal behavior in real time is an important tool for experimental neuroscientists. Ideally, such tools are noninvasive and low-latency, and provide interfaces to trigger external hardware based on posture. Recent advances in pose estimation with deep learning allow researchers to train deep neural networks to accurately quantify a wide variety of animal behaviors. Here, we provide a new DeepLabCut-Live! package that achieves low-latency real-time pose estimation (within 15 ms, >100 FPS), with an additional forward-prediction module that achieves zero-latency feedback and a dynamic-cropping mode that allows for higher inference speeds. We also provide three options for using this tool with ease: (1) a stand-alone GUI (called DLC-Live! GUI), and integration into (2) Bonsai and (3) AutoPilot. Lastly, we benchmarked performance on a wide range of systems so that experimentalists can easily decide what hardware is required for their needs.
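
For illustration only, the sketch below shows one way the DeepLabCut-Live package's DLCLive and Processor interface can be used for frame-by-frame inference; the exported-model path and camera index are placeholders, and the frame-grabbing loop is an assumption rather than the authors' benchmark code.

    # Minimal sketch: grab webcam frames with OpenCV and run
    # DeepLabCut-Live inference on each one.
    # MODEL_DIR and the camera index are placeholders.
    import cv2
    from dlclive import DLCLive, Processor

    MODEL_DIR = "/path/to/exported/DLC/model"  # placeholder: exported DLC model directory

    dlc_proc = Processor()                     # subclass Processor to trigger external hardware on pose
    dlc_live = DLCLive(MODEL_DIR, processor=dlc_proc)

    cap = cv2.VideoCapture(0)                  # placeholder camera index
    ret, frame = cap.read()
    dlc_live.init_inference(frame)             # first call loads the network

    while True:
        ret, frame = cap.read()
        if not ret:
            break
        pose = dlc_live.get_pose(frame)        # array of (x, y, likelihood) per keypoint
        # ... feed `pose` to the Processor / downstream feedback logic here ...

    cap.release()

In practice, the closed-loop feedback described in the abstract is implemented by passing a custom Processor (or by using the Bonsai or AutoPilot integrations) rather than handling poses in a bare loop as above.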

Keywords: DeepLabCut; any animal; computational biology; low-latency; mouse; neuroscience; pose-estimation; real-time tracking; systems biology.

Publication types

  • Research Support, Non-U.S. Gov't
  • Research Support, U.S. Gov't, Non-P.H.S.

MeSH terms

  • Animals
  • Behavior, Animal / physiology
  • Feedback, Physiological / physiology*
  • Mice
  • Neural Networks, Computer
  • Posture / physiology*
  • Software