
SLEAP: A deep learning system for multi-animal pose tracking.

Authors :
Pereira TD, Tabris N, Matsliah A, Turner DM, Li J, Ravindranath S, Papadoyannis ES, Normand E, Deutsch DS, Wang ZY, McKenzie-Smith GC, Mitelut CC, Castro MD, D'Uva J, Kislin M, Sanes DH, Kocher SD, Wang SS, Falkner AL, Shaevitz JW, Murthy M
Source :
Nature methods [Nat Methods] 2022 Apr; Vol. 19 (4), pp. 486-495. Date of Electronic Publication: 2022 Apr 04.
Publication Year :
2022

Abstract

The desire to understand how the brain generates and patterns behavior has driven rapid methodological innovation in tools to quantify natural animal behavior. While advances in deep learning and computer vision have enabled markerless pose estimation in individual animals, extending these to multiple animals presents unique challenges for studies of social behaviors or animals in their natural environments. Here we present Social LEAP Estimates Animal Poses (SLEAP), a machine learning system for multi-animal pose tracking. This system enables versatile workflows for data labeling, model training and inference on previously unseen data. SLEAP features an accessible graphical user interface, a standardized data model, a reproducible configuration system, over 30 model architectures, two approaches to part grouping and two approaches to identity tracking. We applied SLEAP to seven datasets across flies, bees, mice and gerbils to systematically evaluate each approach and architecture, and we compared it with other existing approaches. SLEAP achieves greater accuracy and speeds of more than 800 frames per second, with latencies of less than 3.5 ms at full 1,024 × 1,024 image resolution. This makes SLEAP usable for real-time applications, which we demonstrate by controlling the behavior of one animal on the basis of the tracking and detection of social interactions with another animal.
(© 2022. The Author(s).)
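As a sanity check on the performance figures quoted in the abstract, the throughput and latency numbers are mutually consistent: a sustained rate of more than 800 frames per second implies a mean per-frame processing time under 1.25 ms, which fits inside the reported end-to-end latency of less than 3.5 ms (latency also includes capture and I/O overhead). A minimal arithmetic sketch, using only the numbers stated in the abstract:

```python
# Figures reported in the SLEAP abstract (Pereira et al., Nat Methods 2022).
fps = 800            # sustained throughput, frames per second
latency_ms = 3.5     # reported end-to-end latency per frame, milliseconds

# Mean per-frame processing budget implied by the throughput.
per_frame_ms = 1000 / fps
print(per_frame_ms)  # 1.25 ms per frame

# Consistency check: processing time must fit within the end-to-end latency.
print(per_frame_ms < latency_ms)  # True
```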

Details

Language :
English
ISSN :
1548-7105
Volume :
19
Issue :
4
Database :
MEDLINE
Journal :
Nature methods
Publication Type :
Academic Journal
Accession number :
35379947
Full Text :
https://doi.org/10.1038/s41592-022-01426-1