Artificial intelligence
Computer science
Pose
Artificial neural network
Animal behavior
Deep learning
Computer vision
Graphical user interface
Tracking (education)
Machine learning
Pattern recognition (psychology)
Biology
Psychology
Animal
Pedagogy
Programming language
Authors
Talmo Pereira, Diego Aldarondo, Lindsay Willmore, Mikhail Kislin, Samuel S.-H. Wang, Mala Murthy, Joshua W. Shaevitz
Source
Journal: Nature Methods
[Springer Nature]
Date: 2018-12-20
Volume/issue: 16 (1): 117-125
Citations: 464
Identifier
DOI: 10.1038/s41592-018-0234-5
Abstract
The need for automated and efficient systems for tracking full animal pose has increased with the complexity of behavioral data and analyses. Here we introduce LEAP (LEAP estimates animal pose), a deep-learning-based method for predicting the positions of animal body parts. This framework consists of a graphical interface for labeling body parts and training the network. LEAP offers fast prediction on new data, and training with as few as 100 frames results in 95% of peak performance. We validated LEAP using videos of freely behaving fruit flies and tracked 32 distinct points to describe the pose of the head, body, wings and legs, with an error rate of <3% of body length. We recapitulated reported findings on insect gait dynamics and demonstrated LEAP's applicability for unsupervised behavioral classification. Finally, we extended the method to more challenging imaging situations and videos of freely moving mice.

LEAP is a deep-learning-based approach for the analysis of animal pose. LEAP's graphical user interface facilitates training of the deep network. The authors illustrate the method by analyzing Drosophila and mouse behavior.
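The abstract describes predicting body-part positions with a deep network. A common design for such networks (and the one LEAP uses) is to output one confidence map per body part, with the keypoint taken at the map's peak. The sketch below is a hypothetical illustration of that peak-extraction step only, using synthetic Gaussian maps in place of real network output; it is not code from the LEAP software.

```python
import numpy as np

def peak_coordinates(confidence_maps):
    """Return an (n_parts, 2) array of (row, col) peaks, one per map.

    Each map is a 2-D confidence surface for one body part; the
    predicted keypoint is the location of the maximum value.
    """
    n_parts = confidence_maps.shape[0]
    coords = np.empty((n_parts, 2), dtype=int)
    for i, cmap in enumerate(confidence_maps):
        coords[i] = np.unravel_index(np.argmax(cmap), cmap.shape)
    return coords

# Synthetic stand-in for network output: 3 body parts on a 64x64 frame,
# each map a Gaussian bump centered on the "true" keypoint.
maps = np.zeros((3, 64, 64))
true_pts = [(10, 20), (32, 32), (50, 5)]
yy, xx = np.mgrid[0:64, 0:64]
for m, (r, c) in zip(maps, true_pts):
    m += np.exp(-((yy - r) ** 2 + (xx - c) ** 2) / (2 * 2.0 ** 2))

print(peak_coordinates(maps))  # recovers the three (row, col) keypoints
```

In a real pipeline the maps would come from the trained network's forward pass on a video frame, and sub-pixel refinement around each peak is often added.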