Pose
Computer science
Artificial intelligence
Computer vision
3D pose estimation
Tracking (education)
Pattern recognition (psychology)
Articulated human pose estimation
Estimation
Engineering
Psychology
Education
Systems engineering
Authors
Hao-Shu Fang, Jiefeng Li, Hongyang Tang, Chao Xu, Haoyi Zhu, Yuliang Xiu, Yong-Lu Li, Cewu Lu
Identifier
DOI: 10.1109/tpami.2022.3222784
Abstract
Accurate whole-body multi-person pose estimation and tracking is an important yet challenging topic in computer vision. To capture the subtle actions of humans for complex behavior analysis, whole-body pose estimation including the face, body, hand, and foot is essential over conventional body-only pose estimation. In this article, we present AlphaPose, a system that can perform accurate whole-body pose estimation and tracking jointly while running in real time. To this end, we propose several new techniques: Symmetric Integral Keypoint Regression (SIKR) for fast and fine localization, Parametric Pose Non-Maximum-Suppression (P-NMS) for eliminating redundant human detections, and Pose Aware Identity Embedding for joint pose estimation and tracking. During training, we resort to Part-Guided Proposal Generator (PGPG) and multi-domain knowledge distillation to further improve the accuracy. Our method is able to localize whole-body keypoints accurately and track humans simultaneously given inaccurate bounding boxes and redundant detections. We show a significant improvement over current state-of-the-art methods in both speed and accuracy on COCO-wholebody, COCO, PoseTrack, and our proposed Halpe-FullBody pose estimation dataset. Our model, source codes and dataset are made publicly available at https://github.com/MVIG-SJTU/AlphaPose.
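The abstract mentions Parametric Pose NMS (P-NMS) for eliminating redundant human detections. Its core idea, greedily keeping the highest-scoring pose and suppressing nearby poses whose keypoint similarity exceeds a threshold, can be illustrated with a minimal sketch. This is not the paper's exact parametric formulation; the OKS-like similarity, the `kappa` constant, and the threshold value below are illustrative assumptions.

```python
import numpy as np

def keypoint_similarity(kpts_a, kpts_b, scale, kappa=0.1):
    """OKS-like similarity between two poses (arrays of shape (K, 2)).

    `kappa` is a hypothetical per-keypoint falloff constant, not the
    paper's learned parameters.
    """
    d2 = np.sum((kpts_a - kpts_b) ** 2, axis=1)  # squared joint distances
    return float(np.mean(np.exp(-d2 / (2 * scale ** 2 * kappa ** 2))))

def pose_nms(poses, scores, scale, thresh=0.5):
    """Greedy pose NMS: keep the best-scoring pose, drop similar duplicates.

    poses: array of shape (N, K, 2); scores: array of shape (N,).
    Returns the indices of the poses that are kept.
    """
    order = np.argsort(scores)[::-1]          # highest score first
    suppressed = np.zeros(len(poses), dtype=bool)
    keep = []
    for i in order:
        if suppressed[i]:
            continue
        keep.append(int(i))
        for j in order:                        # suppress similar, lower-scored poses
            if j != i and not suppressed[j]:
                if keypoint_similarity(poses[i], poses[j], scale) > thresh:
                    suppressed[j] = True
    return keep
```

For example, two nearly identical detections of the same person collapse to one kept pose, while a distinct person elsewhere in the image survives.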