Gyroscope
Artificial intelligence
Accelerometer
Computer science
Inertial measurement unit
Sensor fusion
Mean squared error
Simulation
Computer vision
Engineering
Mathematics
Statistics
Operating systems
Aerospace engineering
Authors
Poongavanam Palani, Siddhant Panigrahi, Sai Abhinav Jammi, Asokan Thondiyath
Identifier
DOI: 10.1109/BIBE55377.2022.00035
Abstract
Human upper limb activities have been extensively researched in the fields of biomechanics, rehabilitation, motion tracking, and augmented reality. Tracking joint angles to evaluate flexibility and overall range of motion is a widely accepted technique for human motion analysis. Traditionally, joint angle evaluation is performed with inertial measurement units (IMUs) comprising gyroscopes, accelerometers, and other sensor modalities. However, this approach has inherent limitations, such as accumulating sensor drift and a varying error rate, because these sensors are susceptible to inaccuracies caused by gravity and electrical interference from the surroundings. Alternatives such as magnetic, ultrasound, and marker-based visual sensors are accurate but expensive, which limits their use to clinical settings. Hence, this paper proposes fusing a low-cost vision-based sensor, built on computer vision, with an inertial measurement unit to perform real-time joint angle estimation. The paper presents a robust, low-cost, and portable platform that uses inertial and vision sensors for real-time joint angle tracking during rehabilitative tasks. The proposed platform consists of two IMUs mounted on the upper arm and the forearm. The marker-less vision-based sensor uses the MediaPipe framework to track landmark key points. The observed RMSE is 6.30 degrees for the IMU-based estimate and 7.70 degrees for the vision-based sensor. The efficacy of the proposed methodology was evaluated through simple rehabilitative exercises performed by healthy participants. Statistical analysis of the experimental results demonstrates that the fused sensor output reduces the RMSE by 6.18 degrees and correlates closely with the ground-truth values.
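As a rough illustration of the IMU side of the platform, the sketch below derives an elbow angle from the relative orientation of the two segment-mounted IMUs. The abstract does not describe how the IMU readings are converted to a joint angle, so the quaternion-based formulation, the SciPy dependency, and the sample orientations are assumptions made purely for illustration.

```python
# Sketch: elbow angle from two segment-mounted IMUs, assuming each IMU
# already outputs an orientation quaternion (e.g., from an onboard AHRS).
# This decomposition is an assumption, not the paper's stated method.
import numpy as np
from scipy.spatial.transform import Rotation as R

def elbow_angle(q_upper, q_fore):
    """Angle (degrees) of the relative rotation between the two segments."""
    r_rel = R.from_quat(q_upper).inv() * R.from_quat(q_fore)
    return float(np.degrees(r_rel.magnitude()))  # total rotation angle

# Hypothetical quaternions (x, y, z, w) from the two IMUs
q_upper = [0.0, 0.0, 0.0, 1.0]                          # upper arm at rest
q_fore = R.from_euler("y", 60, degrees=True).as_quat()  # forearm flexed 60 deg
print(f"elbow flexion ~ {elbow_angle(q_upper, q_fore):.1f} deg")
```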
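For the vision side, the abstract names the MediaPipe framework for marker-less landmark tracking. A minimal sketch of estimating elbow flexion from MediaPipe Pose key points follows; the choice of right-arm landmarks (shoulder 12, elbow 14, wrist 16), the 2-D vector-angle formula, and the webcam source are illustrative assumptions rather than the paper's exact pipeline.

```python
# Sketch: per-frame elbow angle from MediaPipe Pose landmarks.
import numpy as np
import cv2
import mediapipe as mp

def joint_angle(a, b, c):
    """Angle at vertex b (degrees) formed by landmarks a-b-c in the image plane."""
    ba = np.array([a.x - b.x, a.y - b.y])
    bc = np.array([c.x - b.x, c.y - b.y])
    cos_t = np.dot(ba, bc) / (np.linalg.norm(ba) * np.linalg.norm(bc))
    return float(np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0))))

cap = cv2.VideoCapture(0)  # low-cost webcam as the vision sensor
with mp.solutions.pose.Pose() as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        result = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.pose_landmarks:
            lm = result.pose_landmarks.landmark
            # Right arm key points: shoulder (12), elbow (14), wrist (16)
            print(f"elbow angle: {joint_angle(lm[12], lm[14], lm[16]):.1f} deg")
cap.release()
```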
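The abstract reports RMSE figures for each modality and for the fused output but does not name the fusion algorithm. The sketch below therefore assumes a fixed-gain complementary blend purely for illustration; ALPHA, the angle traces, and the ground-truth source (e.g., a goniometer recording) are all hypothetical.

```python
# Sketch: blending the two angle estimates and scoring them with RMSE.
import numpy as np

ALPHA = 0.6  # assumed fixed weight favouring the IMU estimate

def fuse(theta_imu, theta_vision, alpha=ALPHA):
    """Weighted blend of the two joint-angle estimates (degrees)."""
    return alpha * np.asarray(theta_imu) + (1.0 - alpha) * np.asarray(theta_vision)

def rmse(estimate, ground_truth):
    """Root-mean-square error against the reference angle trace."""
    err = np.asarray(estimate) - np.asarray(ground_truth)
    return float(np.sqrt(np.mean(err ** 2)))

# Hypothetical per-frame angle traces (degrees) for one exercise trial
theta_imu = [42.1, 55.3, 68.9, 80.2]
theta_vis = [40.5, 57.0, 70.1, 78.8]
theta_ref = [41.0, 56.0, 69.5, 79.5]

fused = fuse(theta_imu, theta_vis)
print(rmse(theta_imu, theta_ref), rmse(theta_vis, theta_ref), rmse(fused, theta_ref))
```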