Learning-based 3D human kinematics estimation using behavioral constraints from activity classification.
Journal:
Nature Communications
PMID:
40216761
Abstract
Inertial measurement units offer a cost-effective, portable alternative to lab-based motion capture systems. However, measuring joint angles and movement trajectories with inertial measurement units is challenging because sensor biases and noise cause signal drift errors that are amplified by numerical integration. Existing approaches use anatomical constraints to reduce drift but require measurements of body parameters. Learning-based approaches show promise but often lack the accuracy needed for broad applications (e.g., strength training). Here, we introduce the Activity-in-the-loop Kinematics Estimator, an end-to-end machine learning model that incorporates human behavioral constraints to improve kinematics estimation from two inertial measurement units. It integrates activity classification with kinematics estimation, exploiting the limited range of movement patterns that occur during specific activities. In dynamic scenarios, our approach achieved trajectory and shoulder joint angle errors under 0.021 m and , respectively, which are 52% and 17% lower than the errors obtained without activity classification. These results highlight accurate motion tracking with a minimal set of inertial measurement units and domain-specific context.
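For context, the abstract's core idea (an end-to-end model in which an activity classifier constrains kinematics estimation from two inertial measurement units) can be illustrated with a minimal multi-task sketch. The code below is a hypothetical illustration, not the authors' published architecture: the GRU encoder, the linear heads, the 12-channel input layout (two IMUs, each with a 3-axis accelerometer and 3-axis gyroscope), the five activity classes, and the six kinematic outputs are all assumptions chosen for clarity.

```python
import torch
import torch.nn as nn

class ActivityInLoopKinematicsSketch(nn.Module):
    """Hypothetical multi-task sketch: a shared encoder over two-IMU signals
    feeds an activity-classification head whose output conditions a
    kinematics (trajectory + joint angle) regression head. All layer sizes
    and structural choices are illustrative assumptions."""

    def __init__(self, n_channels=12, n_activities=5, hidden=128):
        super().__init__()
        # Assumed input layout: 2 IMUs x (3-axis accel + 3-axis gyro) = 12 channels
        self.encoder = nn.GRU(input_size=n_channels, hidden_size=hidden, batch_first=True)
        self.activity_head = nn.Linear(hidden, n_activities)
        # The kinematics head sees encoder features plus soft activity probabilities,
        # so the predicted motion is constrained by the recognized activity.
        self.kinematics_head = nn.Linear(hidden + n_activities, 3 + 3)  # 3D trajectory + 3 shoulder angles (assumed)

    def forward(self, imu_seq):
        feats, _ = self.encoder(imu_seq)              # (batch, time, hidden)
        activity_logits = self.activity_head(feats)   # per-step activity scores
        activity_probs = activity_logits.softmax(dim=-1)
        kin = self.kinematics_head(torch.cat([feats, activity_probs], dim=-1))
        return activity_logits, kin


# Example: a batch of 8 windows, 200 time steps, 12 IMU channels
model = ActivityInLoopKinematicsSketch()
logits, kinematics = model(torch.randn(8, 200, 12))
print(logits.shape, kinematics.shape)  # torch.Size([8, 200, 5]) torch.Size([8, 200, 6])
```

Feeding the soft activity probabilities into the kinematics head is one simple way to realize the behavioral-constraint idea described in the abstract: the kinematics estimate is conditioned on which activity the model believes is being performed, narrowing the space of plausible motions.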