Artificial intelligence-enhanced 3D gait analysis with a single consumer-grade camera.

Journal: Journal of Biomechanics
Published Date:

Abstract

Gait analysis is crucial for diagnosing and monitoring various healthcare conditions, but traditional marker-based motion capture (MoCap) systems require expensive equipment, extensive setup, and trained personnel, limiting their accessibility in clinical and home settings. Markerless systems reduce setup complexity but often require multiple cameras and fixed calibration, and they are not designed for widespread clinical adoption. This study introduces 3DGait, an artificial intelligence-enhanced markerless three-dimensional gait analysis system that operates with a single consumer-grade depth camera, providing a streamlined, accessible alternative. The system integrates advanced machine learning algorithms to produce 49 angular, spatial, and temporal gait biomarkers commonly used in mobility analysis. We validated 3DGait against a marker-based MoCap system (OptiTrack) using 16 trials from 8 healthy adults performing the Timed Up and Go (TUG) test. The system achieved an overall average mean absolute error (MAE) of 2.3°, with all MAEs under 5.2°, and a Pearson's correlation coefficient (PCC) of 0.75 for angular biomarkers. All spatiotemporal biomarkers had errors no greater than 15%. Temporal biomarkers (excluding TUG time) had errors under 0.03 s, corresponding to one video frame at 30 frames per second. These results demonstrate that 3DGait provides clinically acceptable gait metrics relative to marker-based MoCap while eliminating the need for markers, calibration, or fixed camera placement. Its accessible, non-invasive, single-camera design makes it practical for use in non-specialist clinics and home settings, supporting patient monitoring and chronic disease management. Future research will focus on validating 3DGait in diverse populations, including individuals with gait abnormalities, to broaden its clinical applications.
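
The abstract reports agreement between 3DGait and the marker-based reference using MAE and Pearson's correlation. The snippet below is a minimal illustrative sketch (not the authors' code) of how these two metrics could be computed for a single angular biomarker, assuming the markerless and OptiTrack joint-angle time series have already been time-synchronised and resampled to the same number of frames; all variable names and example values are hypothetical.

```python
import numpy as np

def mae_and_pcc(markerless_deg: np.ndarray, mocap_deg: np.ndarray) -> tuple[float, float]:
    """Return (MAE in degrees, Pearson's r) between two equal-length angle series."""
    markerless_deg = np.asarray(markerless_deg, dtype=float)
    mocap_deg = np.asarray(mocap_deg, dtype=float)
    # Mean absolute error between the markerless estimate and the reference.
    mae = float(np.mean(np.abs(markerless_deg - mocap_deg)))
    # Pearson's correlation coefficient from the 2x2 correlation matrix.
    pcc = float(np.corrcoef(markerless_deg, mocap_deg)[0, 1])
    return mae, pcc

# Illustrative example with synthetic knee-flexion-like curves (values are not from the study).
t = np.linspace(0, 2 * np.pi, 100)
mocap = 30 + 25 * np.sin(t)                            # reference (marker-based) angle curve
markerless = mocap + np.random.normal(0, 2, t.size)    # markerless estimate with ~2 deg noise
print(mae_and_pcc(markerless, mocap))
```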

Authors

  • Ling Guo
    Department of Nephrology, Qilu Hospital, Cheeloo College of Medicine, Shandong University, Ji'nan, China. Electronic address: gulixiji@sdu.edu.cn.
  • Richard Chang
    Institute for Infocomm Research (I2R), Agency for Science, Technology and Research (A*STAR), 1 Fusionopolis Way, #21-01, Connexis South Tower, Singapore 138632, Singapore.
  • Jie Wang
  • Amudha Narayanan
    Institute for Infocomm Research (I2R), Agency for Science, Technology and Research (A*STAR), Singapore.
  • Peisheng Qian
  • Mei Chee Leong
    Institute for Infocomm Research (I2R), Agency for Science, Technology and Research (A*STAR), Singapore.
  • Partha Pratim Kundu
    Carecam Pte Ltd., Singapore; Institute for Infocomm Research (I2R), Agency for Science, Technology and Research (A*STAR), Singapore.
  • Sriram Senthilkumar
    Carecam Pte Ltd., Singapore.
  • Sai Chaitanya Garlapati
    Carecam Pte Ltd., Singapore.
  • Elson Ching Kiat Yong
    Carecam Pte Ltd., Singapore.
  • Ramanpreet Singh Pahwa
    Institute for Infocomm Research (I2R), Agency for Science, Technology and Research (A*STAR), 1 Fusionopolis Way, #21-01, Connexis South Tower, Singapore 138632, Singapore.