CLUMM: Contrastive Learning for Unobtrusive Motion Monitoring.

Journal: Sensors (Basel, Switzerland)

Abstract

Traditional approaches to human monitoring and motion recognition often rely on wearable sensors, which, while effective, are obtrusive and cause significant discomfort to workers. More recent approaches employ unobtrusive, real-time sensing with cameras mounted in the manufacturing environment. While these methods generate large volumes of rich data, they require extensive labeling and analysis for machine learning applications, and the cameras frequently capture irrelevant environmental information that can hinder the performance of deep learning algorithms. To address these limitations, this paper introduces a novel framework that leverages contrastive learning to learn rich representations from raw images without manual labeling. The framework mitigates the effect of environmental complexity by focusing on critical joint coordinates relevant to manufacturing tasks, ensuring that the model learns directly from human-specific data rather than the surrounding environment. A custom dataset of human subjects simulating various tasks in a workplace setting is used for training and evaluation. By fine-tuning the learned model on a downstream motion classification task, we achieve up to 90% accuracy, demonstrating the effectiveness of the proposed solution for real-time human motion monitoring.
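
For illustration only, the sketch below shows one common way such a pipeline can be instantiated: joint-coordinate vectors extracted from camera frames are pretrained with a SimCLR-style NT-Xent contrastive objective (no labels), and the resulting encoder is then fine-tuned with a small linear head for motion classification. The architecture, augmentation, joint count, and class count are assumptions for the example, not details taken from the paper.

    # Minimal sketch (not the authors' code): SimCLR-style contrastive pretraining
    # on flattened 2D joint coordinates, followed by a linear fine-tune for motion
    # classes. All sizes, augmentations, and names are illustrative assumptions.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    NUM_JOINTS = 17            # assumed pose-estimator output (COCO-style keypoints)
    FEAT_DIM = NUM_JOINTS * 2  # (x, y) per joint
    NUM_CLASSES = 4            # assumed number of simulated workplace motions

    class Encoder(nn.Module):
        """Maps a flattened joint-coordinate vector to an embedding."""
        def __init__(self, dim=128):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(FEAT_DIM, 256), nn.ReLU(),
                nn.Linear(256, dim),
            )
        def forward(self, x):
            return self.net(x)

    def augment(x):
        """Toy augmentation: jitter joint coordinates with small Gaussian noise."""
        return x + 0.01 * torch.randn_like(x)

    def nt_xent(z1, z2, temperature=0.5):
        """NT-Xent loss over a batch of positive pairs (z1[i], z2[i])."""
        z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2N, d), unit norm
        sim = z @ z.t() / temperature                         # pairwise similarities
        n = z1.size(0)
        mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
        sim = sim.masked_fill(mask, float('-inf'))            # exclude self-pairs
        targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
        return F.cross_entropy(sim, targets)

    encoder = Encoder()
    opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)

    # --- Unlabeled pretraining on pose vectors extracted from camera frames ---
    poses = torch.randn(64, FEAT_DIM)        # stand-in for real joint coordinates
    z1, z2 = encoder(augment(poses)), encoder(augment(poses))
    loss = nt_xent(z1, z2)
    loss.backward()
    opt.step()
    opt.zero_grad()

    # --- Downstream fine-tuning with a small labeled set ---
    classifier = nn.Linear(128, NUM_CLASSES)
    labels = torch.randint(0, NUM_CLASSES, (64,))
    cls_loss = F.cross_entropy(classifier(encoder(poses)), labels)
    cls_loss.backward()   # in practice, optimize classifier (and optionally encoder)

In the actual CLUMM framework, the pose-extraction step, augmentations, projection head, and classifier may differ; the sketch only conveys the label-free pretraining followed by supervised fine-tuning described in the abstract.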

Authors

  • Pius Gyamenah
    The School of Manufacturing Systems and Networks, Ira A. Fulton Schools of Engineering, Arizona State University, Mesa, AZ 85212, USA.
  • Hari Iyer
    The Polytechnic School, Ira A. Fulton Schools of Engineering, Arizona State University, Mesa, AZ 85212, USA.
  • Heejin Jeong
    The Polytechnic School, Ira A. Fulton Schools of Engineering, Arizona State University, Mesa, AZ 85212, USA.
  • Shenghan Guo
    The School of Manufacturing Systems and Networks, Arizona State University, Mesa, AZ 85212, USA.