Enhanced In-Home Human Activity Recognition Using Multimodal Sensing and Spatiotemporal Machine Learning Architecture.

Journal: Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)
PMID:

Abstract

In this research, we present an enhanced human activity recognition (HAR) framework that uses advanced machine learning models incorporating temporal dynamics and leverages multimodal sensor data. Data from wearable wristbands and a real-time location system (RTLS) were used to detect human activities within a home environment. A key advancement is the development of a spatiotemporal machine learning model combining convolutional neural networks (CNN), bidirectional long short-term memory (BiLSTM), and neural structured learning (NSL), which significantly outperforms traditional machine learning baselines such as random forest (RF) and support vector machine (SVM). We also demonstrate the efficacy of sensor fusion: combining IMU and RTLS data achieves an accuracy of 86.21% and an F1 score of 87.40% on routine daily activities, exceeding the performance of each sensor modality used independently. The proposed model paves the way for developing smart environments that can intelligently adapt to the varying routines and behaviors of daily life. Our investigation shows potential applications across diverse domains, including elderly care, smart home technologies, and healthcare monitoring, suggesting the broad applicability and benefits of the developed HAR system.
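The abstract describes fusing IMU and RTLS streams before classification. A minimal sketch of one common approach, feature-level (early) fusion over synchronized time windows, is shown below; the window size, step, and mean/standard-deviation features are illustrative assumptions, not the paper's exact pipeline.

```python
# Hypothetical sketch of feature-level sensor fusion for HAR.
# Assumptions (not from the paper): both streams are time-synchronized 1-D
# signals, and mean/std over fixed windows are used as per-window features.
from statistics import mean, stdev

def windows(stream, size, step):
    """Split a 1-D sensor stream into overlapping fixed-size windows."""
    return [stream[i:i + size] for i in range(0, len(stream) - size + 1, step)]

def features(w):
    """Simple per-window summary features (mean and standard deviation)."""
    return [mean(w), stdev(w)]

def fuse(imu_stream, rtls_stream, size=4, step=2):
    """Early fusion: concatenate IMU and RTLS features computed over the
    same time windows, yielding one fused feature vector per window."""
    imu_w = windows(imu_stream, size, step)
    rtls_w = windows(rtls_stream, size, step)
    return [features(a) + features(b) for a, b in zip(imu_w, rtls_w)]

imu = [0.1, 0.2, 0.15, 0.3, 0.25, 0.4, 0.35, 0.5]   # e.g. acceleration magnitude
rtls = [1.0, 1.1, 1.0, 1.2, 1.3, 1.2, 1.4, 1.5]     # e.g. distance to a room anchor
fused = fuse(imu, rtls)
print(len(fused), len(fused[0]))  # 3 windows, 4 features each
```

Each fused vector would then be fed to a classifier; in the paper's case, the spatiotemporal CNN-BiLSTM-NSL model replaces these hand-crafted features with learned ones, but the windowing-and-fusion structure is analogous.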

Authors

  • Seyyed Mahdi Torabi
  • Mohammad Narimani
  • Edward J Park
    School of Mechatronic Systems Engineering, Simon Fraser University, Surrey, BC, Canada.