Visualizing Inertial Data For Wearable Sensor Based Daily Life Activity Recognition Using Convolutional Neural Network.
Journal:
Annual International Conference of the IEEE Engineering in Medicine and Biology Society. IEEE Engineering in Medicine and Biology Society. Annual International Conference
Published Date:
Jul 1, 2019
Abstract
Nowadays, human activity recognition (HAR) plays a crucial role in the healthcare and wellness domains; for example, HAR serves as a core technology for context-aware systems such as elderly home assistance and care. Despite the promising recognition accuracy achieved through advances in machine learning for classification tasks, most existing HAR approaches, which rely on low-level handcrafted features, cannot fully handle practical activities. Therefore, in this paper, we present an efficient wearable-sensor-based activity recognition method that encodes inertial data into color image data, allowing convolutional neural networks (CNNs) to learn highly discriminative features. The proposed data encoding technique converts tri-axial samples to color pixels and then arranges them into an image-formed representation. Our method reaches a recognition accuracy of over 95% on two challenging activity datasets and outperforms other deep learning-based HAR approaches.
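The abstract describes the encoding only at a high level: each tri-axial sample becomes one color pixel (one axis per channel), and the pixels are arranged into an image for CNN input. Below is a minimal, hypothetical sketch of that general idea; the function name, the min-max normalization, the clipping range, and the square arrangement are all assumptions, not the paper's exact scheme.

```python
import numpy as np

def encode_inertial_to_image(samples, value_range=(-20.0, 20.0)):
    """Encode tri-axial inertial samples of shape (N, 3) as an RGB image.

    Assumed scheme (not from the paper): clip each axis to `value_range`,
    min-max normalize to [0, 255] so that x -> R, y -> G, z -> B, then
    arrange the resulting pixels row-major into a zero-padded square image.
    """
    lo, hi = value_range
    clipped = np.clip(np.asarray(samples, dtype=np.float64), lo, hi)
    pixels = ((clipped - lo) / (hi - lo) * 255.0).astype(np.uint8)  # (N, 3)

    # Pad to the next perfect square so the pixels fill a side x side grid.
    side = int(np.ceil(np.sqrt(len(pixels))))
    padded = np.zeros((side * side, 3), dtype=np.uint8)
    padded[: len(pixels)] = pixels
    return padded.reshape(side, side, 3)

# Usage: a 2-second window at 50 Hz (100 samples) yields a 10x10 RGB image.
window = np.random.uniform(-20, 20, size=(100, 3))
image = encode_inertial_to_image(window)
```

Such an image can then be fed to a standard 2D CNN classifier, letting the network exploit local correlations between temporally adjacent samples and between axes.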