Time-series visual representations for sleep stages classification.

Journal: PLOS ONE
Published Date:

Abstract

Polysomnography is the standard method for sleep stage classification; however, it is costly and requires controlled environments, which can disrupt natural sleep patterns. Smartwatches offer a practical, non-invasive, and cost-effective alternative for sleep monitoring. Equipped with multiple sensors, they allow continuous data collection in home environments, making them valuable for promoting health and improving sleep habits. Traditional methods for sleep stage classification from smartwatch data often rely on raw data or extracted features combined with artificial intelligence techniques. Transforming the time series into visual representations instead enables the application of two-dimensional convolutional neural networks, which excel at image classification tasks. Despite their success in other domains, these methods remain underexplored for sleep stage classification. To address this gap, we evaluated visual representations of time series collected from smartwatch accelerometer and heart rate sensors. Techniques such as the Gramian Angular Field, Recurrence Plots, the Markov Transition Field, and spectrograms were implemented. Additionally, image patching and ensemble methods were applied to enhance classification performance. The results show that the Gramian Angular Field, combined with patching and ensembles, achieved the best performance, exceeding 82% balanced accuracy for two-stage classification and 62% for three-stage classification. A comparison with traditional approaches, conducted under identical conditions, showed that the proposed method outperformed them, with improvements of up to 8 percentage points in two-stage classification and 9 percentage points in three-stage classification. These findings indicate that visual representations effectively capture key sleep patterns, improving classification accuracy and enabling more reliable health monitoring and earlier interventions. Visual representations thus emerge as a competitive and effective approach for sleep stage classification from smartwatch data, paving the way for future research.
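
Illustrative sketch (not the authors' code): the abstract names the Gramian Angular Field, Recurrence Plot, and Markov Transition Field encodings, all of which are available in the open-source pyts library. The snippet below, assuming 30-sample sensor windows, shows how such encodings could be produced; spectrograms can be obtained analogously, e.g. with scipy.signal.spectrogram.

    # Minimal sketch, assuming pyts is installed and each window has 30 samples;
    # this illustrates the named encodings, not the paper's exact pipeline.
    import numpy as np
    from pyts.image import GramianAngularField, MarkovTransitionField, RecurrencePlot

    rng = np.random.default_rng(0)
    epochs = rng.normal(size=(16, 30))   # placeholder (n_windows, n_timestamps) heart-rate windows

    gaf = GramianAngularField(image_size=30, method='summation')
    mtf = MarkovTransitionField(image_size=30, n_bins=8)
    rp = RecurrencePlot(threshold='point', percentage=20)

    # Each transform maps every 1-D window to a 30x30 image.
    gaf_imgs = gaf.fit_transform(epochs)   # shape (16, 30, 30)
    mtf_imgs = mtf.fit_transform(epochs)   # shape (16, 30, 30)
    rp_imgs = rp.fit_transform(epochs)     # shape (16, 30, 30)

The resulting images can then be fed to a standard two-dimensional convolutional neural network.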

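The abstract also mentions image patching and ensembles. One plausible reading, shown here purely as an assumption rather than the paper's exact method, is to split each encoded image into non-overlapping patches, classify each patch, and fuse the per-patch class probabilities by soft voting:

    # Illustrative sketch only: patch splitting plus soft-voting fusion.
    import numpy as np

    def to_patches(img, patch):
        """Split a square (H, W) image into non-overlapping (patch, patch) tiles."""
        h, w = img.shape
        return [img[i:i + patch, j:j + patch]
                for i in range(0, h, patch)
                for j in range(0, w, patch)]

    def soft_vote(prob_list):
        """Average class-probability vectors coming from several patch-level models."""
        return np.mean(prob_list, axis=0)

    img = np.random.default_rng(1).random((30, 30))
    patches = to_patches(img, 15)                      # four 15x15 patches
    # Hypothetical per-patch classifier outputs (two classes, e.g. sleep vs. wake).
    patch_probs = [np.array([0.7, 0.3]) for _ in patches]
    print(soft_vote(patch_probs))                      # ensemble decision over patches
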
Authors

  • Rebeca Padovani Ederli
    Institute of Computing, University of Campinas (Unicamp), Campinas, SP, Brazil.
  • Didier A Vega-Oliveros
  • Aurea Soriano-Vargas
    Departamento Académico de Ciencia de Computación y Datos, Universidad de Ingeniería y Tecnología (UTEC), Peru.
  • Anderson Rocha
    Institute of Computing, University of Campinas (Unicamp), Campinas, SP, Brazil. Email: anderson.rocha@ic.unicamp.br.
  • Zanoni Dias
    Institute of Computing, University of Campinas (Unicamp), Campinas, SP, Brazil.