An ensemble deep learning framework for emotion recognition through wearable devices multi-modal physiological signals.
Journal:
Scientific Reports
PMID:
40383809
Abstract
The widespread availability of miniaturized wearable fitness trackers has enabled the monitoring of various essential health parameters. Using wearable technology for precise emotion recognition during human-computer interaction can facilitate authentic, emotionally aware contextual communication. In this paper, an emotion recognition system is proposed that, for the first time, conducts an experimental analysis of both discrete and dimensional emotion models. An ensemble deep learning architecture consisting of Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) models is employed to effectively capture dynamic temporal dependencies within emotional data sequences. The publicly available EMOGNITION wearable-device database is utilized to facilitate result reproducibility and comparison. The database includes physiological signals recorded with the Samsung Galaxy Watch, the Empatica E4 wristband, and the MUSE 2 Electroencephalogram (EEG) headband for a comprehensive understanding of emotions. A detailed comparison of all three dedicated wearable devices has been carried out to identify nine discrete emotions, exploring three different bio-signal combinations. The Samsung Galaxy Watch and MUSE 2 devices achieve average classification accuracies of 99.14% and 99.41%, respectively. The performance of the Samsung Galaxy Watch is also examined for the two-dimensional Valence-Arousal affective dimensional model, yielding average classification accuracies of 97.81% for Valence and 72.94% for Arousal. The acquired results demonstrate promising outcomes in emotion recognition when compared with state-of-the-art methods.
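The abstract describes an ensemble of LSTM and GRU networks over multi-channel physiological time series. A minimal sketch of one plausible such architecture is shown below in PyTorch, assuming late fusion of the two recurrent branches' final hidden states followed by a linear classifier over the nine discrete emotions; the paper's actual fusion strategy, layer sizes, and preprocessing are not specified in the abstract, so every hyperparameter here (hidden size, window length, channel count) is an illustrative placeholder.

```python
import torch
import torch.nn as nn


class LSTMGRUEnsemble(nn.Module):
    """Illustrative LSTM+GRU ensemble: each branch reads the same
    physiological window; their final hidden states are concatenated
    and classified. Hyperparameters are placeholders, not the paper's."""

    def __init__(self, n_channels: int, hidden: int = 64, n_classes: int = 9):
        super().__init__()
        self.lstm = nn.LSTM(n_channels, hidden, batch_first=True)
        self.gru = nn.GRU(n_channels, hidden, batch_first=True)
        self.fc = nn.Linear(2 * hidden, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, channels), e.g. a windowed bio-signal segment
        _, (h_lstm, _) = self.lstm(x)   # h_lstm: (1, batch, hidden)
        _, h_gru = self.gru(x)          # h_gru:  (1, batch, hidden)
        fused = torch.cat([h_lstm[-1], h_gru[-1]], dim=1)
        return self.fc(fused)           # logits over the 9 emotions


# Example: 4 windows of 100 timesteps over 8 hypothetical signal channels
model = LSTMGRUEnsemble(n_channels=8)
logits = model(torch.randn(4, 100, 8))
```

In this late-fusion arrangement each recurrent branch models temporal dependencies independently, which is one common way to combine LSTM and GRU models in an ensemble; the paper may instead average per-branch predictions or stack layers differently.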