EmoTrans: attention-based emotion recognition using EEG signals and facial analysis with expert validation.
Journal:
Scientific Reports
Published Date:
Jul 1, 2025
Abstract
Emotion recognition from EEG signals and facial analysis has become a key aspect of human-computer interaction and affective computing, enabling researchers to gain insight into human behavior. Classic emotion recognition methods usually rely on controlled stimuli, such as music and images, which limits their ecological validity and scope. This paper proposes the EmoTrans model, which uses the DEAP dataset to analyze physiological signals and facial video recordings. The dataset comprises EEG recordings from 32 participants who each watched 40 one-minute music-video clips, together with facial videos from 22 of them, annotated along four dimensions: valence, arousal, dominance, and liking. To strengthen the model's validity, expert validation was conducted in the form of a survey of psychologists. The model integrates features extracted from EEG signals in the time, frequency, and wavelet domains, as well as facial video data, to provide a comprehensive understanding of emotional states. The proposed EmoTrans architecture achieves accuracies of 89.3%, 87.8%, 88.9%, and 89.1% for arousal, valence, dominance, and liking, respectively, and an overall classification accuracy of 89% for emotions such as happiness, excitement, calmness, and distress. The statistical significance of these performance improvements was confirmed with a paired t-test, which showed that EmoTrans significantly outperforms baseline models. The model was further validated against machine learning and deep learning classifiers and with leave-one-subject-out cross-validation (LOSO-CV). The proposed attention-based architecture effectively prioritizes the most relevant features from the EEG and facial data, pushing the boundaries of emotion classification and offering a more nuanced understanding of human emotions across different states.
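The abstract mentions features drawn from the time, frequency, and wavelet domains of the EEG, evaluation with leave-one-subject-out cross-validation, and a paired t-test against baselines. The sketch below illustrates how such a pipeline could be assembled; it is not the published EmoTrans code, and the variable names (eeg_trials, labels, subject_ids), band definitions, wavelet choice ("db4"), and classifiers are assumptions made purely for illustration.

```python
# Hypothetical sketch of a multi-domain EEG feature pipeline with LOSO-CV
# and a paired t-test. Not the authors' EmoTrans implementation; all names
# and parameter choices below are assumptions for illustration.
import numpy as np
import pywt
from scipy.signal import welch
from scipy.stats import skew, ttest_rel
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

FS = 128  # sampling rate (Hz) of DEAP's preprocessed EEG
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def channel_features(x):
    """Time-, frequency-, and wavelet-domain features for one EEG channel."""
    feats = [x.mean(), x.std(), skew(x)]                # time-domain statistics
    freqs, psd = welch(x, fs=FS, nperseg=2 * FS)        # frequency domain (Welch PSD)
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(np.trapz(psd[mask], freqs[mask]))  # band power per rhythm
    coeffs = pywt.wavedec(x, "db4", level=4)            # wavelet decomposition
    feats.extend(np.sum(c ** 2) for c in coeffs)        # sub-band energies
    return feats

def trial_features(trial):
    """Concatenate per-channel features for one (channels x samples) trial."""
    return np.concatenate([channel_features(ch) for ch in trial])

def loso_scores(clf, X, y, subject_ids):
    """Leave-one-subject-out accuracy: one score per held-out subject."""
    return cross_val_score(clf, X, y, groups=subject_ids, cv=LeaveOneGroupOut())

# Hypothetical usage, assuming eeg_trials (n_trials, n_channels, n_samples),
# binarised labels (e.g. high/low arousal), and subject_ids per trial:
#   X = np.array([trial_features(t) for t in eeg_trials])
#   proposed = loso_scores(RandomForestClassifier(n_estimators=400), X, labels, subject_ids)
#   baseline = loso_scores(LogisticRegression(max_iter=1000), X, labels, subject_ids)
#   t_stat, p_value = ttest_rel(proposed, baseline)  # paired t-test over matched folds
```

Because LOSO-CV produces one accuracy per held-out subject for each model, the per-subject scores form matched pairs, which is what makes the paired t-test in the last line a natural significance check.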