Bipartite Graph Adversarial Network for Subject-Independent Emotion Recognition.

Journal: IEEE Journal of Biomedical and Health Informatics

Abstract

Emotions play a vital role in connecting and sharing with others. However, individuals with emotional disorders face challenges in expressing their emotions, which affects their social lives. Current artificial intelligence tools address this problem by enabling methods that recognize emotions from electroencephalographic (EEG) signals. However, high variability across individuals makes it difficult to develop emotion recognition methods that generalize well across subjects. Previous studies have addressed this issue using domain adversarial neural networks (DANN), which minimize differences in EEG signals among individuals. Although DANN has shown potential to reduce domain variance, previous studies have scarcely explored the inclusion of layer-specific components to advance further toward that goal. This study addressed this limitation by incorporating bipartite (BP) graphs into a DANN architecture to further reduce variability. We evaluated our model on five benchmark datasets for emotion recognition (SEED, SEED-IV, SEED-V, SEED-FRA, and SEED-GER), comprising a total of 62 individuals. Our model yielded accuracies of 82.1%, 77.3%, 85.8%, 90.7%, and 87.6% for the SEED-V, SEED-IV, SEED, SEED-FRA, and SEED-GER datasets, respectively. Notably, these accuracies are higher than or comparable to those of current state-of-the-art models. Furthermore, our model identified the frontal, temporal, and parietal EEG channels as crucial for detecting emotions evoked by audiovisual stimuli.
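The core DANN mechanism the abstract refers to is adversarial alignment of subject (domain) distributions via a gradient reversal layer: the forward pass is the identity, while the backward pass flips (and scales) the gradient flowing from the domain discriminator into the shared feature extractor, pushing it toward subject-invariant features. The minimal NumPy sketch below illustrates only this generic mechanism under assumed toy dimensions; it is not the authors' implementation, and the function names and the scaling factor `lam` are illustrative.

```python
import numpy as np

def grad_reverse_forward(x):
    # Gradient reversal layer, forward pass: identity mapping.
    return x

def grad_reverse_backward(upstream_grad, lam=1.0):
    # Backward pass: flip and scale the gradient coming from the
    # domain (subject) discriminator before it reaches the shared
    # feature extractor, encouraging subject-invariant features.
    return -lam * upstream_grad

# Toy illustration (dimensions are assumptions, not from the paper):
# shared features feed both an emotion classifier and a subject
# discriminator; only the discriminator's gradient is reversed.
rng = np.random.default_rng(0)
features = rng.standard_normal((4, 8))        # batch of 4, 8-dim features
upstream_grad = rng.standard_normal((4, 8))   # grad from the domain head

passed = grad_reverse_forward(features)       # unchanged on the way forward
reversed_grad = grad_reverse_backward(upstream_grad, lam=0.5)
```

In a full model, this reversed gradient is what makes the feature extractor and the subject discriminator play an adversarial game, which is the variance-reduction effect the abstract describes.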

Authors

  • Marzieh Niaki
  • Shyamal Y Dharia
  • Yangjun Chen
  • Camilo E Valderrama
    Department of Applied Computer Science, University of Winnipeg, Winnipeg, MB, Canada.
