Self-supervised spatial-temporal contrastive network for EEG-based brain network classification.

Journal: Neural Networks: The Official Journal of the International Neural Network Society

Abstract

Electroencephalogram (EEG)-based brain network analysis has shown promise in brain disease research by revealing the complex connectivity among brain regions. However, existing methods struggle to fully exploit large amounts of unlabeled data to capture both spatial and temporal relationships across the brain. To reduce the cost of annotating brain data and to extract high-level feature representations, we introduce a novel Self-Supervised Spatial-Temporal Contrastive Network (SS-STCN) framework tailored for brain network classification. Within this framework, pre-processed unlabeled data are perturbed with transformations and fed into a pre-training contrastive module that trains attention-driven two-stream encoders comprising a Spatial Graph Attention Network (SGAT) and a Temporal Bi-directional Long Short-Term Memory (TBLSTM) network. After optimization, the hybrid networks are used to extract salient features from each labeled sample, achieving spatial-temporal feature fusion. Extensive experiments on the CHB-MIT and DEAP datasets show that SS-STCN outperforms existing supervised and unsupervised methods, demonstrating strong accuracy and generalizability across epilepsy classification and emotion recognition tasks.
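The abstract does not specify the exact contrastive objective used in the pre-training module, but a standard choice for training encoders on pairs of perturbed views is the NT-Xent (normalized temperature-scaled cross-entropy) loss popularized by SimCLR. The sketch below is purely illustrative, not the paper's implementation: the function names, the pure-Python embedding representation, and the temperature value are all assumptions.

```python
import math

def cosine(u, v):
    # Cosine similarity between two embedding vectors (plain Python lists).
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def nt_xent_loss(z1, z2, temperature=0.5):
    """Illustrative NT-Xent contrastive loss (an assumption; the paper's
    exact objective is not stated in the abstract).

    z1[i] and z2[i] are the encoder embeddings of two perturbed views of
    the same unlabeled sample; they form a positive pair, while all other
    embeddings in the batch act as negatives."""
    z = z1 + z2          # concatenate the two views: 2N embeddings total
    n = len(z)
    half = len(z1)
    loss = 0.0
    for i in range(n):
        j = (i + half) % n  # index of i's positive partner in the other view
        pos = math.exp(cosine(z[i], z[j]) / temperature)
        # Denominator: all other embeddings in the batch except z[i] itself.
        denom = sum(math.exp(cosine(z[i], z[k]) / temperature)
                    for k in range(n) if k != i)
        loss += -math.log(pos / denom)
    return loss / n

# Toy usage: two identical, mutually orthogonal samples under two "views".
z1 = [[1.0, 0.0], [0.0, 1.0]]
z2 = [[1.0, 0.0], [0.0, 1.0]]
print(nt_xent_loss(z1, z2))  # small loss: each positive pair already aligned
```

Minimizing this loss pulls the two views of each sample together while pushing apart embeddings of different samples, which is what lets the pre-trained SGAT/TBLSTM encoders learn discriminative features without labels.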

Authors

  • Changxu Dong
    School of Artificial Intelligence, Anhui University, Hefei 230601, P. R. China.
  • Dengdi Sun
    School of Artificial Intelligence, Anhui University, Hefei 230601, P. R. China.
  • Bin Luo