Uncertainty-Aware Graph Contrastive Fusion Network for multimodal physiological signal emotion recognition.
Journal:
Neural Networks: the official journal of the International Neural Network Society
PMID:
40101553
Abstract
Graph Neural Networks (GNNs) have been widely adopted to mine the topological patterns contained in physiological signals for emotion recognition. However, since physiological signals are non-stationary and susceptible to various noises, there exists inter-sensor connectivity uncertainty within each modality. Such intra-modal connectivity uncertainty may further lead to inter-modal semantic gap uncertainty, which causes the unimodal bias problem and greatly degrades fusion effectiveness. Nevertheless, this issue has not been fully considered in existing multimodal fusion models. To this end, we propose an Uncertainty-Aware Graph Contrastive Fusion Network (UAGCFNet) to fuse multimodal physiological signals effectively for emotion recognition. First, a probabilistic model-based Uncertainty-Aware Graph Convolutional Network (UAGCN), which can estimate and quantify the inter-sensor connectivity uncertainty, is constructed for each modality to extract its uncertainty-aware graph representation. Second, a Transitive Contrastive Fusion (TCF) module, which organically combines a Criss-Cross Attention (CCA)-based fusion mechanism with a Transitive Contrastive Learning (TCL)-based calibration strategy, is designed to fuse the multimodal graph representations effectively by eliminating the unimodal bias caused by the inter-modal semantic gap uncertainty. Extensive experimental results on the DEAP, DREAMER, and MPED datasets under both subject-dependent and subject-independent scenarios demonstrate that (i) the proposed model outperforms State-Of-The-Art (SOTA) multimodal fusion models with fewer parameters and lower computational complexity; (ii) each key module and loss function contributes significantly to the performance of the proposed model; (iii) the proposed model effectively eliminates the unimodal bias problem.
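To make the UAGCN idea concrete, the sketch below shows one plausible way to build a graph convolution whose adjacency is probabilistic: each edge weight is a learned Gaussian, sampled via the reparameterization trick during training so that the learned variance quantifies inter-sensor connectivity uncertainty. This is a minimal illustration under our own assumptions; the class and parameter names are hypothetical and not the authors' implementation.

```python
# Minimal sketch (assumed, not the paper's code): a graph convolution with
# a probabilistic adjacency whose per-edge variance models connectivity
# uncertainty between physiological sensors.
import torch
import torch.nn as nn

class UncertainGraphConv(nn.Module):
    def __init__(self, num_nodes: int, in_dim: int, out_dim: int):
        super().__init__()
        # Mean and log-variance of the latent adjacency, learned jointly.
        self.adj_mu = nn.Parameter(torch.randn(num_nodes, num_nodes) * 0.01)
        self.adj_logvar = nn.Parameter(torch.zeros(num_nodes, num_nodes))
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_nodes, in_dim)
        if self.training:
            # Reparameterization trick: sample a stochastic adjacency.
            eps = torch.randn_like(self.adj_mu)
            adj = self.adj_mu + eps * torch.exp(0.5 * self.adj_logvar)
        else:
            adj = self.adj_mu  # use the mean connectivity at test time
        adj = torch.softmax(adj, dim=-1)  # row-normalize the sampled edges
        return torch.relu(self.lin(adj @ x))

x = torch.randn(8, 32, 16)       # e.g. 8 trials, 32 EEG sensors, 16 features
layer = UncertainGraphConv(32, 16, 64)
print(layer(x).shape)            # torch.Size([8, 32, 64])
```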
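Similarly, the TCF module can be read as bidirectional cross-modal attention for fusion plus a contrastive calibration that pulls each unimodal representation toward the fused one, discouraging the fused embedding from collapsing onto a single modality. The sketch below uses plain scaled dot-product cross-attention as a stand-in for the CCA mechanism and an InfoNCE-style loss as a stand-in for TCL; all shapes and names are assumptions for illustration, not the paper's exact module.

```python
# Hypothetical sketch of contrastive fusion: two modalities attend to each
# other, and an InfoNCE-style loss aligns each pooled unimodal view with the
# fused embedding to curb unimodal bias.
import torch
import torch.nn.functional as F

def cross_attend(q, kv):
    # Scaled dot-product attention: tokens of q attend over tokens of kv.
    attn = torch.softmax(q @ kv.transpose(-2, -1) / q.shape[-1] ** 0.5, dim=-1)
    return attn @ kv

def info_nce(a, b, tau=0.1):
    # Matching rows of a and b are positives; all other rows are negatives.
    a, b = F.normalize(a, dim=-1), F.normalize(b, dim=-1)
    logits = a @ b.t() / tau
    target = torch.arange(a.shape[0])
    return F.cross_entropy(logits, target)

eeg = torch.randn(8, 32, 64)   # graph representation of modality 1 (assumed)
ecg = torch.randn(8, 4, 64)    # graph representation of modality 2 (assumed)
fused = torch.cat([cross_attend(eeg, ecg).mean(1),
                   cross_attend(ecg, eeg).mean(1)], dim=-1)   # (8, 128)
# Calibration: align each pooled unimodal view with the fused embedding.
proj = torch.nn.Linear(128, 64)
loss = info_nce(eeg.mean(1), proj(fused)) + info_nce(ecg.mean(1), proj(fused))
print(fused.shape, loss.item())
```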