SCATrans: semantic cross-attention transformer for drug-drug interaction prediction through multimodal biomedical data.

Journal: BMC Bioinformatics
Published Date:

Abstract

Predicting potential drug-drug interactions (DDIs) from biomedical data plays a critical role in drug therapy, drug development, drug regulation, and public health. However, DDI prediction remains challenging due to the large number of possible drug combinations and the nature of multimodal biomedical data, which is disordered, imbalanced, prone to linguistic errors, and difficult to label. A Semantic Cross-Attention Transformer (SCAT) model is constructed to address these challenges. In the model, BioBERT, Doc2Vec, and a graph convolutional network are utilized to embed the multimodal biomedical data into vector representations; a BiGRU is adopted to capture contextual dependencies in both forward and backward directions; cross-attention is employed to integrate the extracted features and explicitly model the dependencies between them; and a feature-joint classifier is adopted to implement DDI prediction (DDIP). Experimental results on the DDIExtraction-2013 dataset demonstrate that SCAT outperforms state-of-the-art DDIP approaches. SCAT expands the application of multimodal deep learning to multimodal DDIP and can be applied in drug regulation systems to predict novel DDIs and DDI-related events.
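To illustrate the fusion step the abstract describes, the sketch below shows generic scaled dot-product cross-attention in NumPy: queries come from one modality's embeddings (e.g. text features from BioBERT/BiGRU) and keys/values from another (e.g. graph-derived drug features). This is a minimal, hypothetical sketch of the standard mechanism, not the paper's implementation; the projection matrices here are randomly initialized stand-ins for learned weights, and all shapes and names are assumptions.

```python
import numpy as np

def cross_attention(x_text, x_graph, rng):
    """Scaled dot-product cross-attention: queries from one modality,
    keys/values from the other. Inputs are [seq_len, d] matrices."""
    d = x_text.shape[1]
    # Random projections stand in for learned weight matrices.
    W_q = rng.standard_normal((d, d)) / np.sqrt(d)
    W_k = rng.standard_normal((d, d)) / np.sqrt(d)
    W_v = rng.standard_normal((d, d)) / np.sqrt(d)
    Q, K, V = x_text @ W_q, x_graph @ W_k, x_graph @ W_v
    scores = Q @ K.T / np.sqrt(d)                  # [seq_text, seq_graph]
    # Numerically stable row-wise softmax over the graph positions.
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    return w @ V                                   # text attends over graph

# Toy example: 4 text-token embeddings attend over 3 graph-node
# embeddings, each of dimension 8 (all sizes are illustrative).
rng = np.random.default_rng(1)
fused = cross_attention(rng.standard_normal((4, 8)),
                        rng.standard_normal((3, 8)), rng)
print(fused.shape)
```

The fused output keeps the query modality's sequence length, so it can be concatenated with the original text features before a downstream classifier, which is the usual role of such a fusion block.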

Authors

  • Shanwen Zhang
    School of Information Engineering, Xijing University, Xi'an, 710123, China.
  • Changqing Yu
    School of Electronic Information, Xijing University, Xi'an, 710123, China.
  • Chuanlei Zhang
    College of Artificial Intelligence, Tianjin University of Science and Technology, Tianjin, 300222, China.