Decision level scheme for fusing multiomics and histology slide images using deep neural network for tumor prognosis prediction.
Journal:
Scientific Reports
Published Date:
Jul 15, 2025
Abstract
Molecular biostatistical workflows in oncology often rely on predictive models that use multimodal data. Advances in deep learning and artificial intelligence have enabled the fusion of large volumes of such multimodal data. Here, we present a decision-level multimodal data fusion framework that integrates multiomics and pathological tissue slide images for prognosis prediction. Our approach establishes a spatial map of instances by connecting spatially neighboring nuclei and computes a characteristic tensor via graph convolution layers for each input pathological tissue slide image. Global Average Pooling is applied to align and normalize the feature tensors from the pathological images and the multiomics data, enabling seamless integration. We tested the proposed approach on Breast Invasive Carcinoma and Non-Small Cell Lung Cancer data from The Cancer Genome Atlas, which contain paired whole-slide images, transcriptomic, genotype, epigenetic, and survival information. In 10-fold cross-validation, the comparison results demonstrate that the multimodal fusion paradigm improves outcome prediction over single-modality data alone, with the average C-index increasing from 0.61 to 0.75 for the breast cancer cohort and from 0.52 to 0.67 for the non-small cell lung cancer cohort. The proposed decision-level multimodal data fusion framework is expected to provide insights and technical methodologies for follow-up studies.
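The pipeline sketched in the abstract (nucleus-level spatial graph, graph convolution, Global Average Pooling, decision-level fusion of per-modality risk scores) can be illustrated with a minimal NumPy toy. All shapes, the neighbor radius, the single-layer graph convolution, and the equal-weight averaging of modality-specific risks are illustrative assumptions; the paper's actual architecture and dimensions are not specified here.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Toy inputs (hypothetical shapes, not the paper's real dimensions) ---
coords = rng.uniform(0, 100, size=(30, 2))   # nucleus centroids from one slide patch
node_feats = rng.normal(size=(30, 8))        # per-nucleus morphology features
omics = rng.normal(size=(50,))               # one patient's multiomics vector

# 1. Spatial map of instances: connect nuclei whose centroids fall within a radius.
dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
adj = ((dists < 25.0) & (dists > 0)).astype(float)
adj_hat = adj + np.eye(30)                   # add self-loops
norm_adj = np.diag(1.0 / adj_hat.sum(axis=1)) @ adj_hat  # row-normalized adjacency

# 2. One graph-convolution layer: aggregate neighbors, project, apply ReLU.
w_gcn = rng.normal(size=(8, 16))
h = np.maximum(norm_adj @ node_feats @ w_gcn, 0.0)

# 3. Global Average Pooling -> fixed-length slide descriptor, regardless of nucleus count.
slide_vec = h.mean(axis=0)                   # shape (16,)

# 4. Decision-level fusion: each modality yields its own risk score; fuse the decisions.
risk_slide = float(slide_vec @ rng.normal(size=(16,)))
risk_omics = float(omics @ rng.normal(size=(50,)))
fused_risk = 0.5 * risk_slide + 0.5 * risk_omics
print(slide_vec.shape, fused_risk)
```

Pooling each slide graph to a fixed-length vector is what lets histology and multiomics branches be trained as separate predictors whose outputs are then combined, the defining trait of decision-level (late) fusion as opposed to concatenating raw features early.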