CycleH-CUT: an unsupervised medical image translation method based on cycle consistency and hybrid contrastive learning.
Journal:
Physics in Medicine & Biology
PMID:
39908670
Abstract
Unsupervised medical image translation is challenging because perfectly paired medical images are difficult to obtain. CycleGAN-based methods have proven effective for unpaired medical image translation, but they can produce artifacts in the generated images. To address this issue, we propose an unsupervised network based on cycle consistency and hybrid contrastive unpaired translation (CycleH-CUT). CycleH-CUT consists of two hybrid CUT (H-CUT) networks. In each H-CUT network, a query-selected attention mechanism selects queries with important features, and a boosted contrastive learning loss reweights all negative patches via an optimal transport strategy. We further apply spectral normalization to improve training stability, allowing the generator to extract complex features. Building on the H-CUT network, the CycleH-CUT framework integrates contrastive learning with cycle consistency: two H-CUT networks reconstruct the generated images back to the source domain, enabling effective translation between unpaired medical images. We conduct extensive experiments on three public datasets (BraTS, OASIS3, and IXI) and a private Spinal Column dataset to demonstrate the effectiveness of CycleH-CUT and H-CUT. Specifically, CycleH-CUT achieves average SSIM values of 0.926 on the BraTS dataset, 0.796 on the OASIS3 dataset, 0.932 on the IXI dataset, and 0.890 on the private Spinal Column dataset.
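To make the training objective concrete, below is a minimal PyTorch sketch of how a CycleH-CUT-style step could combine a patch-wise contrastive (PatchNCE-style) loss, an L1 cycle-consistency loss, and a spectrally normalized discriminator. All names (`TinyGenerator`, `patch_nce_loss`), network sizes, and loss weights are illustrative assumptions; the paper's query-selected attention and optimal-transport reweighting of negatives are not reproduced here.

```python
# Hedged sketch of a CycleH-CUT-style training step (not the authors' code):
# two generators trained with a PatchNCE-style contrastive loss plus a
# cycle-consistency loss, and a discriminator wrapped in spectral norm.
import torch
import torch.nn as nn
import torch.nn.functional as F

def patch_nce_loss(feat_q, feat_k, tau=0.07):
    """InfoNCE over image patches: each query's positive is the patch at the
    same spatial location in the other image; all other patches act as
    negatives. feat_q, feat_k: (num_patches, dim), L2-normalized."""
    logits = feat_q @ feat_k.t() / tau               # (N, N) similarity matrix
    targets = torch.arange(feat_q.size(0))           # positives on the diagonal
    return F.cross_entropy(logits, targets)

class TinyGenerator(nn.Module):
    """Stand-in generator; a real H-CUT generator would add query-selected
    attention and a deeper backbone."""
    def __init__(self, ch=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, ch, 3, padding=1), nn.ReLU(),
            nn.Conv2d(ch, 1, 3, padding=1),
        )
        self.encoder = nn.Conv2d(1, ch, 4, stride=4)  # coarse patch features
    def forward(self, x):
        return self.net(x)
    def features(self, x):
        f = self.encoder(x)                           # (B, C, H', W')
        b, c, h, w = f.shape
        f = f.permute(0, 2, 3, 1).reshape(b * h * w, c)
        return F.normalize(f, dim=1)

# Spectral normalization (torch.nn.utils.spectral_norm) on discriminator
# layers, matching the abstract's point about training stability.
disc = nn.Sequential(
    nn.utils.spectral_norm(nn.Conv2d(1, 16, 4, stride=2, padding=1)),
    nn.LeakyReLU(0.2),
    nn.utils.spectral_norm(nn.Conv2d(16, 1, 4, stride=2, padding=1)),
)

G_ab, G_ba = TinyGenerator(), TinyGenerator()        # the two H-CUT branches
opt = torch.optim.Adam(
    list(G_ab.parameters()) + list(G_ba.parameters()), lr=2e-4)

real_a = torch.randn(1, 1, 64, 64)                   # toy unpaired images
real_b = torch.randn(1, 1, 64, 64)

fake_b = G_ab(real_a)
fake_a = G_ba(real_b)
rec_a = G_ba(fake_b)                                 # cycle: A -> B -> A
rec_b = G_ab(fake_a)                                 # cycle: B -> A -> B

loss_cycle = F.l1_loss(rec_a, real_a) + F.l1_loss(rec_b, real_b)
loss_nce = patch_nce_loss(G_ab.features(fake_b), G_ab.features(real_a)) \
         + patch_nce_loss(G_ba.features(fake_a), G_ba.features(real_b))
loss_adv = -disc(fake_b).mean()                      # simplified generator term

loss = loss_adv + 10.0 * loss_cycle + 1.0 * loss_nce # placeholder weights
opt.zero_grad(); loss.backward(); opt.step()         # updates generators only
```

In this sketch the contrastive term ties patches of the translated image back to the corresponding patches of its source, while the cycle term constrains the round trip through both generators, mirroring how the abstract describes CycleH-CUT combining the two objectives.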