Domain-guided conditional diffusion model for unsupervised domain adaptation.

Journal: Neural Networks (the official journal of the International Neural Network Society)
Published Date:

Abstract

Limited transferability hinders the performance of a well-trained deep learning model when it is applied to new scenarios. Recently, Unsupervised Domain Adaptation (UDA) has made significant progress on this issue by learning domain-invariant features. However, the performance of existing UDA methods is constrained by potentially large domain shifts and limited target domain data. To alleviate these issues, we propose a Domain-guided Conditional Diffusion Model (DCDM), which generates high-fidelity target domain samples, making the transfer from the source domain to the target domain easier. DCDM conditions on class information to control the labels of the generated samples, and uses a domain classifier to guide the generated samples toward the target domain. Extensive experiments on various benchmarks demonstrate that DCDM brings a large performance improvement to UDA.
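
The abstract describes two conditioning mechanisms: class labels that control what the diffusion model generates, and a domain classifier whose signal steers generation toward the target domain. The sketch below (not the authors' code) illustrates how such domain-classifier guidance could be wired into one reverse step of a class-conditional DDPM sampler; the module interfaces (eps_model, domain_clf), the guidance_scale parameter, and the exact guidance form are assumptions, since the abstract does not specify them.

```python
# Minimal sketch, assuming a DDPM-style sampler with classifier guidance
# (Dhariwal & Nichol, 2021) repurposed for a source-vs-target domain classifier.
import torch
import torch.nn.functional as F


def domain_guided_step(eps_model, domain_clf, x_t, t, y,
                       alphas, alphas_bar, betas, guidance_scale=1.0):
    """One reverse-diffusion step nudged toward the target domain.

    eps_model(x_t, t, y): class-conditional noise predictor (assumed signature).
    domain_clf(x_t, t):   noisy-input domain classifier, logits [batch, 2]
                          for (source, target) -- an assumed interface.
    t: integer timestep; alphas, alphas_bar, betas: standard DDPM schedules.
    """
    # Standard DDPM posterior mean given the predicted noise.
    with torch.no_grad():
        eps = eps_model(x_t, t, y)
        coef = betas[t] / torch.sqrt(1.0 - alphas_bar[t])
        mean = (x_t - coef * eps) / torch.sqrt(alphas[t])

    # Domain guidance: shift the mean along the gradient of
    # log p(domain = target | x_t), scaled by the step variance.
    with torch.enable_grad():
        x_in = x_t.detach().requires_grad_(True)
        logits = domain_clf(x_in, t)
        log_p_target = F.log_softmax(logits, dim=-1)[:, 1].sum()
        grad = torch.autograd.grad(log_p_target, x_in)[0]
    mean = mean + guidance_scale * betas[t] * grad

    # Add noise except at the final step.
    noise = torch.randn_like(x_t) if t > 0 else torch.zeros_like(x_t)
    return mean + torch.sqrt(betas[t]) * noise
```

In this reading, class conditioning enters through the noise predictor's label input, while the domain classifier acts only at sampling time as a guidance term, so the generated samples keep their source-label semantics but drift toward target-domain statistics.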

Authors

  • Yulong Zhang
    State Key Laboratory of Industrial Control Technology, College of Control Science and Engineering, Zhejiang University, Hangzhou 310027, China.
  • Shuhao Chen
    College of Information Science and Electronic Engineering, Zhejiang University, Hangzhou 310027, China.
  • Weisen Jiang
    Department of Computer Science and Engineering, Southern University of Science and Technology, Shenzhen, 518055, Guangdong, China; Department of Computer Science and Engineering, Hong Kong University of Science and Technology, 999077, Hong Kong, China. Electronic address: wjiangar@cse.ust.hk.
  • Yu Zhang
    College of Marine Electrical Engineering, Dalian Maritime University, Dalian, China.
  • Jiangang Lu
    State Key Laboratory of Industrial Control Technology, College of Control Science and Engineering, Zhejiang University, Hangzhou 310027, China.
  • James T Kwok