Robust adaptation regularization based on within-class scatter for domain adaptation.

Journal: Neural Networks: The Official Journal of the International Neural Network Society

Abstract

In many practical applications, the assumption that the training and test data follow identical distributions rarely holds, which can cause a rapid decline in performance. To address this problem, domain adaptation strategies have been developed in recent years. In this paper, we propose a novel unsupervised domain adaptation method, referred to as Robust Adaptation Regularization based on Within-Class Scatter (WCS-RAR), which simultaneously optimizes the regularized loss, the within-class scatter, the joint distribution between domains, and the manifold consistency. On the one hand, to make the model robust against outliers, we adopt an l2,1-norm based loss function, by virtue of its row sparsity, instead of the widely used l2-norm based squared loss or hinge loss, to measure the residual. On the other hand, to preserve the structural knowledge of the source data within each class and strengthen the discriminative ability of the classifier, we incorporate the minimization of the within-class scatter into the domain adaptation process. Lastly, to solve the resulting optimization problem efficiently, we extend the Representer Theorem via the kernel trick and thus derive an elegant solution for the proposed model. Extensive comparison experiments with state-of-the-art methods on multiple benchmark data sets demonstrate the superiority of the proposed method.
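The two quantities central to the abstract, the row-sparse l2,1-norm used as a robust residual measure and the within-class scatter matrix, can be sketched in NumPy as below. This is a minimal illustration of the standard definitions, not the paper's full WCS-RAR objective; the function names are ours.

```python
import numpy as np

def l21_norm(R):
    """l2,1-norm of a residual matrix R: the sum of the l2 norms of its rows.
    Minimizing it drives entire rows toward zero, so outlier samples
    (rows with large residuals) are down-weighted as a whole."""
    return np.sqrt((R ** 2).sum(axis=1)).sum()

def within_class_scatter(X, y):
    """Within-class scatter matrix S_w = sum over classes c of
    sum_{x in class c} (x - mu_c)(x - mu_c)^T, where mu_c is the class mean.
    Minimizing a term built on S_w pulls same-class samples together."""
    d = X.shape[1]
    S_w = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]          # samples of class c
        diff = Xc - Xc.mean(axis=0)
        S_w += diff.T @ diff    # accumulate per-class scatter
    return S_w
```

Note the contrast with the squared (Frobenius) loss, which sums squared entries and lets a single outlier row dominate the objective; the l2,1-norm grows only linearly in each row's magnitude.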

Authors

  • Liran Yang
    College of Information and Electrical Engineering, China Agricultural University, Beijing, 100083, China.
  • Ping Zhong
    College of Science, China Agricultural University, Beijing, 100083, China. Electronic address: zping@cau.edu.cn.