Enhancing Open-Set Domain Adaptation through Optimal Transport and Adversarial Learning
Journal:
Neural Networks: the official journal of the International Neural Network Society
Published Date:
Nov 20, 2024
Abstract
Open-Set Domain Adaptation (OSDA) is designed to facilitate the transfer of knowledge from a source domain to a target domain, where the class space of the source is a subset of that of the target. The primary challenge in OSDA is identifying shared samples in the target domain to achieve domain alignment while effectively segregating the target domain's private samples. To address this challenge, many existing methods leverage weighted classifiers to mitigate the negative transfer induced by private classes in the target domain and treat all such samples as a single unknown class. However, this strategy may result in inadequate acquisition of discriminative information within the target domain and unclear decision boundaries. To overcome these limitations, we propose a novel framework termed Optimal Transport and Adversarial Learning (OTAL). Our approach introduces Optimal Transport (OT) with a similarity matrix for feature-to-prototype mapping in clustering, enabling the model to learn discriminative information and capture the intrinsic structure of the target domain. Furthermore, we introduce a three-way domain discriminator to aid in constructing the decision boundary between known and unknown classes, while simultaneously aligning the distribution of known samples. Experimental results on three image classification datasets (Office-31, Office-Home and VisDA-2017) demonstrate the superior performance of OTAL compared to existing state-of-the-art methods.
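To illustrate the kind of OT-based feature-to-prototype mapping the abstract describes, the sketch below soft-assigns target features to class prototypes with entropic optimal transport (Sinkhorn iterations), using a cosine-similarity matrix as the (negated) cost. This is a generic illustration of the technique, not the paper's exact formulation: the uniform marginals, the entropic regularizer `epsilon`, and the iteration count are all assumptions for the example.

```python
import numpy as np

def sinkhorn_assignment(features, prototypes, epsilon=0.1, n_iters=200):
    """Soft-assign features to prototypes via entropic optimal transport.

    The cost is the negative cosine similarity, so higher similarity
    receives more transport mass. Marginals are assumed uniform here.
    Returns a plan of shape (n_samples, n_prototypes) whose rows sum to
    1/n_samples and whose columns sum (approximately) to 1/n_prototypes.
    """
    # L2-normalize so the inner product is cosine similarity
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    p = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    sim = f @ p.T                       # similarity matrix
    K = np.exp(sim / epsilon)           # Gibbs kernel of the entropic OT problem

    n, m = K.shape
    r = np.full(n, 1.0 / n)             # uniform marginal over samples
    c = np.full(m, 1.0 / m)             # uniform marginal over prototypes
    u = np.ones(n)
    for _ in range(n_iters):            # Sinkhorn-Knopp scaling iterations
        v = c / (K.T @ u)
        u = r / (K @ v)
    return u[:, None] * K * v[None, :]  # transport plan diag(u) K diag(v)

# Hard cluster labels can then be read off as the argmax of each row:
# labels = plan.argmax(axis=1)
```

In practice such a plan (or its row-wise argmax) supplies pseudo-labels that let the model exploit the target domain's cluster structure rather than collapsing all private samples into one undifferentiated class.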