DiverseReID: Towards generalizable person re-identification via Dynamic Style Hallucination and decoupled domain experts.
Journal:
Neural Networks: The Official Journal of the International Neural Network Society
Published Date:
May 24, 2025
Abstract
Person re-identification (re-ID) models often fail to generalize well when deployed to other camera networks with domain shift. A classical domain generalization (DG) solution is to enhance the diversity of source data so that a model can learn more domain-invariant, and hence generalizable, representations. Existing methods typically mix images from different domains in a mini-batch to generate novel styles, but the mixing coefficient, sampled from a predefined Beta distribution, requires careful manual tuning and may yield suboptimal performance. To this end, we propose a plug-and-play Dynamic Style Hallucination (DSH) module that adaptively adjusts the mixing weights based on the style distribution discrepancy between image pairs, measured dynamically via the reciprocal of the Wasserstein distance. This approach not only removes tedious manual parameter tuning but also significantly enriches style diversity by maximally expanding the perturbation space. In addition, to promote inter-domain diversity, we devise a Domain Experts Decoupling (DED) loss, which constrains features from one domain to be orthogonal to features from other domains. The proposed approach, dubbed DiverseReID, is parameter-free and computationally efficient. Without bells and whistles, it outperforms the state-of-the-art on various DG re-ID benchmarks. Experiments verify that style diversity, not just the size of the training data, is crucial for enhancing generalization.
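The two components described in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: it assumes, as in MixStyle/AdaIN-style methods, that "style" means channel-wise feature statistics, that the per-channel statistics are treated as Gaussians (so the 2-Wasserstein distance has a closed form), and that the reciprocal distance is normalized as `1 / (1 + w)` to obtain a mixing weight; the function names and the exact normalization are hypothetical.

```python
import numpy as np

def style_stats(x):
    # Channel-wise mean/std over spatial dims: the "style" of a (C, H, W) feature map.
    mu = x.mean(axis=(1, 2))
    sigma = x.std(axis=(1, 2)) + 1e-6  # epsilon avoids division by zero
    return mu, sigma

def dynamic_mix_weight(mu_a, sig_a, mu_b, sig_b):
    # Closed-form 2-Wasserstein distance between per-channel Gaussian styles.
    w = np.sqrt(((mu_a - mu_b) ** 2 + (sig_a - sig_b) ** 2).sum())
    # Reciprocal-distance weighting (hypothetical normalization): similar styles
    # mix strongly, dissimilar styles perturb more.
    return 1.0 / (1.0 + w)

def dynamic_style_hallucination(x_a, x_b):
    # Re-normalize x_a with a style interpolated between x_a and x_b,
    # where the weight adapts to the style discrepancy (no fixed Beta sampling).
    mu_a, sig_a = style_stats(x_a)
    mu_b, sig_b = style_stats(x_b)
    lam = dynamic_mix_weight(mu_a, sig_a, mu_b, sig_b)
    mu_mix = lam * mu_a + (1 - lam) * mu_b
    sig_mix = lam * sig_a + (1 - lam) * sig_b
    x_norm = (x_a - mu_a[:, None, None]) / sig_a[:, None, None]
    return sig_mix[:, None, None] * x_norm + mu_mix[:, None, None]

def ded_loss(f_a, f_b):
    # One plausible orthogonality penalty for the DED idea: squared cosine
    # similarity between features from two different domains (0 when orthogonal).
    f_a = f_a / np.linalg.norm(f_a)
    f_b = f_b / np.linalg.norm(f_b)
    return float((f_a @ f_b) ** 2)
```

Under these assumptions the module stays parameter-free: the mixing weight is computed from the data itself rather than sampled from a tuned Beta distribution, and the DED term is minimized exactly when domain features point in orthogonal directions.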