Optimizing Latent Distributions for Non-Adversarial Generative Networks

Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence

Abstract

The generator in generative adversarial networks (GANs) is driven by a discriminator to produce high-quality images through an adversarial game, but this same game makes it difficult to train a stable generator. This paper focuses on non-adversarial generative networks that are trained in a plain manner without an adversarial loss. A limited number of real images may be insufficient to fully represent the real data distribution. We therefore investigate a set of distributions in a Wasserstein ball centred on the distribution induced by the training data and propose to optimize the generator over this Wasserstein ball. We theoretically discuss the solvability of the newly defined objective function and develop a tractable reformulation to learn the generator. The connections and differences between the proposed non-adversarial generative networks and GANs are analyzed. Experimental results on real-world datasets demonstrate that the proposed algorithm can effectively learn image generators in a non-adversarial manner, and the generated images are of comparable quality with those from GANs.
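To make the Wasserstein-ball idea concrete, the following is a minimal 1-D sketch, not the paper's actual method: it uses the closed-form empirical Wasserstein-1 distance between equal-size samples (mean absolute difference of sorted samples) and the fact that, for W1, the worst case over all distributions within radius `eps` of the empirical data distribution upper-bounds out to the empirical distance plus `eps` (triangle inequality). A toy "generator" that shifts noise by a location parameter `theta` is then fitted by grid search on this robust objective; all names and the grid-search setup are illustrative assumptions.

```python
import numpy as np

def w1_empirical(x, y):
    # 1-D empirical Wasserstein-1 distance between equal-size samples:
    # mean absolute difference of the sorted samples (closed form in 1-D).
    return np.mean(np.abs(np.sort(x) - np.sort(y)))

def robust_objective(gen_samples, data_samples, eps):
    # Worst-case W1(P_gen, Q) over Q in a Wasserstein ball of radius eps
    # around the empirical data distribution; by the triangle inequality
    # this is bounded by (and here taken as) W1(P_gen, P_data) + eps.
    return w1_empirical(gen_samples, data_samples) + eps

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=512)   # "real" samples

# Toy generator: shift standard normal noise by a location parameter theta,
# then pick theta minimizing the robust objective via grid search.
noise = rng.normal(size=512)
thetas = np.linspace(-1.0, 5.0, 121)
losses = [robust_objective(noise + t, data, eps=0.1) for t in thetas]
best = thetas[int(np.argmin(losses))]
print(best)  # expected to land near the true location 2.0
```

In the paper's setting the generator is a neural network and the robust objective requires the tractable reformulation derived there; this sketch only illustrates why robustifying over a Wasserstein ball reduces, in the simplest W1 case, to the plain transport loss plus a radius-dependent penalty.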

Authors

  • Tianyu Guo
  • Chang Xu
  • Boxin Shi
  • Chao Xu
  • Dacheng Tao