Evolutionary architecture search for generative adversarial networks using an aging mechanism-based strategy.
Journal:
Neural Networks: The Official Journal of the International Neural Network Society
Publication Date:
Nov 9, 2024
Abstract
Generative Adversarial Networks (GANs) have emerged as a key technology in artificial intelligence, especially in image generation. However, hand-designed GAN architectures often face significant training-stability challenges. We address these challenges with an Evolutionary Neural Architecture Search (ENAS) algorithm for GANs, named EAMGAN. This one-shot model automates the design of GAN architectures and employs an Operation Importance Metric (OIM) to enhance training stability. It also incorporates an aging mechanism that refines the selection process during architecture search, and a non-dominated sorting algorithm that yields Pareto-optimal solutions, promoting diversity and preventing premature convergence. We evaluated our method on benchmark datasets, and the results demonstrate that EAMGAN is highly competitive in both efficiency and performance. The search identified an architecture achieving an Inception Score (IS) of 8.83±0.13 and a Fréchet Inception Distance (FID) of 9.55 on CIFAR-10 in only 0.66 GPU-days. Results on the STL-10, CIFAR-100, and ImageNet32 datasets further demonstrate the strong portability of the discovered architecture.
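The abstract does not spell out how the aging mechanism enters the selection step. The sketch below is a minimal, generic illustration of aging-based evolutionary selection in the style of regularized (aging) evolution, where the population is a FIFO queue and the oldest candidate is retired each generation; it is not the paper's algorithm. All names (random_architecture, mutate, evaluate_fitness) are hypothetical placeholders, and the OIM-guided evaluation and multi-objective non-dominated sorting described in the abstract are omitted for brevity.

```python
from collections import deque
import random

POPULATION_SIZE = 20
SAMPLE_SIZE = 5
GENERATIONS = 100
OPS = ["conv3x3", "conv5x5", "deconv", "skip"]

def random_architecture():
    # Placeholder encoding: one operation choice per layer.
    return [random.choice(OPS) for _ in range(6)]

def mutate(arch):
    # Illustrative mutation: change one operation at random.
    child = list(arch)
    child[random.randrange(len(child))] = random.choice(OPS)
    return child

def evaluate_fitness(arch):
    # Stand-in for training/evaluating the candidate GAN
    # (e.g., Inception Score, or negative FID). Hypothetical.
    return random.random()

# Initial population, stored as a FIFO queue of (architecture, fitness).
population = deque(
    (arch, evaluate_fitness(arch))
    for arch in (random_architecture() for _ in range(POPULATION_SIZE))
)

for _ in range(GENERATIONS):
    # Tournament selection from a random sample of the current population.
    sample = random.sample(list(population), SAMPLE_SIZE)
    parent_arch, _ = max(sample, key=lambda item: item[1])
    child = mutate(parent_arch)
    population.append((child, evaluate_fitness(child)))
    # Aging: retire the *oldest* individual rather than the worst one,
    # which limits over-exploitation of early lucky candidates.
    population.popleft()

best_arch, best_fit = max(population, key=lambda item: item[1])
print("best architecture:", best_arch, "fitness:", round(best_fit, 3))
```

In the full method, the scalar fitness above would presumably be replaced by multiple objectives (e.g., sample quality and cost), with non-dominated sorting used to rank candidates along the Pareto front, as the abstract indicates.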