Evolutionary Architecture Search for Generative Adversarial Networks Based on Weight Sharing
Journal Article · IEEE Transactions on Evolutionary Computation
- Nanjing Univ. of Information Science and Technology (China)
- Nanjing Univ. of Information Science and Technology (China); Univ. of Surrey, Guildford (United Kingdom)
- National Institute of Advanced Industrial Science and Technology (AIST), Tokyo (Japan); RIKEN Center for Computational Science, Kobe (Japan)
- Agency for Science, Technology and Research (A*STAR), Fusionopolis (Singapore)
- Oak Ridge National Laboratory (ORNL), Oak Ridge, TN (United States)
Generative adversarial networks (GANs) are a powerful generative technique but frequently face challenges with training stability. Network architecture plays a significant role in determining the final output of GANs, yet designing a well-performing architecture demands extensive domain expertise. This article addresses this issue by searching for high-performance generator architectures through neural architecture search (NAS). The proposed approach, called evolutionary weight sharing GANs (EWSGAN), is based on weight sharing and comprises two steps. First, a supernet of the generator is trained using weight sharing. Second, a multiobjective evolutionary algorithm (MOEA) is employed to identify optimal subnets from the supernet; these subnets inherit weights directly from the supernet for fitness assessment. Two strategies are used to stabilize training of the generator supernet: 1) a fair single-path sampling strategy and 2) a discarding strategy. Experimental results indicate that the architecture searched by our method achieves a new state of the art among NAS-GAN methods, with a Fréchet inception distance (FID) of 9.09 and an inception score (IS) of 8.99 on the CIFAR-10 dataset. The searched architecture also demonstrates competitive performance on the STL-10 dataset, achieving an FID of 21.89 and an IS of 10.51.
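To make the two steps in the abstract concrete, the sketch below illustrates (1) fair single-path sampling from a weight-sharing supernet and (2) evaluating subnets that inherit the supernet's weights without retraining. This is not the authors' code: the candidate operations, the toy loss, and the stand-in fitness function are assumptions made purely for illustration.

```python
# Minimal sketch of weight-sharing supernet training with fair single-path
# sampling, plus weight-inheriting subnet evaluation. All names are hypothetical.
import random
import torch
import torch.nn as nn

CANDIDATE_OPS = ["conv3x3", "conv5x5", "sep_conv3x3"]  # hypothetical search space

def make_op(name: str, ch: int) -> nn.Module:
    if name == "conv3x3":
        return nn.Conv2d(ch, ch, 3, padding=1)
    if name == "conv5x5":
        return nn.Conv2d(ch, ch, 5, padding=2)
    # depthwise-separable 3x3 convolution
    return nn.Sequential(nn.Conv2d(ch, ch, 3, padding=1, groups=ch),
                         nn.Conv2d(ch, ch, 1))

class SuperLayer(nn.Module):
    """One supernet layer: all candidate ops coexist and share training."""
    def __init__(self, ch: int):
        super().__init__()
        self.ops = nn.ModuleDict({name: make_op(name, ch) for name in CANDIDATE_OPS})

    def forward(self, x, choice: str):
        return torch.relu(self.ops[choice](x))

class Supernet(nn.Module):
    def __init__(self, ch: int = 16, depth: int = 3):
        super().__init__()
        self.layers = nn.ModuleList(SuperLayer(ch) for _ in range(depth))

    def forward(self, x, path):
        for layer, choice in zip(self.layers, path):
            x = layer(x, choice)
        return x

def fair_paths(depth: int):
    """Fair single-path sampling: within one round, every candidate op in every
    layer is trained exactly once (shuffled independently per layer)."""
    columns = [random.sample(CANDIDATE_OPS, len(CANDIDATE_OPS)) for _ in range(depth)]
    return list(zip(*columns))  # yields len(CANDIDATE_OPS) single paths per round

# --- toy training round of the generator supernet ---
net = Supernet()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
x = torch.randn(4, 16, 8, 8)
for path in fair_paths(depth=3):
    loss = net(x, path).pow(2).mean()        # placeholder for the GAN generator loss
    opt.zero_grad(); loss.backward(); opt.step()

# --- subnet evaluation with inherited weights (no retraining) ---
def fitness(path):
    with torch.no_grad():
        return -net(x, path).abs().mean().item()  # stand-in for an FID-based objective

candidate = tuple(random.choice(CANDIDATE_OPS) for _ in net.layers)
print(candidate, fitness(candidate))
```

In the paper's setting, the fitness values computed from inherited weights would feed a multiobjective evolutionary algorithm that selects, crosses, and mutates encoded paths; only the final searched architecture is retrained from scratch.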
- Research Organization:
- Oak Ridge National Laboratory (ORNL), Oak Ridge, TN (United States)
- Sponsoring Organization:
- National Natural Science Foundation of China (NSFC); Natural Science Foundation of Jiangsu Province; Natural Science Foundation of the Jiangsu Higher Education Institutions of China; USDOE
- Grant/Contract Number:
- AC05-00OR22725
- OSTI ID:
- 2429864
- Journal Information:
- IEEE Transactions on Evolutionary Computation, Vol. 28, Issue 3; ISSN 1089-778X
- Publisher:
- IEEE
- Country of Publication:
- United States
- Language:
- English
Similar Records
Stable parallel training of Wasserstein conditional generative adversarial neural networks
Journal Article · 2022 · Journal of Supercomputing · OSTI ID: 1908079
Stable Parallel Training of Wasserstein Conditional Generative Adversarial Neural Networks: *Full/Regular Research Paper submission for the symposium CSCI-ISAI: Artificial Intelligence
Conference · 2021 · OSTI ID: 1877492
Scalable balanced training of conditional generative adversarial neural networks on image data
Journal Article · 2021 · Journal of Supercomputing · OSTI ID: 1783019