Stable parallel training of Wasserstein conditional generative adversarial neural networks
Journal Article · Journal of Supercomputing
- Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
In this work, we propose a stable, parallel approach to training Wasserstein conditional generative adversarial neural networks (W-CGANs) under the constraint of a fixed computational budget. Unlike previous distributed GAN training techniques, our approach avoids inter-process communication, reduces the risk of mode collapse, and enhances scalability by using multiple generators, each concurrently trained on a single data label. The use of the Wasserstein metric also reduces the risk of cycling by stabilizing the training of each generator. We illustrate the approach on CIFAR10, CIFAR100, and ImageNet1k, three standard benchmark image datasets, maintaining the original image resolution for each dataset. Performance is assessed in terms of scalability and final accuracy within fixed computational time and computational resources. To measure accuracy, we use the inception score, the Fréchet inception distance, and image quality. Compared with previous results obtained by applying the parallel approach to deep convolutional conditional generative adversarial neural networks, we show an improvement in inception score and Fréchet inception distance, as well as improved quality of the images created by the GAN approach. Weak scaling is attained on these datasets using up to 2,000 NVIDIA V100 GPUs on the OLCF supercomputer Summit.
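The per-label decomposition described in the abstract can be sketched in a few lines: the dataset is partitioned by class label and each label is assigned to one worker (e.g. one GPU), so each generator trains on a single class and no gradients or parameters need to cross process boundaries. The helper names below are illustrative assumptions, not the authors' implementation.

```python
# Sketch of communication-free, per-label GAN training assignment.
# Hypothetical helpers; the paper's actual code is not reproduced here.
from collections import defaultdict


def partition_by_label(samples):
    """Group (features, label) pairs so each generator sees one class only."""
    shards = defaultdict(list)
    for features, label in samples:
        shards[label].append(features)
    return dict(shards)


def assign_labels_to_workers(labels, n_workers):
    """Round-robin mapping of class labels to worker ranks (e.g. GPUs)."""
    return {label: i % n_workers for i, label in enumerate(sorted(labels))}


# Toy dataset: feature vectors tagged with class labels 0..2.
data = [([0.1, 0.2], 0), ([0.3, 0.1], 1), ([0.5, 0.4], 0), ([0.9, 0.7], 2)]
shards = partition_by_label(data)
placement = assign_labels_to_workers(shards.keys(), n_workers=2)
# Each worker now trains its own W-CGAN generator on shards[label] alone,
# which is what removes inter-process communication during training.
```

Because each generator's data shard is disjoint by construction, workers never exchange model state during training, which is the property the abstract credits for the scalability of the method.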
- Research Organization:
- Oak Ridge National Laboratory (ORNL), Oak Ridge, TN (United States)
- Sponsoring Organization:
- USDOE Laboratory Directed Research and Development (LDRD) Program
- Grant/Contract Number:
- AC05-00OR22725
- OSTI ID:
- 1908079
- Journal Information:
- Journal of Supercomputing, Vol. 79, Issue 2; ISSN 0920-8542
- Publisher:
- Springer
- Country of Publication:
- United States
- Language:
- English
Similar Records
- Stable Parallel Training of Wasserstein Conditional Generative Adversarial Neural Networks: Full/Regular Research Paper submission for the symposium CSCI-ISAI: Artificial Intelligence · Conference · Nov 30, 2021 · OSTI ID: 1877492
- Scalable balanced training of conditional generative adversarial neural networks on image data · Journal Article · Journal of Supercomputing · Apr 26, 2021 · OSTI ID: 1783019
- Evolutionary Architecture Search for Generative Adversarial Networks Based on Weight Sharing · Journal Article · IEEE Transactions on Evolutionary Computation · Nov 30, 2023 · OSTI ID: 2429864