Scalable balanced training of conditional generative adversarial neural networks on image data
Journal Article · Journal of Supercomputing
- Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
- Politecnico di Milano (Italy)
- Anthem, Inc., Atlanta, GA (United States)
Here, we propose a distributed approach to training deep convolutional conditional generative adversarial network (DC-CGAN) models. Our method reduces the imbalance between generator and discriminator by partitioning the training data according to data labels, and it enhances scalability through parallel training in which multiple generators are trained concurrently, each focusing on a single data label. Performance is assessed in terms of inception score, Fréchet inception distance, and image quality on the MNIST, CIFAR10, CIFAR100, and ImageNet1k datasets, showing a significant improvement over state-of-the-art techniques for training DC-CGANs. Weak scaling is attained on all four datasets using up to 1000 processes and 2000 NVIDIA V100 GPUs on the OLCF supercomputer Summit.
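The core idea from the abstract — split the training set by class label and assign each label's partition to its own concurrently trained generator — can be sketched in a few lines. This is an illustrative sketch only; the helper names (`partition_by_label`, `assign_partitions`) and the round-robin rank assignment are assumptions for the example, not the paper's actual implementation.

```python
from collections import defaultdict

def partition_by_label(samples):
    """Group (image, label) pairs into one partition per label."""
    partitions = defaultdict(list)
    for image, label in samples:
        partitions[label].append(image)
    return dict(partitions)

def assign_partitions(partitions, n_workers):
    """Map each label's partition to a worker rank (round-robin here),
    so the generators for different labels can train concurrently."""
    assignment = {}
    for i, label in enumerate(sorted(partitions)):
        assignment[label] = i % n_workers
    return assignment

# Toy example: 3 labels, 3 concurrent generator workers.
data = [("img0", 0), ("img1", 1), ("img2", 0), ("img3", 2)]
parts = partition_by_label(data)
ranks = assign_partitions(parts, n_workers=3)
```

Because each generator only ever sees samples of its own label, its task is narrower than that of the shared discriminator, which is the mechanism the paper uses to rebalance the adversarial game while exposing label-level parallelism.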
- Research Organization:
- Oak Ridge National Laboratory (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility (OLCF)
- Sponsoring Organization:
- USDOE Laboratory Directed Research and Development (LDRD) Program; USDOE Office of Science (SC)
- Grant/Contract Number:
- AC05-00OR22725
- OSTI ID:
- 1783019
- Journal Information:
- Journal of Supercomputing, Vol. 77; ISSN 0920-8542
- Publisher:
- Springer
- Country of Publication:
- United States
- Language:
- English
Similar Records
- Stable Parallel Training of Wasserstein Conditional Generative Adversarial Neural Networks: Full/Regular Research Paper submission for the symposium CSCI-ISAI: Artificial Intelligence · Conference · November 30, 2021 · OSTI ID: 1877492
- Stable parallel training of Wasserstein conditional generative adversarial neural networks · Journal Article · Journal of Supercomputing · August 2, 2022 · OSTI ID: 1908079
- Anderson Acceleration for Distributed Training of Deep Learning Models · Conference · February 28, 2022 · OSTI ID: 1866678