
Correspondence between neuroevolution and gradient descent

Journal Article · Nature Communications

Abstract

We show analytically that training a neural network by conditioned stochastic mutation, or neuroevolution, of its weights is equivalent, in the limit of small mutations, to gradient descent on the loss function in the presence of Gaussian white noise. Averaged over independent realizations of the learning process, the noise vanishes and neuroevolution is equivalent to gradient descent on the loss function alone. We use numerical simulation to show that this correspondence can be observed for finite mutations, both for shallow and for deep neural networks. Our results provide a connection between two families of neural-network training methods that are usually considered to be fundamentally different.
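
The claimed correspondence is easy to probe numerically. The sketch below is a minimal illustration, not the paper's own code or setup: it assumes a toy linear model with mean-squared loss and conditions Gaussian weight mutations of scale sigma on the loss through a Metropolis-style acceptance rule, accepting a mutation with probability min(1, exp(-ΔL/T)). For this rule the small-mutation limit predicts a mean weight update ⟨Δw⟩ ≈ −(σ²/2T) ∇L(w), i.e., gradient descent with learning rate σ²/2T plus noise that averages to zero over realizations. The script estimates the mean update by brute force and compares it with that prediction; all parameter values are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Toy "network": a linear model with mean-squared loss. Illustrative only.
X = rng.normal(size=(200, 6))
w_true = rng.normal(size=6)
y = X @ w_true

def loss(w):
    return 0.5 * np.mean((X @ w - y) ** 2)

def grad(w):
    # Gradient of the loss above: X^T (X w - y) / n.
    return X.T @ (X @ w - y) / len(y)

sigma = 1e-3        # scale of the Gaussian weight mutations (assumed)
T = 1e-2            # temperature of the Metropolis acceptance rule (assumed)
n = 200_000         # independent realizations of a single mutation step

w = w_true + 0.5 * rng.normal(size=6)   # a generic point on the loss surface
L0 = loss(w)

# One conditioned-mutation step, repeated n times: propose w -> w + eps and
# accept with probability min(1, exp(-dL / T)). The plain loop keeps the
# memory footprint small.
eps = rng.normal(scale=sigma, size=(n, 6))
dL = np.array([loss(w + e) for e in eps]) - L0
accept = rng.random(n) < np.exp(np.minimum(0.0, -dL / T))

# Mean weight change per attempted step (rejected steps contribute zero),
# compared against the gradient-descent drift of the small-mutation limit.
measured = (eps * accept[:, None]).mean(axis=0)
predicted = -(sigma**2 / (2.0 * T)) * grad(w)

with np.printoptions(precision=2):
    print("measured drift: ", measured)
    print("predicted drift:", predicted)

With these values the two vectors should agree component by component up to sampling noise and a finite-mutation bias of order σ|∇L|/T (roughly ten percent here); shrinking sigma, with correspondingly more samples, tightens the match, which is the content of the small-mutation limit.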

Sponsoring Organization:
USDOE
Grant/Contract Number:
AC02-05CH11231
OSTI ID:
1828630
Alternate ID(s):
OSTI ID: 1855195
Journal Information:
Nature Communications, Vol. 12, Issue 1; ISSN 2041-1723
Publisher:
Nature Publishing Group
Country of Publication:
United Kingdom
Language:
English

Similar Records

Using the Metropolis algorithm to explore the loss surface of a recurrent neural network
Journal Article · December 2024 · Journal of Chemical Physics · OSTI ID: 2574501

Convergence of Hyperbolic Neural Networks Under Riemannian Stochastic Gradient Descent
Journal Article · October 2023 · Communications on Applied Mathematics and Computation · OSTI ID: 2007675