DOE PAGES, U.S. Department of Energy
Office of Scientific and Technical Information

Title: An enhanced artificial neural network with a shuffled complex evolutionary global optimization with principal component analysis

Abstract

The classical Back-Propagation (BP) scheme with gradient-based optimization for training Artificial Neural Networks (ANNs) suffers from several drawbacks, such as premature convergence and a tendency to become trapped in local optima. Therefore, as alternatives to BP and other gradient-based optimization schemes, various Evolutionary Algorithms (EAs), e.g., Particle Swarm Optimization (PSO), the Genetic Algorithm (GA), Simulated Annealing (SA), and Differential Evolution (DE), have gained popularity for ANN weight training. This study applies the new, efficient, and effective Shuffled Complex Evolutionary Global Optimization Algorithm with Principal Component Analysis – University of California Irvine (SP-UCI) to the weight-training process of a three-layer feed-forward ANN. A large-scale numerical comparison is conducted among SP-UCI-, PSO-, GA-, SA-, and DE-based ANNs on 17 benchmark, complex, and real-world datasets. The results show that the SP-UCI-based ANN outperforms the other EA-based ANNs in both convergence and generalization, suggesting that the SP-UCI algorithm has good potential for supporting ANN weight training in real-world problems. In addition, the suitability of different kinds of EAs for training ANNs is discussed. The large-scale comparison experiments conducted in this paper serve as a fundamental reference for selecting a proper ANN weight-training algorithm in practice.
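The abstract's core idea, replacing gradient-based back-propagation with an evolutionary algorithm that searches the network's weight space directly, can be sketched as follows. This is an illustrative toy using a plain Differential Evolution loop (DE is one of the comparison baselines in the paper), not the authors' SP-UCI algorithm; the XOR task, network size, and all hyperparameters are arbitrary choices for the demo.

```python
import numpy as np

# Toy: train a 2-3-1 feed-forward ANN on XOR by evolving its weight vector
# with a simplified DE/rand/1/bin loop instead of back-propagation.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

N_IN, N_HID = 2, 3
DIM = N_IN * N_HID + N_HID + N_HID + 1  # all weights and biases, flattened

def forward(w, X):
    """Unpack the flat weight vector and run the network."""
    i = 0
    W1 = w[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = w[i:i + N_HID]; i += N_HID
    W2 = w[i:i + N_HID]; i += N_HID
    b2 = w[i]
    h = np.tanh(X @ W1 + b1)                     # hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output

def mse(w):
    return np.mean((forward(w, X) - y) ** 2)

# Differential Evolution: mutate with a scaled difference of two random
# members, crossover with the current member, keep the better of the two.
NP, F, CR = 30, 0.7, 0.9
pop = rng.uniform(-2, 2, size=(NP, DIM))
fit = np.array([mse(w) for w in pop])

for gen in range(500):
    for i in range(NP):
        idx = [j for j in range(NP) if j != i]
        a, b, c = pop[rng.choice(idx, 3, replace=False)]
        trial = np.where(rng.random(DIM) < CR, a + F * (b - c), pop[i])
        f = mse(trial)
        if f < fit[i]:          # greedy selection
            pop[i], fit[i] = trial, f

best = pop[np.argmin(fit)]
print("best MSE:", fit.min())
print("predictions:", np.round(forward(best, X), 2))
```

The fitness function treats the whole weight vector as a black box, so no gradients are ever computed; this is exactly why population-based methods avoid the local-minimum traps the abstract attributes to BP, at the cost of many more function evaluations.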

Authors:
 Yang, Tiantian [1]; Asanjan, Ata Akbari [2]; Faridzad, Mohammad [2]; Hayatbini, Negin [2]; Gao, Xiaogang [2]; Sorooshian, Soroosh [2]
  1. University of California, Berkeley, CA (United States); Deltares USA Inc. Silver Spring, MD (United States)
  2. University of California, Berkeley, CA (United States)
Publication Date:
August 4, 2017
Research Org.:
University of California, Berkeley, CA (United States)
Sponsoring Org.:
USDOE Office of International Affairs (IA); California Energy Commission; MASEEH Fellowship; National Science Foundation (NSF); National Oceanic and Atmospheric Administration (NOAA); Army Research Office (ARO)
OSTI Identifier:
1538382
Alternate Identifier(s):
OSTI ID: 1549846
Grant/Contract Number:  
IA0000018; CCF-1331915; NA09NES4400006; W911NF-11-1-0422
Resource Type:
Accepted Manuscript
Journal Name:
Information Sciences
Additional Journal Information:
Journal Volume: 418-419; Journal Issue: C; Journal ID: ISSN 0020-0255
Publisher:
Elsevier
Country of Publication:
United States
Language:
English
Subject:
96 KNOWLEDGE MANAGEMENT AND PRESERVATION; 97 MATHEMATICS AND COMPUTING; SP-UCI; evolutionary algorithm; artificial neural networks; weight training; global optimization

Citation Formats

Yang, Tiantian, Asanjan, Ata Akbari, Faridzad, Mohammad, Hayatbini, Negin, Gao, Xiaogang, and Sorooshian, Soroosh. An enhanced artificial neural network with a shuffled complex evolutionary global optimization with principal component analysis. United States: N. p., 2017. Web. doi:10.1016/j.ins.2017.08.003.
Yang, Tiantian, Asanjan, Ata Akbari, Faridzad, Mohammad, Hayatbini, Negin, Gao, Xiaogang, & Sorooshian, Soroosh. An enhanced artificial neural network with a shuffled complex evolutionary global optimization with principal component analysis. United States. https://doi.org/10.1016/j.ins.2017.08.003
Yang, Tiantian, Asanjan, Ata Akbari, Faridzad, Mohammad, Hayatbini, Negin, Gao, Xiaogang, and Sorooshian, Soroosh. 2017. "An enhanced artificial neural network with a shuffled complex evolutionary global optimization with principal component analysis". United States. https://doi.org/10.1016/j.ins.2017.08.003. https://www.osti.gov/servlets/purl/1538382.
@article{osti_1538382,
title = {An enhanced artificial neural network with a shuffled complex evolutionary global optimization with principal component analysis},
author = {Yang, Tiantian and Asanjan, Ata Akbari and Faridzad, Mohammad and Hayatbini, Negin and Gao, Xiaogang and Sorooshian, Soroosh},
abstractNote = {The classical Back-Propagation (BP) scheme with gradient-based optimization for training Artificial Neural Networks (ANNs) suffers from several drawbacks, such as premature convergence and a tendency to become trapped in local optima. Therefore, as alternatives to BP and other gradient-based optimization schemes, various Evolutionary Algorithms (EAs), e.g., Particle Swarm Optimization (PSO), the Genetic Algorithm (GA), Simulated Annealing (SA), and Differential Evolution (DE), have gained popularity for ANN weight training. This study applies the new, efficient, and effective Shuffled Complex Evolutionary Global Optimization Algorithm with Principal Component Analysis – University of California Irvine (SP-UCI) to the weight-training process of a three-layer feed-forward ANN. A large-scale numerical comparison is conducted among SP-UCI-, PSO-, GA-, SA-, and DE-based ANNs on 17 benchmark, complex, and real-world datasets. The results show that the SP-UCI-based ANN outperforms the other EA-based ANNs in both convergence and generalization, suggesting that the SP-UCI algorithm has good potential for supporting ANN weight training in real-world problems. In addition, the suitability of different kinds of EAs for training ANNs is discussed. The large-scale comparison experiments conducted in this paper serve as a fundamental reference for selecting a proper ANN weight-training algorithm in practice.},
doi = {10.1016/j.ins.2017.08.003},
journal = {Information Sciences},
number = {C},
volume = {418-419},
place = {United States},
year = {2017},
month = {aug}
}


Citation Metrics:
Cited by: 78 works
Citation information provided by
Web of Science


Works referenced in this record:

A Comparison of Genetic Algorithms, Particle Swarm Optimization and the Differential Evolution Method for the Design of Scannable Circular Antenna Arrays
journal, January 2009

  • Panduro, Marco A.; Brizuela, Carlos A.; Balderas, Luz I.
  • Progress In Electromagnetics Research B, Vol. 13
  • DOI: 10.2528/PIERB09011308

Genetic Algorithms Compared to Other Techniques for Pipe Optimization
journal, July 1994


Backpropagation neural networks
journal, February 1993


Improving the multi-objective evolutionary optimization algorithm for hydropower reservoir operations in the California Oroville–Thermalito complex
journal, July 2015


Developing reservoir monthly inflow forecasts using artificial intelligence and climate phenomenon information: RESERVOIR INFLOW FORECASTS
journal, April 2017

  • Yang, Tiantian; Asanjan, Ata Akbari; Welles, Edwin
  • Water Resources Research, Vol. 53, Issue 4
  • DOI: 10.1002/2017WR020482

Simulated annealing: A tool for operational research
journal, June 1990


Differential Evolution – A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces
journal, January 1997

  • Storn, Rainer; Price, Kenneth
  • Journal of Global Optimization, Vol. 11, Issue 4, p. 341-359
  • DOI: 10.1023/A:1008202821328

An introduction to computing with neural nets
journal, January 1987


Simulated annealing: a proof of convergence
journal, June 1994

  • Granville, V.; Krivanek, M.; Rasson, J. -P.
  • IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 16, Issue 6
  • DOI: 10.1109/34.295910

Artificial neural networks as a tool in ecological modelling, an introduction
journal, August 1999


Statistical pattern recognition: a review
journal, January 2000

  • Jain, A. K.; Duin, P. W.
  • IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 22, Issue 1
  • DOI: 10.1109/34.824819

No free lunch theorems for optimization
journal, April 1997

  • Wolpert, D. H.; Macready, W. G.
  • IEEE Transactions on Evolutionary Computation, Vol. 1, Issue 1
  • DOI: 10.1109/4235.585893

Evaluating the streamflow simulation capability of PERSIANN-CDR daily rainfall products in two river basins on the Tibetan Plateau
journal, January 2017

  • Liu, Xiaomang; Yang, Tiantian; Hsu, Koulin
  • Hydrology and Earth System Sciences, Vol. 21, Issue 1
  • DOI: 10.5194/hess-21-169-2017

Evolving artificial neural networks
journal, January 1999


Back-propagation algorithm which varies the number of hidden units
journal, January 1991


A Simplex Method for Function Minimization
journal, January 1965


Neural networks for the prediction and forecasting of water resources variables: a review of modelling issues and applications
journal, January 2000


A new evolutionary search strategy for global optimization of high-dimensional problems
journal, November 2011


Cluster-Based Population Initialization for differential evolution frameworks
journal, March 2015


Optimization by simulated annealing: Quantitative studies
journal, March 1984

  • Kirkpatrick, Scott
  • Journal of Statistical Physics, Vol. 34, Issue 5-6
  • DOI: 10.1007/BF01009452

Evolutionary artificial neural networks: a review
journal, June 2011


Genetic algorithms and neural networks: optimizing connections and connectivity
journal, August 1990


Seeker optimization algorithm for tuning the structure and parameters of neural networks
journal, February 2011


An Optimization Spiking Neural P System for Approximately Solving Combinatorial Optimization Problems
journal, May 2014

  • Zhang, Gexiang; Rong, Haina; Neri, Ferrante
  • International Journal of Neural Systems, Vol. 24, Issue 05
  • DOI: 10.1142/S0129065714400061

Effective and efficient global optimization for conceptual rainfall-runoff models
journal, April 1992

  • Duan, Qingyun; Sorooshian, Soroosh; Gupta, Vijai
  • Water Resources Research, Vol. 28, Issue 4
  • DOI: 10.1029/91WR02985

Backpropagation Learning for Multilayer Feed-Forward Neural Networks Using the Conjugate Gradient Method
journal, January 1991

  • Johansson, E. M.; Dowla, F. U.; Goodman, D. M.
  • International Journal of Neural Systems, Vol. 02, Issue 04
  • DOI: 10.1142/S0129065791000261