The Dependence Identification Neural Network Construction Algorithm
John O. Moody and Panos J. Antsaklis

Summary: The Dependence Identification Neural Network Construction Algorithm
John O. Moody and Panos J. Antsaklis
Department of Electrical Engineering
University of Notre Dame
Notre Dame, IN 46556
Abstract
An algorithm for constructing and training multilayer neural networks, dependence identification, is presented in this paper. Its distinctive features are that (i) it transforms the training problem into a set of quadratic optimization problems that are solved by a number of linear equations and (ii) it constructs an appropriate network to meet the training specifications. The architecture and network weights produced by the algorithm can also be used as initial conditions for further on-line training by backpropagation or a similar iterative gradient descent training algorithm if necessary. In addition to constructing an appropriate network based on training data, the dependence identification algorithm significantly speeds up learning in feedforward multilayer neural networks compared to standard backpropagation.
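To make point (i) of the abstract concrete, the sketch below shows how the weights of a single layer can be obtained from a quadratic (least-squares) problem solved by linear equations rather than by iterative descent. This is only an illustration of that idea, not the paper's dependence identification procedure; the toy data, the fixed random hidden layer, and all sizes are assumptions made for the example.

```python
# Minimal sketch (not the paper's exact algorithm): fitting one layer's weights
# by solving a quadratic optimization problem with a single linear solve.
# Given hidden activations H (n_samples x n_hidden) and targets Y, the weights
# W minimizing ||H W - Y||^2 satisfy the normal equations H^T H W = H^T Y,
# which numpy's least-squares routine solves directly.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (illustrative only).
X = rng.standard_normal((200, 4))
Y = np.sin(X @ rng.standard_normal((4, 1)))

# A fixed random hidden layer (a hypothetical choice for this sketch).
W_hidden = rng.standard_normal((4, 16))
H = np.tanh(X @ W_hidden)

# Output weights from one linear solve instead of iterative gradient descent.
W_out, *_ = np.linalg.lstsq(H, Y, rcond=None)

mse = np.mean((H @ W_out - Y) ** 2)
print(f"training MSE after one linear solve: {mse:.4f}")
```

Because the layer-fitting step reduces to linear algebra, it runs in a fixed, predictable amount of work, which is the source of the speed advantage claimed over iterative training of the same weights.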
1 Introduction
The main tool for training multilayer neural networks is gradient descent, as used by the backpropagation (BP) algorithm developed by Rumelhart [7]. Gradient descent algorithms are susceptible to local minima, sensitive to initial conditions, and slow to converge. Gradient descent can work quite well with an appropriate set of initial conditions and a proper network architecture, but using random initial conditions and guessing at the network architecture usually leads to a slow and ponderous training process. The de-
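For contrast with the linear-solve sketch above, the following is a minimal example of the iterative gradient-descent (backpropagation-style) training that the introduction describes. The one-hidden-layer architecture, learning rate, and epoch count are illustrative assumptions, not values from the paper; starting from random initial weights, many small update steps are needed.

```python
# Minimal sketch of iterative gradient-descent training for a one-hidden-layer
# network, the kind of procedure the paper contrasts with dependence
# identification. All sizes and hyperparameters are assumed for illustration.
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 4))
Y = np.sin(X @ rng.standard_normal((4, 1)))

W1 = rng.standard_normal((4, 16)) * 0.1   # random initial conditions
W2 = rng.standard_normal((16, 1)) * 0.1
lr = 0.05

for epoch in range(500):
    H = np.tanh(X @ W1)                   # forward pass through hidden layer
    Y_hat = H @ W2                        # linear output layer
    err = Y_hat - Y
    # Backpropagate the squared-error gradient through both layers.
    grad_W2 = H.T @ err / len(X)
    grad_H = err @ W2.T * (1 - H ** 2)    # derivative of tanh is 1 - tanh^2
    grad_W1 = X.T @ grad_H / len(X)
    W1 -= lr * grad_W1
    W2 -= lr * grad_W2

print(f"final MSE after iterative training: {np.mean(err ** 2):.4f}")
```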

  

Source: Antsaklis, Panos - Department of Electrical Engineering, University of Notre Dame

 

Collections: Engineering