Summary: IEEE Transactions on Neural Networks, Vol. 8, No. 2, March 1997, p. 448
How Initial Conditions Affect Generalization Performance in Large Networks
Amir Atiya and Chuanyi Ji
Abstract-- Generalization is one of the most important problems in neural-network research. It is influenced by several factors in the network design, such as network size and the weight decay factor, among others. We show here that the initial weight distribution (for gradient descent training algorithms) is another factor that influences generalization. The initial conditions guide the training algorithm to search particular regions of the weight space. For instance, small initial weights tend to result in low-complexity networks and can therefore effectively act as a regularizer. We propose a novel network complexity measure, which helps shed light on this phenomenon as well as on other aspects of generalization.
I. INTRODUCTION
Generalization is one of the most studied topics in neural networks. It is well known that generalization depends on three factors: the degrees of freedom of the network (related to the number of weights), the amount of noise in the training set, and the size of the training set.
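The abstract's central claim, that small initial weights steer gradient descent toward low-complexity solutions and so act like a regularizer, can be illustrated with a toy experiment. The sketch below is not from the paper; it is a minimal, hypothetical setup that trains the same one-hidden-layer tanh network from two initial weight scales on a small noisy regression task and reports test error and squared weight norm (a crude complexity proxy, not the complexity measure proposed by the authors). The data-generating function, network size, and hyperparameters are assumptions chosen only for illustration.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy regression task: noisy sine with a small training set, so overfitting is possible.
    def make_data(n):
        x = rng.uniform(-3, 3, size=(n, 1))
        y = np.sin(x) + 0.3 * rng.normal(size=(n, 1))
        return x, y

    x_train, y_train = make_data(30)
    x_test, y_test = make_data(500)

    def train(init_scale, hidden=50, lr=0.01, steps=5000):
        """Full-batch gradient descent on a one-hidden-layer tanh network (MSE loss)."""
        W1 = init_scale * rng.normal(size=(1, hidden))
        b1 = np.zeros(hidden)
        W2 = init_scale * rng.normal(size=(hidden, 1))
        b2 = np.zeros(1)
        n = len(x_train)
        for _ in range(steps):
            # Forward pass
            h = np.tanh(x_train @ W1 + b1)
            pred = h @ W2 + b2
            err = pred - y_train
            # Backward pass (gradients of mean squared error)
            dW2 = h.T @ err * (2 / n)
            db2 = err.mean(0) * 2
            dh = err @ W2.T * (1 - h ** 2)
            dW1 = x_train.T @ dh * (2 / n)
            db1 = dh.mean(0) * 2
            W1 -= lr * dW1; b1 -= lr * db1
            W2 -= lr * dW2; b2 -= lr * db2
        # Test error and squared weight norm as a rough complexity proxy
        h = np.tanh(x_test @ W1 + b1)
        test_mse = np.mean((h @ W2 + b2 - y_test) ** 2)
        norm = np.sum(W1 ** 2) + np.sum(W2 ** 2)
        return test_mse, norm

    for scale in (0.05, 2.0):
        mse, norm = train(scale)
        print(f"init scale {scale:4.2f}: test MSE {mse:.3f}, weight norm {norm:.1f}")

Under these assumptions, the run starting from the smaller initialization typically ends with both a smaller weight norm and a lower test error, consistent with the regularization-like effect of small initial weights described in the abstract.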


Source: Atiya, Amir - Computer Engineering Department, Cairo University

 

Collections: Computer Technologies and Information Sciences