IEEE TRANSACTIONS ON NEURAL NETWORKS, VOL. 10, NO. 6, NOVEMBER 1999, p. 1502
The "Weight Smoothing" Regularization of MLP for Jacobian Stabilization
 

Summary:
Filipe Aires, Michel Schmitt, Alain Chedin, and Noëlle Scott
Abstract-- In an approximation problem with a neural network, a low output root mean square (rms) error is not always a sufficient criterion. In this paper, we investigate problems where the Jacobians -- the first derivatives of an output value with respect to an input value -- of the approximation model are needed, and we propose to add a quality criterion on these Jacobians during the learning step. More specifically, we focus here on the approximation of functionals A, from a space of continuous functions (discretized in practice) to a scalar space. In this case, the approximation is confronted with the compensation phenomenon: a lower contribution from one input can be compensated by a larger contribution from its neighboring inputs. The profiles (with respect to the input index) of the neural Jacobians are then very irregular instead of smooth, and the approximation of A becomes an ill-posed problem because many solutions can be chosen by the learning process. We propose to introduce the smoothness of the Jacobian profiles as a priori information via a regularization
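To make the quantities in the abstract concrete, here is a minimal sketch of the analytic Jacobian of a one-hidden-layer MLP and a first-difference roughness measure on its profile (the kind of quantity a smoothness regularizer would penalize). All names (W1, b1, w2, jacobian_roughness) are illustrative assumptions, not the paper's notation or implementation.

```python
import numpy as np

# Hypothetical one-hidden-layer MLP with scalar output:
#   y(x) = w2 . tanh(W1 x + b1)
rng = np.random.default_rng(0)
n_in, n_hid = 8, 5
W1 = rng.normal(size=(n_hid, n_in))
b1 = rng.normal(size=n_hid)
w2 = rng.normal(size=n_hid)

def jacobian(x):
    """Analytic Jacobian dy/dx_k of the scalar output w.r.t. each input.

    dy/dx_k = sum_j w2_j * (1 - tanh(h_j)^2) * W1_jk,  h = W1 x + b1
    """
    h = W1 @ x + b1
    return (w2 * (1.0 - np.tanh(h) ** 2)) @ W1   # shape: (n_in,)

def jacobian_roughness(x):
    """Sum of squared first differences of the Jacobian profile along the
    input index: large when the profile is irregular, small when smooth."""
    J = jacobian(x)
    return float(np.sum(np.diff(J) ** 2))

x = rng.normal(size=n_in)
print(jacobian(x).shape)   # (8,)
print(jacobian_roughness(x) >= 0.0)   # True
```

Adding a term like `jacobian_roughness` to the training loss, averaged over the training inputs, is one way to express the "smooth Jacobian profile" prior the abstract describes; the paper's actual regularizer acts on the weights themselves.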


Source: Aires, Filipe - Laboratoire de Météorologie Dynamique du CNRS, Université Pierre-et-Marie-Curie, Paris 6
Goddard Institute for Space Studies (NASA)

 

Collections: Environmental Sciences and Ecology; Geosciences