
In International Conference on Artificial Intelligence (IC-AI'99), June 28 - July 1, 1999, Las Vegas.

RE-VISITING BACKPROPAGATION NETWORK OPTIMIZATION: TOWARDS MAXIMALLY PRUNED NETWORKS

A Goh and SSR Abidi
USM Computer Sciences
11800 Penang, Malaysia
Email: alwyn@cs.usm.my
Abstract
Backpropagation (BP) Neural Network (NN) error functions enable the mapping of data vectors to user-defined classifications by driving weight-matrix modifications so as to reduce classification error over the training data set. Conventional BP error functions are usually only implicitly dependent on the weight matrix; however, an explicit penalty term can be added so as to force numerically insignificant weights closer to zero. In our investigation, BP training is undertaken as a prelude to a pruning stage that selectively removes functionally unimportant weights.
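The two ideas in the abstract, an explicit weight-dependent penalty added to the BP error function and a subsequent magnitude-based pruning pass, can be sketched as follows. This is a hypothetical illustration, not the authors' exact formulation: the specific penalty function, threshold, and pruning criterion are assumptions (here a standard weight-elimination penalty and a simple magnitude threshold).

```python
import numpy as np

def penalized_error(y_true, y_pred, weights, lam=0.01, w0=1.0):
    """Classification error plus an explicit penalty on the weight matrix.

    The weight-elimination term costs roughly (w/w0)^2 for small weights
    and saturates near 1 for large ones, so gradient descent drives
    numerically insignificant weights toward zero while leaving large,
    functionally important weights mostly untouched.
    """
    mse = np.mean((y_true - y_pred) ** 2)
    scaled_sq = (weights / w0) ** 2
    penalty = np.sum(scaled_sq / (1.0 + scaled_sq))
    return mse + lam * penalty

def prune(weights, threshold=0.05):
    """Remove (zero out) weights whose magnitude fell below threshold
    after penalized training; returns pruned weights and a keep-mask."""
    mask = np.abs(weights) >= threshold
    return weights * mask, mask

# After penalized BP training, small weights cluster near zero and are cut:
w = np.array([0.9, -0.02, 0.4, 0.001, -0.7])
pruned, mask = prune(w)
print(pruned)      # the two insignificant weights are zeroed
print(mask.sum())  # number of surviving connections
```

The penalty keeps the pruning step safe: because training has already pushed unneeded weights toward zero, thresholding removes connections that contribute little to the learned mapping.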


Source: Abidi, Syed Sibte Raza - Faculty of Computer Science, Dalhousie University


Collections: Computer Technologies and Information Sciences