A Growing and Pruning Method for Radial Basis Function Networks
M. Bortman, M. Aladjem
Department of Electrical and Computer Engineering, Ben-Gurion University of the Negev
P.O. Box 653, Beer-Sheva, 84105, Israel.
Abstract—A recently published learning algorithm, GGAP, for radial basis function
(RBF) neural networks is studied and modified. GGAP is a growing and pruning
algorithm, meaning that a network unit that consistently makes little contribution to
the network's performance can be removed during training. GGAP states a formula for
computing the significance of the network units, which requires a d-fold numerical
integration for an arbitrary probability density function p(x) of the input data x
(x ∈ R^d). In this work the GGAP formula is approximated using a Gaussian mixture
model (GMM) for p(x), and an analytical solution of the approximated unit significance
is derived. This makes it possible to employ the modified GGAP for input data having a
complex and high-dimensional p(x), which was not possible in the original GGAP. The
results of an extensive experimental study show that the modified algorithm
outperforms the original GGAP, achieving both a lower prediction error and reduced
complexity of the trained network.
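The analytical solution referred to above is not spelled out in the abstract, but the key ingredient is a standard identity: the expectation of a Gaussian RBF activation under a Gaussian mixture density has a closed form, since a Gaussian-times-Gaussian integral is again Gaussian. The following sketch illustrates this identity only; the function name and parameters are illustrative assumptions, not the paper's notation, and the paper's actual significance formula may weight this expectation differently.

```python
import numpy as np
from scipy.stats import multivariate_normal

def analytic_rbf_expectation(mu, sigma2, weights, means, covs):
    """E_p[phi(x)] for a Gaussian RBF phi(x) = exp(-||x - mu||^2 / (2*sigma2))
    under a GMM p(x) = sum_k w_k N(x; m_k, S_k), computed in closed form.

    Identity used: exp(-||x-mu||^2/(2*s2)) = (2*pi*s2)^(d/2) * N(x; mu, s2*I),
    and the integral of two Gaussian densities is N(mu; m_k, s2*I + S_k).
    """
    d = len(mu)
    total = 0.0
    for w, m, S in zip(weights, means, covs):
        total += w * (2 * np.pi * sigma2) ** (d / 2) * multivariate_normal.pdf(
            mu, mean=m, cov=sigma2 * np.eye(d) + S)
    return total

# Illustrative usage: a unit centered at mu under a 2-component GMM in R^2.
mu = np.array([0.5, -0.2])
sigma2 = 0.8
weights = [0.6, 0.4]
means = [np.array([0.0, 0.0]), np.array([1.0, -1.0])]
covs = [np.eye(2) * 0.5, np.eye(2) * 0.3]
expectation = analytic_rbf_expectation(mu, sigma2, weights, means, covs)
```

Because the expectation is exact in closed form, no d-fold numerical integration is needed; this is what lets the modified algorithm scale to high-dimensional input densities.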
Index Terms—Radial basis function neural networks, sequential function
