 
Summary: The Equivalence of Support Vector Machine and
Regularization Neural Networks
PETER ANDRAS
Department of Psychology, University of Newcastle upon Tyne,
Newcastle upon Tyne, NE1 7RU, UK. email: peter.andras@ncl.ac.uk
Abstract. In this brief paper we show the equivalence of the support vector machine and
regularization neural networks. We prove both directions of the equivalence in a generally
applicable way. The novelty lies in the effective construction of the regularization operator
corresponding to a given support vector machine formulation. We also give a short introductory
description of both neural network approximation frameworks.
Key words: approximation, equivalent neural networks, regularization, support vector machine
1. Introduction
Recent papers [2, 3, 5, 7, 8, 10, 11] show that the two top-level neural network
approximation frameworks are the application of support vector machine theory
and regularization theory to neural networks. Both frameworks fall into the general
category of Bayesian neural network methods that are based on optimization of
neural networks in the context of some prior distribution over the parameter space
of neural networks [1, 13].
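As a standard illustration of the regularization framework (the notation here is the conventional one from the regularization literature, not taken from this paper), one seeks the function that minimizes a functional of the form

\[
H[f] \;=\; \sum_{i=1}^{N} \bigl(y_i - f(x_i)\bigr)^2 \;+\; \lambda \,\lVert P f \rVert^2 ,
\]

where $\{(x_i, y_i)\}_{i=1}^{N}$ are the training data, $P$ is the regularization operator, and $\lambda > 0$ is the regularization parameter. The support vector machine formulation replaces the squared-error term with Vapnik's $\varepsilon$-insensitive loss; the equivalence discussed in this paper relates the choice of $P$ to the kernel of the corresponding support vector machine.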
In previous work, Smola et al. [10] and Girosi [5] showed that the problem
formulation of regularization neural networks can be transformed into a problem
