The Capacity of Monotonic Functions
Joseph Sill

Computation and Neural Systems Program, California Institute of Technology
email: joe@cs.caltech.edu
Abstract
We consider the class M of monotonically increasing binary-output functions. M
has considerable practical significance in machine learning and pattern recognition
because prior information often suggests a monotonic relationship between input
and output variables. The decision boundaries of monotonic classifiers are compared
and contrasted with those of linear classifiers. M is shown to have infinite VC
dimension, meaning that the VC bounds cannot guarantee generalization independent
of the input distribution. We demonstrate, however, that when the input distribution
is taken into account, the VC bounds become useful, because the annealed VC entropy
of M is modest for many distributions. Techniques for estimating the capacity and
bounding the annealed VC entropy of M given the input distribution are presented
and implemented.
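The quantities in the abstract can be computed directly for small samples. As a hedged sketch (not the paper's own implementation), the Python below counts the dichotomies of a finite point set that are realizable by some monotonically increasing binary function: a labeling is realizable if and only if no point labeled 1 is componentwise dominated by a point labeled 0. It then Monte-Carlo-estimates the annealed VC entropy ln E[#dichotomies] under an assumed input distribution. The function names, the brute-force enumeration, and the uniform distribution on [0,1]^d are illustrative assumptions.

```python
import itertools
import math
import random

def dominates(a, b):
    """True if point a is >= point b in every coordinate."""
    return all(ai >= bi for ai, bi in zip(a, b))

def num_monotone_dichotomies(points):
    """Count labelings of `points` realizable by some monotonically
    increasing binary function. A labeling is realizable iff no point
    labeled 0 dominates a point labeled 1 (brute force; small n only)."""
    n = len(points)
    count = 0
    for labels in itertools.product((0, 1), repeat=n):
        consistent = all(
            not (labels[i] == 1 and labels[j] == 0
                 and dominates(points[j], points[i]))
            for i in range(n) for j in range(n)
        )
        count += consistent
    return count

def annealed_entropy_estimate(n, d, trials=500, rng=None):
    """Monte Carlo estimate of the annealed VC entropy
    ln E[#dichotomies] for n points drawn i.i.d. uniformly from
    [0,1]^d (the uniform distribution is an illustrative choice)."""
    rng = rng or random.Random(0)
    mean = sum(
        num_monotone_dichotomies(
            [tuple(rng.random() for _ in range(d)) for _ in range(n)]
        )
        for _ in range(trials)
    ) / trials
    return math.log(mean)
```

On a 1-D sample the count reduces to the n + 1 threshold dichotomies, while a 2-D antichain (pairwise incomparable points) is fully shattered, which is the mechanism behind the infinite VC dimension; what keeps the annealed entropy modest is that under many distributions large antichains are unlikely in a random sample.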
1 Introduction
Much of learning theory is concerned with measuring the flexibility and approximation
power of various function classes. Concepts such as capacity [1], VC dimension [2] and
effective number of parameters [3] have been developed with this goal in mind.

  

Source: Abu-Mostafa, Yaser S. - Department of Mechanical Engineering & Computer Science, California Institute of Technology

 

Collections: Computer Technologies and Information Sciences