KYBERNETIKA --- VOLUME 34 (1998), NUMBER 4, PAGES 369--374

CASCADING CLASSIFIERS

Ethem Alpaydın
Cenk Kaynak
We propose a multistage recognition method built as a cascade of a linear parametric
model and a k-nearest neighbor (k-NN) nonparametric classifier. The linear model learns a
"rule" and the k-NN learns the "exceptions" rejected by the "rule." Because the rule-learner
handles a large percentage of the examples using a simple and general rule, only a small
subset of the training set is stored as exceptions during training. Similarly, during testing,
most patterns are handled by the rule-learner and few are handled by the exception-learner,
causing only a small increase in memory and computation. A multistage method like
cascading is a better approach than a multiexpert method like voting, where all learners are
used for all cases; the extra computation and memory for the second learner are unnecessary
if we are sufficiently certain that the first one's response is correct. We discuss how such
a system can be trained using cross validation. The method is tested on the real-world
application of handwritten digit recognition.
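The cascade described above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it uses a least-squares linear model as the "rule," a fixed confidence threshold (which the paper would instead tune by cross validation), and a 1-NN fallback over the stored exceptions. All names, the toy data, and the threshold value are assumptions for the sketch.

```python
# Hypothetical sketch of a cascading classifier: a linear "rule" learner
# answers when confident; rejected training examples are stored as
# "exceptions" and handled by a 1-NN exception-learner.
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class data: two well-separated Gaussian blobs (assumed data).
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

# --- Stage 1: linear rule (least-squares fit to +/-1 targets) ---
Xb = np.hstack([X, np.ones((len(X), 1))])          # append bias column
w, *_ = np.linalg.lstsq(Xb, 2 * y - 1, rcond=None)

def rule_score(x):
    """Signed score of the linear rule; magnitude acts as confidence."""
    return np.append(x, 1.0) @ w

THRESH = 0.5  # rejection threshold; the paper tunes this by cross validation

# --- Training: store only the examples the rule rejects ---
rejected = [abs(rule_score(x)) < THRESH for x in X]
exceptions_X = X[rejected]
exceptions_y = y[rejected]

def cascade_predict(x):
    s = rule_score(x)
    if abs(s) >= THRESH or len(exceptions_X) == 0:
        return int(s > 0)                          # rule is confident
    # Otherwise fall back to 1-NN over the stored exceptions.
    nearest = np.argmin(np.linalg.norm(exceptions_X - x, axis=1))
    return int(exceptions_y[nearest])

preds = np.array([cascade_predict(x) for x in X])
print("training accuracy:", (preds == y).mean())
print("exceptions stored:", len(exceptions_X), "of", len(X))
```

Only the exceptions are kept in memory, so the k-NN stage stays small; most test patterns never reach it, which is the memory and computation saving the abstract refers to.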
1. INTRODUCTION
In many applications, a large percentage of the training cases can be explained
by a simple rule with a small number of exceptions. Our previous experience with
handwritten digit recognition [2] shows a small difference in accuracy between linear

  

Source: Alpaydın, Ethem - Department of Computer Engineering, Boğaziçi University

 

Collections: Computer Technologies and Information Sciences