Error minimizing algorithms for nearest neighbor classifiers
- Los Alamos National Laboratory
- Texas A&M
Stack Filters define a large class of discrete nonlinear filters first introduced in image and signal processing for noise removal. In recent years we have suggested their application to classification problems, and investigated their relationship to other types of discrete classifiers such as Decision Trees. In this paper we focus on a continuous-domain version of Stack Filter Classifiers which we call Ordered Hypothesis Machines (OHM), and investigate their relationship to Nearest Neighbor classifiers. We show that OHM classifiers provide a novel framework in which to train Nearest Neighbor type classifiers by minimizing empirical-error-based loss functions. We use the framework to investigate a new cost-sensitive loss function that allows us to train a Nearest Neighbor type classifier for low false alarm rate applications. We report results on both synthetic data and real-world image data.
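The abstract does not spell out the OHM training procedure, but the core idea of a cost-sensitive empirical loss for a Nearest Neighbor type classifier can be sketched. The following is a minimal illustration, not the paper's method: it evaluates a 1-NN classifier under an asymmetric loss in which false alarms cost more than misses, and shows how the cost ratio changes which prototype set is preferred. All data, prototype sets, and cost values are invented for illustration.

```python
# Hedged sketch: cost-sensitive empirical loss for a 1-NN classifier.
# False alarms (predicting 1 on a true 0) are weighted by c_fa; misses
# (predicting 0 on a true 1) by c_miss. Not the paper's OHM algorithm.

def nn_predict(prototypes, x):
    """Label of the nearest prototype (1-D features for simplicity)."""
    return min(prototypes, key=lambda p: abs(p[0] - x))[1]

def empirical_loss(prototypes, data, c_fa=1.0, c_miss=1.0):
    """Asymmetric empirical loss over (feature, label) pairs."""
    loss = 0.0
    for x, y in data:
        y_hat = nn_predict(prototypes, x)
        if y_hat == 1 and y == 0:
            loss += c_fa      # false alarm
        elif y_hat == 0 and y == 1:
            loss += c_miss    # miss
    return loss

# Toy 1-D training set: (feature, label); 0.5 is a noisy negative.
train = [(0.1, 0), (0.4, 0), (0.5, 0),
         (0.45, 1), (0.47, 1), (0.8, 1), (0.9, 1)]

# Two candidate prototype sets with different decision boundaries.
protos_a = [(0.4, 0), (0.45, 1)]   # boundary close to the negatives
protos_b = [(0.4, 0), (0.8, 1)]    # boundary pushed toward positives

# Symmetric cost: protos_a is better (1 error vs 2).
print(empirical_loss(protos_a, train), empirical_loss(protos_b, train))
# Heavy false-alarm cost: the preference flips to protos_b.
print(empirical_loss(protos_a, train, c_fa=5.0),
      empirical_loss(protos_b, train, c_fa=5.0))
```

With symmetric costs the first prototype set makes fewer mistakes, but once false alarms are penalized five-fold, the classifier whose boundary avoids the noisy negative wins, which is the kind of trade-off a low-false-alarm-rate loss is meant to capture.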
- Research Organization:
- Los Alamos National Laboratory (LANL), Los Alamos, NM (United States)
- Sponsoring Organization:
- USDOE
- DOE Contract Number:
- AC52-06NA25396
- OSTI ID:
- 1045407
- Report Number(s):
- LA-UR-11-00020; LA-UR-11-20; TRN: US201215%%19
- Resource Relation:
- Conference: SPIE Electronic Imaging; January 23, 2011; San Francisco, CA
- Country of Publication:
- United States
- Language:
- English