Optimizing 0/1 Loss for Perceptrons by Random Coordinate Descent

Ling Li and Hsuan-Tien Lin
Abstract--The 0/1 loss is an important cost function for perceptrons. Nevertheless, it cannot be easily minimized by most existing perceptron learning algorithms. In this paper, we propose a family of random coordinate descent algorithms to directly minimize the 0/1 loss for perceptrons, and prove their convergence. Our algorithms are computationally efficient, and usually achieve the lowest 0/1 loss compared with other algorithms. Such advantages make them favorable for nonseparable real-world problems. Experiments show that our algorithms are especially useful for ensemble learning, and could achieve the lowest test error for many complex data sets when coupled with

The perceptron was first introduced by Rosenblatt [1] as a probabilistic model for information processing in the brain. It is simply a linear threshold classifier, which can be thought of as a hyperplane in the input space.
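To make the setting concrete, the sketch below shows a perceptron's 0/1 loss (the fraction of misclassified points) and a simplified random-coordinate-descent loop that repeatedly perturbs one randomly chosen weight and keeps the change only if the 0/1 loss drops. This is an illustrative toy, not the paper's actual algorithm: the candidate-step scheme (random Gaussian trial steps, the `epochs` and trial-count parameters) is an assumption for demonstration, whereas the paper's family of algorithms uses a more careful update along each random direction.

```python
import numpy as np

rng = np.random.default_rng(0)

def zero_one_loss(w, X, y):
    # Fraction of points misclassified by the hyperplane w.
    # Labels y are +1/-1; a point on the boundary (sign 0) counts as an error.
    return float(np.mean(np.sign(X @ w) != y))

def random_coordinate_descent(X, y, epochs=200, trials=10):
    # X is assumed to carry a constant bias column; w defines the hyperplane.
    n, d = X.shape
    w = np.zeros(d)
    w[0] = 1.0                       # arbitrary nonzero starting hyperplane
    best = zero_one_loss(w, X, y)
    for _ in range(epochs):
        i = rng.integers(d)          # pick one coordinate at random
        for step in rng.standard_normal(trials):
            w_try = w.copy()
            w_try[i] += step         # perturb only that coordinate
            loss = zero_one_loss(w_try, X, y)
            if loss < best:          # accept only strict improvements,
                w, best = w_try, loss  # so the 0/1 loss never increases
    return w, best
```

Because each update is accepted only when it strictly lowers the 0/1 loss, the loss sequence is monotonically nonincreasing, which is the basic property behind the convergence argument, even on nonseparable data where no hyperplane reaches zero loss.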


Source: Abu-Mostafa, Yaser S. - Department of Mechanical Engineering & Computer Science Department, California Institute of Technology


Collections: Computer Technologies and Information Sciences