U.S. Department of Energy
Office of Scientific and Technical Information

When do support vector machines work fast?

Conference ·
OSTI ID:977540

The authors establish learning rates to the Bayes risk for support vector machines (SVMs) with hinge loss. Since a theorem of Devroye states that no learning algorithm can learn with a uniform rate to the Bayes risk for all probability distributions, the class of considered distributions must be restricted: to obtain fast rates, the authors assume a noise condition recently proposed by Tsybakov together with an approximation condition stated in terms of the distribution and the reproducing kernel Hilbert space used by the SVM. For Gaussian RBF kernels with varying widths, they propose a geometric noise assumption on the distribution that ensures the approximation condition. This geometric assumption is not a smoothness condition; rather, it describes the concentration of the marginal distribution near the decision boundary. In particular, the authors describe nontrivial classes of distributions for which SVMs using a Gaussian kernel can learn with an almost linear rate.
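The abstract concerns SVMs with hinge loss and Gaussian RBF kernels of varying width. As a minimal illustrative sketch (not the paper's method or analysis), the snippet below fits soft-margin SVMs with Gaussian kernels of several widths on a toy problem whose labels are determined by a circular decision boundary; scikit-learn's `gamma` parameter plays the role of the (inverse squared) kernel width, and the dataset, parameter values, and thresholds are all illustrative assumptions.

```python
# Hedged sketch: hinge-loss SVM with a Gaussian RBF kernel, varying the
# kernel width. Not the paper's construction; just an illustration of the
# objects the abstract discusses.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 400
X = rng.uniform(-1, 1, size=(n, 2))
# Noise-free labels given by a circular decision boundary; the marginal
# distribution's mass near this boundary is what the paper's geometric
# noise assumption controls.
y = np.where(X[:, 0] ** 2 + X[:, 1] ** 2 < 0.5, 1, -1)

# SVC minimizes the hinge loss plus an RKHS-norm penalty; larger gamma
# means a narrower Gaussian kernel.
for gamma in (0.1, 1.0, 10.0):
    clf = SVC(kernel="rbf", C=1.0, gamma=gamma).fit(X, y)
    print(f"gamma={gamma:>4}: training accuracy {clf.score(X, y):.3f}")
```

In practice the kernel width would be chosen by cross-validation; the paper's point is that, under its noise and approximation conditions, suitable width sequences yield fast rates to the Bayes risk.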

Research Organization:
Los Alamos National Laboratory
Sponsoring Organization:
DOE
OSTI ID:
977540
Report Number(s):
LA-UR-04-2391
Country of Publication:
United States
Language:
English

Similar Records

Using support vector machines for anomalous change detection
Conference · 2010 · OSTI ID:1022064

Geometric Analysis Based Double Closed-loop Iterative Learning Control of Output PDF Shaping of Fiber Length Distribution in Refining Process
Journal Article · 2018 · IEEE Transactions on Industrial Electronics · OSTI ID:1512075

Geometric Analysis Based Double Closed-loop Iterative Learning Control of Output PDF Shaping of Fiber Length Distribution in Refining Process
Journal Article · 2018 · IEEE Transactions on Industrial Electronics · OSTI ID:1482213