## Online Learning with Kernels (2003)

### Download Links

- [mlg.anu.edu.au]
- [omega.albany.edu:8008]
- [axiom.anu.edu.au]
- [users.cecs.anu.edu.au]
- [www-2.cs.cmu.edu]
- [books.nips.cc]
- DBLP

### Citations

2027 total (128 self-citations)

### BibTeX

@MISC{Kivinen03onlinelearning,
  author = {Jyrki Kivinen and Alexander J. Smola and Robert C. Williamson},
  title  = {Online Learning with Kernels},
  year   = {2003}
}

### Abstract

Kernel-based algorithms such as support vector machines have achieved considerable success on various problems in the batch setting, where all of the training data is available in advance. Support vector machines combine the so-called kernel trick with the large-margin idea. There has been little use of these methods in an online setting suitable for real-time applications. In this paper we consider online learning in a Reproducing Kernel Hilbert Space. By applying classical stochastic gradient descent within a feature space and using some straightforward tricks, we develop simple and computationally efficient algorithms for a wide range of problems such as classification, regression, and novelty detection. In addition to allowing the exploitation of the kernel trick in an online setting, we examine the value of large margins for classification in the online setting with a drifting target. We derive worst-case loss bounds and show the convergence of the hypothesis to the minimiser of the regularised risk functional. We present experimental results that support the theory and illustrate the power of the new algorithms for online novelty detection.
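The stochastic gradient descent in feature space described in the abstract can be sketched as a simple kernel-expansion update: the hypothesis is kept as a weighted sum of kernel functions centred on past examples, each step shrinks all coefficients (the regularisation term) and adds a new coefficient when the loss is active. The following is a minimal illustration only; the Gaussian kernel, the hinge loss, and the hyperparameter names (`eta`, `lam`, `gamma`) are assumptions for this sketch, not the paper's exact formulation:

```python
import math


def gaussian_kernel(x, y, gamma=1.0):
    """RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))


class OnlineKernelClassifier:
    """Online SGD on the regularised hinge loss in the RKHS induced by
    `kernel`.  A sketch of the idea, not the paper's reference code."""

    def __init__(self, kernel=gaussian_kernel, eta=0.1, lam=0.01):
        self.kernel = kernel
        self.eta = eta        # learning rate (illustrative value)
        self.lam = lam        # regularisation parameter (illustrative value)
        self.support = []     # stored examples x_i
        self.alpha = []       # their expansion coefficients

    def predict(self, x):
        """f(x) = sum_i alpha_i * k(x_i, x)."""
        return sum(a * self.kernel(xi, x)
                   for a, xi in zip(self.alpha, self.support))

    def partial_fit(self, x, y):
        """One stochastic gradient step for example (x, y), y in {-1, +1}."""
        margin = y * self.predict(x)
        # Gradient of the regulariser shrinks every coefficient toward zero.
        self.alpha = [(1 - self.eta * self.lam) * a for a in self.alpha]
        if margin < 1:
            # Hinge loss is active: add the new point to the expansion.
            self.support.append(x)
            self.alpha.append(self.eta * y)
```

Note the practical issue this sketch leaves open: the expansion grows with every mistake, so a real-time implementation would also truncate old terms (their coefficients decay geometrically under the shrinkage step, which is what makes truncation safe).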