A Tutorial on Support Vector Regression
Alex J. Smola and Bernhard Schölkopf
September 30, 2003
Abstract

In this tutorial we give an overview of the basic ideas underlying Support Vector (SV) machines for function estimation. Furthermore, we include a summary of currently used algorithms for training SV machines, covering both the quadratic (or convex) programming part and advanced methods for dealing with large datasets. Finally, we mention some modifications and extensions that have been applied to the standard SV algorithm, and discuss the aspect of regularization from an SV perspective.

1 Introduction

The purpose of this paper is twofold. It should serve as a self-contained introduction to Support Vector regression for readers new to this rapidly developing field of research. On the other hand, it attempts to give an overview of recent develop-

As such, it is firmly grounded in the framework of statistical learning theory, or VC theory, which has been developed over the last three decades by Vapnik and Chervonenkis [1974], Vapnik [1982, 1995]. In a nutshell, VC theory characterizes properties of learning machines which enable them to generalize well to unseen data.

In its present form, the SV machine was largely developed at AT&T Bell Laboratories by Vapnik and co-workers [Boser et al., 1992, Guyon et al., 1993, Cortes and Vapnik, 1995, Schölkopf et al., 1995, Schölkopf et al., 1996, Vapnik et al., 1997]. Due to this industrial context, SV research has to date had a sound orientation towards real-world applications. Initial work focused on OCR (optical character recognition). Within a short period of time, SV classifiers became competitive with the best available systems for both OCR and object recognition tasks [Schölkopf et al., 1996, 1998a, Blanz et al., 1996, Schölkopf, 1997]. A comprehensive tutorial on SV classifiers has been published by Burges [1998]. But also in regression and time series prediction applic
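The kind of function estimation the abstract refers to can be illustrated with a short sketch. Note that the use of scikit-learn's `SVR` class and the particular kernel and parameter values below are assumptions for illustration only; the tutorial itself does not prescribe any implementation.

```python
# Minimal sketch of Support Vector regression on a noisy 1-D function.
# scikit-learn's SVR and all parameter values are illustrative assumptions,
# not part of the original tutorial.
import numpy as np
from sklearn.svm import SVR

rng = np.random.RandomState(0)
X = np.sort(rng.uniform(0, 5, size=(40, 1)), axis=0)
y = np.sin(X).ravel() + 0.1 * rng.randn(40)  # noisy sine samples

# epsilon sets the width of the insensitive tube around the fitted function;
# C trades off flatness of the estimate against deviations beyond the tube.
model = SVR(kernel="rbf", C=10.0, epsilon=0.1)
model.fit(X, y)
predictions = model.predict(X)
print(len(model.support_), "support vectors out of", len(X))
```

Only the samples lying on or outside the epsilon-tube become support vectors, which is what keeps the resulting model sparse.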