Results 1–5 of 5
New Support Vector Algorithms
, 2000
Abstract

Cited by 335 (43 self)
... this article with the regression case. To explain this, we will introduce a suitable definition of a margin that is maximized in both cases ...
Shrinking the Tube: A New Support Vector Regression Algorithm
, 1999
Abstract

Cited by 48 (5 self)
A new algorithm for Support Vector regression is described. For a priori chosen ν, it automatically adjusts a flexible tube of minimal radius to the data such that at most a fraction ν of the data points lie outside. Moreover, it is shown how to use parametric tube shapes with nonconstant radius. The algorithm is analysed theoretically and experimentally.
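The ν-property described in this abstract can be illustrated with scikit-learn's NuSVR, an independent implementation of ν-SV regression (not the authors' original code): ν upper-bounds the fraction of training points outside the tube and lower-bounds the fraction of support vectors. A minimal sketch on toy data (the data, kernel, and parameter values are illustrative assumptions):

```python
import numpy as np
from sklearn.svm import NuSVR

# Toy 1-D regression data (illustrative only)
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0.0, 4.0, 100)).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * rng.normal(size=100)

# nu upper-bounds the fraction of points outside the tube and
# lower-bounds the fraction of support vectors
model = NuSVR(nu=0.5, C=1.0, kernel="rbf")
model.fit(X, y)

# With nu=0.5, roughly half (or more) of the points end up as support vectors
frac_sv = len(model.support_) / len(X)
print(f"fraction of support vectors: {frac_sv:.2f}")
```

Unlike plain ε-SVR, no tube width has to be specified up front; it is found automatically from the chosen ν.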
Support Vector Regression with Automatic Accuracy Control
Abstract

Cited by 33 (3 self)
A new algorithm for Support Vector regression is proposed. For a priori chosen ν, it automatically adjusts a flexible tube of minimal radius to the data such that at most a fraction ν of the data points lie outside. The algorithm is analysed theoretically and experimentally.
Linear Programs for Automatic Accuracy Control in Regression
 In Ninth International Conference on Artificial Neural Networks, Conference Publications No. 470
, 1999
Abstract

Cited by 31 (4 self)
We have recently proposed a new approach to control the number of basis functions and the accuracy in Support Vector Machines. The latter is transferred to a linear programming setting, which inherently enforces sparseness of the solution. The algorithm computes a nonlinear estimate in terms of kernel functions and an ε ≥ 0 with the property that at most a fraction ν of the training set has an error exceeding ε. The algorithm is robust to local perturbations of these points' target values. We give an explicit formulation of the optimization equations needed to solve the linear program and point out which modifications of the standard optimization setting are necessary to take advantage of the particular structure of the equations in the regression case.
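A minimal sketch of a linear program of the kind this abstract describes, solved with SciPy's general-purpose linprog rather than the authors' solver. The toy data, RBF kernel, and the values of C and ν are illustrative assumptions; the objective trades an ℓ1 penalty on the expansion coefficients against C(νε + mean slack), so ε is optimized rather than fixed:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
n = 40
X = np.sort(rng.uniform(0.0, 4.0, n))
y = np.sin(X) + 0.1 * rng.normal(size=n)

# RBF kernel matrix (illustrative kernel choice)
K = np.exp(-((X[:, None] - X[None, :]) ** 2))

C, nu = 10.0, 0.3
# Variables: alpha (n), alpha* (n), xi (n), xi* (n), b+, b-, eps  (all >= 0)
c = np.concatenate([np.full(2 * n, 1.0 / n),      # l1 penalty on alpha, alpha*
                    np.full(2 * n, C / n),        # slack costs
                    [0.0, 0.0, C * nu]])          # b is free (b+ - b-); eps costs C*nu

A = np.zeros((2 * n, 4 * n + 3))
b_ub = np.zeros(2 * n)
for i in range(n):
    # y_i - f(x_i) <= eps + xi_i, where f(x) = K (alpha - alpha*) + b
    A[i, :n] = -K[i]; A[i, n:2 * n] = K[i]
    A[i, 2 * n + i] = -1.0                         # -xi_i
    A[i, 4 * n] = -1.0; A[i, 4 * n + 1] = 1.0      # -b
    A[i, 4 * n + 2] = -1.0                         # -eps
    b_ub[i] = -y[i]
    # f(x_i) - y_i <= eps + xi*_i
    j = n + i
    A[j, :n] = K[i]; A[j, n:2 * n] = -K[i]
    A[j, 3 * n + i] = -1.0                         # -xi*_i
    A[j, 4 * n] = 1.0; A[j, 4 * n + 1] = -1.0      # +b
    A[j, 4 * n + 2] = -1.0
    b_ub[j] = y[i]

res = linprog(c, A_ub=A, b_ub=b_ub, bounds=(0, None), method="highs")
z = res.x
coef = z[:n] - z[n:2 * n]
b = z[4 * n] - z[4 * n + 1]
eps = z[4 * n + 2]
f = K @ coef + b
# nu-property: at most a fraction nu of points lie strictly outside the tube
outside = np.mean(np.abs(y - f) > eps + 1e-6)
print(f"eps={eps:.3f}, fraction outside tube={outside:.2f}")
```

The ℓ1 objective drives most of the 2n coefficient variables to exact zero at the LP vertex, which is the sparseness the abstract refers to.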
Shrinking the Tube: A New Support Vector Regression Algorithm
Abstract
A new algorithm for Support Vector regression is described. For a priori chosen ν, it automatically adjusts a flexible tube of minimal radius to the data such that at most a fraction ν of the data points lie outside. Moreover, it is shown how to use parametric tube shapes with nonconstant radius. The algorithm is analysed theoretically and experimentally.