Results 1 – 10 of 1,555,375
The Nature of Statistical Learning Theory, 1999
"... Statistical learning theory was introduced in the late 1960's. Until the 1990's it was a purely theoretical analysis of the problem of function estimation from a given collection of data. In the middle of the 1990's new types of learning algorithms (called support vector machines) based on the deve ..."
Cited by 12976 (32 self)

Benchmarking Least Squares Support Vector Machine Classifiers, Neural Processing Letters, 2001
"... In Support Vector Machines (SVMs), the solution of the classification problem is characterized by a (convex) quadratic programming (QP) problem. In a modified version of SVMs, called Least Squares SVM classifiers (LS-SVMs), a least squares cost function is proposed so as to obtain a linear set of eq ..."
Cited by 446 (46 self)

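The snippet above describes the key computational difference: where a standard SVM solves a QP, an LS-SVM classifier follows from a single linear KKT system. A minimal numpy sketch of that idea (not code from the paper; the RBF kernel, function names, and parameter values are illustrative):

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise squared distances, then the Gaussian kernel exp(-gamma * ||a - b||^2)
    d2 = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * d2)

def lssvm_train(X, y, C=10.0, gamma=1.0):
    """Train an LS-SVM classifier (labels in {-1, +1}).

    Instead of a QP, solve the (n+1) x (n+1) linear KKT system
        [0   y^T        ] [b    ]   [0]
        [y   Omega + I/C] [alpha] = [1]
    where Omega_ij = y_i y_j K(x_i, x_j).
    """
    n = len(y)
    Omega = (y[:, None] * y[None, :]) * rbf_kernel(X, X, gamma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / C
    sol = np.linalg.solve(A, np.r_[0.0, np.ones(n)])
    return sol[1:], sol[0]          # alpha, b

def lssvm_predict(X_train, y_train, alpha, b, X_new, gamma=1.0):
    # sign( sum_i alpha_i y_i K(x, x_i) + b )
    return np.sign(rbf_kernel(X_new, X_train, gamma) @ (alpha * y_train) + b)
```

With an RBF kernel this fits even a non-linearly-separable toy problem such as XOR. Note that every training point ends up with a nonzero alpha, which is exactly the loss of sparseness that the "Sparse Least Squares Support Vector Machine Classifiers" entry in this list addresses.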
Automatic Relevance Determination for Least Squares Support Vector Machine Classifiers
"... Automatic Relevance Determination (ARD) has been applied to multilayer perceptrons by inferring different regularization parameters for the input interconnection layer within the evidence framework. In this paper, ..."
Cited by 11 (4 self)

Bayesian Framework for Least Squares Support Vector Machine Classifiers, Gaussian Processes and Kernel Fisher Discriminant Analysis, Neural Computation, 2002
"... The Bayesian evidence framework has been successfully applied to the design of multilayer perceptrons (MLPs) in the work of MacKay. Nevertheless, the training of MLPs suffers from drawbacks like the nonconvex optimization problem and the choice of the number of hidden units. In Support Vector Machines (SVMs) for classification, as introduced by Vapnik, a nonlinear decision boundary is obtained by mapping the input vector first in a nonlinear way to a high-dimensional kernel-induced feature space in which a linear large margin classifier is constructed. Practical expressions are formulated ..."
Cited by 30 (9 self)

Knowledge Discovery using Least Squares Support Vector Machine Classifiers: a Direct Marketing Case
"... The case involves the detection and qualification of the most relevant predictors for repeat-purchase modelling in a direct marketing setting. Analysis is based on a wrapped form of feature selection using a sensitivity based pruning heuristic to guide a greedy, stepwise and backward traversal of the input space. For this purpose, we make use of a powerful and promising least squares version (LS-SVM) for support vector machine classification. The setup is based upon the standard R(ecency) F(requency) M(onetary) modelling semantics. Results indicate that elimination of redundant ..."
Cited by 1 (1 self)

Sparse Least Squares Support Vector Machine Classifiers
J.A.K. Suykens, L. Lukas and J. Vandewalle, Neural Processing Letters, 2000
"... In least squares support vector machine (LS-SVM) classifiers the original SVM formulation of Vapnik is modified by considering equality constraints within a form of ridge regression instead of inequality constraints. As a result the solution follows from solving a set of linear equations instead of ..."

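The pruning idea behind sparse LS-SVMs can be sketched quickly: solve the LS-SVM linear system, drop the training point with the smallest support value |alpha_i|, and re-solve on the reduced set until the desired sparseness is reached. A hypothetical numpy illustration (not the authors' code; the kernel choice, C, and the 50% retention rate are arbitrary):

```python
import numpy as np

def rbf(A, B, g=0.5):
    # Gaussian kernel exp(-g * ||a - b||^2) between two point sets
    d2 = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-g * d2)

def solve_lssvm(X, y, C=10.0, g=0.5):
    # LS-SVM dual: one linear system in (b, alpha) instead of a QP
    n = len(y)
    Omega = (y[:, None] * y[None, :]) * rbf(X, X, g)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / C
    sol = np.linalg.solve(A, np.r_[0.0, np.ones(n)])
    return sol[1:], sol[0]  # alpha, b

def prune_lssvm(X, y, keep=0.5, C=10.0, g=0.5):
    """Iteratively remove the point with the smallest |alpha| and retrain."""
    idx = np.arange(len(y))
    while len(idx) > int(keep * len(y)):
        alpha, _ = solve_lssvm(X[idx], y[idx], C, g)
        idx = np.delete(idx, np.argmin(np.abs(alpha)))
    alpha, b = solve_lssvm(X[idx], y[idx], C, g)
    return idx, alpha, b  # idx: indices of the surviving "support vectors"
```

Because a plain LS-SVM assigns every training point a nonzero alpha, this kind of pruning is what restores the sparseness that the inequality-constrained SVM gets for free.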
Bayesian Framework for Least-Squares Support Vector Machine Classifiers, Gaussian Processes, and Kernel Fisher Discriminant Analysis (Letter, communicated by John Platt)
"... johan.suykens@esat.kuleuven.ac.be ..."

Training Support Vector Machines: an Application to Face Detection, 1997
"... We investigate the application of Support Vector Machines (SVMs) in computer vision. SVM is a learning technique developed by V. Vapnik and his team (AT&T Bell Labs.) that can be seen as a new method for training polynomial, neural network, or Radial Basis Function classifiers. The decision sur ..."
Cited by 728 (1 self)

A Comparison of Methods for Multiclass Support Vector Machines, IEEE Trans. Neural Networks, 2002
"... Support vector machines (SVMs) were originally designed for binary classification. How to effectively extend it for multiclass classification is still an ongoing research issue. Several methods have been proposed where typically we construct a multiclass classifier by combining several binary class ..."
Cited by 935 (22 self)

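The "combine several binary classifiers" strategy mentioned in that abstract is easy to illustrate with one-vs-rest: train one binary scorer per class and predict the class whose scorer is most confident. In the sketch below, regularized least squares scorers stand in for binary SVMs (an assumption made for brevity; the combination scheme, not the base learner, is the point):

```python
import numpy as np

def train_one_vs_rest(X, y, n_classes, lam=1e-3):
    """One binary scorer per class: class k (target +1) vs. the rest (target -1)."""
    Xb = np.hstack([X, np.ones((len(X), 1))])   # append a bias column
    W = []
    for k in range(n_classes):
        t = np.where(y == k, 1.0, -1.0)
        # Ridge-regression scorer standing in for a true binary SVM
        w = np.linalg.solve(Xb.T @ Xb + lam * np.eye(Xb.shape[1]), Xb.T @ t)
        W.append(w)
    return np.array(W)                          # shape (n_classes, d + 1)

def predict_one_vs_rest(W, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return (Xb @ W.T).argmax(axis=1)            # most confident scorer wins
```

One-vs-one (train a classifier for every pair of classes and let them vote) is the other common combination scheme compared in work like this; it trains more, but smaller, binary problems.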
Knowledge-based Analysis of Microarray Gene Expression Data By Using Support Vector Machines, 2000
"... We introduce a method of functionally classifying genes by using gene expression data from DNA microarray hybridization experiments. The method is based on the theory of support vector machines (SVMs). SVMs are considered a supervised computer learning method because they exploit prior knowledge of ..."
Cited by 514 (8 self)