Results 1-10 of 817,238
The Nature of Statistical Learning Theory, 1999. Cited by 12976 (32 self).
"... Statistical learning theory was introduced in the late 1960’s. Until the 1990’s it was a purely theoretical analysis of the problem of function estimation from a given collection of data. In the middle of the 1990’s new types of learning algorithms (called support vector machines) based on the deve ..."
Benchmarking Least Squares Support Vector Machine Classifiers. Neural Processing Letters, 2001. Cited by 446 (46 self).
"... In Support Vector Machines (SVMs), the solution of the classification problem is characterized by a (convex) quadratic programming (QP) problem. In a modified version of SVMs, called Least Squares SVM classifiers (LS-SVMs), a least squares cost function is proposed so as to obtain a linear set of eq ..."
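As an illustrative aside (not code from the paper itself): in the standard LS-SVM formulation the "linear set of equations" the snippet refers to is a single (n+1)×(n+1) system in the bias b and dual variables alpha. A minimal NumPy sketch, assuming an RBF kernel and arbitrarily chosen hyperparameters `gamma` and `sigma` (the function names here are hypothetical):

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian RBF kernel matrix between two sets of points
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVM dual as one linear system:
    [[0, y^T], [y, Omega + I/gamma]] [b; alpha] = [0; 1],
    where Omega_ij = y_i y_j K(x_i, x_j)."""
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]          # alpha, b

def lssvm_predict(X_train, y_train, alpha, b, X_new, sigma=1.0):
    # Decision function: sign(sum_i alpha_i y_i K(x, x_i) + b)
    K = rbf_kernel(X_new, X_train, sigma)
    return np.sign(K @ (alpha * y_train) + b)
```

Unlike the QP of a classical SVM, this costs one dense linear solve, which is the practical point of the LS-SVM reformulation.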
Proximal support vector machine classifiers. In Proceedings KDD-2001: Knowledge Discovery and Data Mining, 2001. Cited by 152 (16 self).
"... A new approach to support vector machine (SVM) classification is proposed wherein each of two data sets is proximal to one of two distinct planes that are not parallel to each other. Each plane is generated such that it is closest to one of the two data sets and as far as possible from the ..."
KnowledgeBased Support Vector Machine Classifiers
 In Advances in Neural Information Processing Systems 14
, 2002
"... Prior knowledge in the form of multiple polyhedral sets, each belonging to one of two categories, is introduced into a reformulation of a linear support vector machine classifier. The resulting formulation leads to a linear program that can be solved efficiently. Real world examples, from DNA sequen ..."
Abstract

Cited by 48 (12 self)
 Add to MetaCart
Prior knowledge in the form of multiple polyhedral sets, each belonging to one of two categories, is introduced into a reformulation of a linear support vector machine classifier. The resulting formulation leads to a linear program that can be solved efficiently. Real world examples, from DNA
Data Selection for Support Vector Machine Classifiers
 In Proceedings of the Sixth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
, 2000
"... The problem of extracting a minimal number of data points from a large dataset, in order to generate a support vector machine (SVM) classifier, is formulated as a concave min imization problem and solved by a finite number of linear programs. This minimal set of data points, which is the smallest n ..."
Abstract

Cited by 32 (4 self)
 Add to MetaCart
The problem of extracting a minimal number of data points from a large dataset, in order to generate a support vector machine (SVM) classifier, is formulated as a concave min imization problem and solved by a finite number of linear programs. This minimal set of data points, which is the smallest
Moderating the Outputs of Support Vector Machine Classifiers
 IEEE Transactions on Neural Networks
, 1999
"...  In this paper, we extend the use of moderated outputs to the support vector machine (SVM) by making use of a relationship between SVM and the evidence framework. The moderated output is more in line with the Bayesian idea that the posterior weight distribution should be taken into account upon pre ..."
Abstract

Cited by 55 (3 self)
 Add to MetaCart
 In this paper, we extend the use of moderated outputs to the support vector machine (SVM) by making use of a relationship between SVM and the evidence framework. The moderated output is more in line with the Bayesian idea that the posterior weight distribution should be taken into account upon
Multicategory proximal support vector machine classifiers
 Machine Learning
, 2001
"... Abstract. Given a dataset, each element of which labeled by one of k labels, we construct by a very fast algorithm, a kcategory proximal support vector machine (PSVM) classifier. Proximal support vector machines and related approaches (Fung & Mangasarian, 2001; Suykens & Vandewalle, 1999) c ..."
Abstract

Cited by 29 (0 self)
 Add to MetaCart
Abstract. Given a dataset, each element of which labeled by one of k labels, we construct by a very fast algorithm, a kcategory proximal support vector machine (PSVM) classifier. Proximal support vector machines and related approaches (Fung & Mangasarian, 2001; Suykens & Vandewalle, 1999
Improving Support Vector Machine Classifiers by Modifying Kernel Functions
 NEURAL NETWORKS
, 1999
"... We propose a method of modifying a kernel function to improve the performance of a support vector machine classifier. This is based on the Riemannian geometrical structure induced by the kernel function. The idea is to enlarge the spatial resolution around the separating boundary surface by a con ..."
Abstract

Cited by 88 (3 self)
 Add to MetaCart
We propose a method of modifying a kernel function to improve the performance of a support vector machine classifier. This is based on the Riemannian geometrical structure induced by the kernel function. The idea is to enlarge the spatial resolution around the separating boundary surface by a
Support Vector Machine Classifiers as Applied to AVIRIS Data
, 1999
"... INTRODUCTION The Support Vector Machine #SVM# is a relatively recent approachintroduced by Boser, Guyon, and Vapnik #Boser et al., 1992#, #Vapnik, 1995# for solving supervised classi#cation and regression problems, or more colloquially learning from examples. In the following we will discuss only c ..."
Abstract

Cited by 28 (0 self)
 Add to MetaCart
INTRODUCTION The Support Vector Machine #SVM# is a relatively recent approachintroduced by Boser, Guyon, and Vapnik #Boser et al., 1992#, #Vapnik, 1995# for solving supervised classi#cation and regression problems, or more colloquially learning from examples. In the following we will discuss only
Handling Missing Values in Support Vector Machine Classifiers
"... Abstract — This paper discusses the task of learning a classifier from observed data containing missing values amongst the inputs which are missing completely at random 1. A nonparametric perspective is adopted by defining a modified risk taking into account the uncertainty of the predicted outputs ..."
Abstract

Cited by 8 (1 self)
 Add to MetaCart
to the multivariate case of fitting additive models using componentwise kernel machines, and an efficient implementation is based on the Least Squares Support Vector Machine (LSSVM) classifier formulation.
Results 1  10
of
817,238