Results 1–10 of 779
A Comparison of Methods for Multiclass Support Vector Machines
IEEE Trans. Neural Networks, 2002. Cited by 952 (22 self).
"Support vector machines (SVMs) were originally designed for binary classification. How to effectively extend it for multiclass classification is still an ongoing research issue. Several methods have been proposed where typically we construct a multiclass classifier by combining several binary classifiers…"
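The "combine several binary classifiers" strategy this entry surveys can be sketched minimally. The sketch below uses the one-vs-rest scheme: one binary scorer per class, final label from the largest score. A tiny perceptron stands in for the binary SVM purely for self-containment; the paper itself compares real SVM formulations, and all names here are illustrative.

```python
# One-vs-rest multiclass combination: train one binary classifier per
# class ("this class" = +1, "the rest" = -1), then predict the class
# whose classifier reports the largest decision score.
# The perceptron below is a stand-in for a binary SVM (an assumption
# made so the sketch runs without external libraries).

def train_perceptron(X, y, epochs=20):
    """Binary linear classifier; y entries in {+1, -1}. Returns (w, b)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for x, t in zip(X, y):
            # Update on every point not classified with positive margin.
            if t * (sum(wi * xi for wi, xi in zip(w, x)) + b) <= 0:
                w = [wi + t * xi for wi, xi in zip(w, x)]
                b += t
    return w, b

def train_one_vs_rest(X, y):
    """One binary model per class label."""
    models = {}
    for c in sorted(set(y)):
        labels = [1 if t == c else -1 for t in y]
        models[c] = train_perceptron(X, labels)
    return models

def predict(models, x):
    """Pick the class with the highest decision score."""
    def score(c):
        w, b = models[c]
        return sum(wi * xi for wi, xi in zip(w, x)) + b
    return max(models, key=score)
```

The other family of methods the paper compares, one-vs-one, instead trains a classifier per *pair* of classes and combines them by voting or coupling.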
Using the Nyström Method to Speed Up Kernel Machines
Advances in Neural Information Processing Systems 13, 2001. Cited by 434 (6 self).
"A major problem for kernel-based predictors (such as Support Vector Machines and Gaussian processes) is that the amount of computation required to find the solution scales as O(n³), where n is the number of training examples. We show that an approximation to the eigendecomposition of the Gram matrix…"
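The Nyström idea this entry describes can be sketched as follows: instead of working with the full n×n Gram matrix K, pick m ≪ n landmark points and approximate K ≈ C·W⁻¹·Cᵀ, where W is the m×m Gram matrix of the landmarks and C is the n×m cross-kernel block, cutting the dominant cost from O(n³) toward O(nm²). The code below is a minimal pure-Python illustration, not the paper's implementation; a linear kernel is chosen (an assumption) because with landmarks spanning the feature space it makes the approximation exact and therefore checkable.

```python
# Nyström approximation of a Gram matrix from m landmark points:
#   K_hat[i][j] = C[i] @ inv(W) @ C[j]
# where W = kernel(landmarks, landmarks) and C = kernel(X, landmarks).

def kernel(x, y):
    # Linear kernel for checkability; an RBF kernel is the common
    # choice in practice.
    return sum(a * b for a, b in zip(x, y))

def invert(M):
    """Gauss-Jordan inverse with partial pivoting (small matrices only)."""
    n = len(M)
    A = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(M)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        p = A[col][col]
        A[col] = [v / p for v in A[col]]
        for r in range(n):
            if r != col:
                f = A[r][col]
                A[r] = [v - f * w for v, w in zip(A[r], A[col])]
    return [row[n:] for row in A]

def nystrom(X, landmarks):
    """Approximate the full Gram matrix of X via the landmark points."""
    m = len(landmarks)
    C = [[kernel(x, z) for z in landmarks] for x in X]
    W_inv = invert([[kernel(z1, z2) for z2 in landmarks] for z1 in landmarks])
    def entry(i, j):
        return sum(C[i][a] * W_inv[a][b] * C[j][b]
                   for a in range(m) for b in range(m))
    n = len(X)
    return [[entry(i, j) for j in range(n)] for i in range(n)]
```

With a general kernel the approximation is only exact when K has rank at most m; the paper's contribution is analyzing how good the low-rank approximation is for kernel machines in practice.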
SVM-KNN: Discriminative Nearest Neighbor Classification for Visual Category Recognition
In CVPR, 2006. Cited by 342 (10 self).
"We consider visual category recognition in the framework of measuring similarities, or equivalently perceptual distances, to prototype examples of categories. This approach is quite flexible, and permits recognition based on color, texture, and particularly shape, in a homogeneous framework. While nearest neighbor classifiers are natural in this setting, they suffer from the problem of high variance (in the bias-variance decomposition) in the case of limited sampling. Alternatively, one could use support vector machines, but they involve time-consuming optimization and computation of pairwise distances…"
Classification by Pairwise Coupling
1998. Cited by 378 (0 self).
"We discuss a strategy for polychotomous classification that involves estimating class probabilities for each pair of classes, and then coupling the estimates together. The coupling model is similar to the Bradley-Terry method for paired comparisons. We study the nature of the class probability estimates that arise, and examine the performance of the procedure in real and simulated datasets. Classifiers used include linear discriminants, nearest neighbors, and the support vector machine."
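The coupling step this entry describes can be sketched concretely: given pairwise estimates r[i][j] ≈ P(class i | class i or j), recover a single probability vector p whose induced pairwise ratios p_i/(p_i + p_j) match the r's. The iterative rescaling below follows the Bradley-Terry-style fixed-point scheme associated with this paper, but simplified (unit weights per pair, fixed iteration count), so treat it as an illustrative sketch rather than the authors' exact algorithm.

```python
# Pairwise coupling: turn pairwise class-probability estimates
# r[i][j] ~ P(i | i or j) into one class-probability vector p.
# Fixed-point update: scale p[i] by (sum of observed r[i][j]) over
# (sum of model probabilities p[i]/(p[i]+p[j])), then renormalize.

def couple(r, iters=200):
    k = len(r)
    p = [1.0 / k] * k  # start from the uniform distribution
    for _ in range(iters):
        for i in range(k):
            num = sum(r[i][j] for j in range(k) if j != i)
            den = sum(p[i] / (p[i] + p[j]) for j in range(k) if j != i)
            p[i] *= num / den
        s = sum(p)
        p = [v / s for v in p]
    return p
```

When the r's are consistent (they really do come from some p), that p is a fixed point of the update; with noisy pairwise classifiers the procedure finds a best-fitting compromise.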
Adaptive Nearest Neighbor Classification Using Support Vector Machines
2001. Cited by 44 (1 self).
"The nearest neighbor technique is a simple and appealing method to address classification problems. It relies on the assumption of locally constant class conditional probabilities. This assumption becomes invalid in high dimensions with a finite number of examples due to the curse of dimensionality. We propose a technique that computes a locally flexible metric by means of Support Vector Machines (SVMs). The maximum margin boundary found by the SVM is used to determine the most discriminant direction over the query's neighborhood. Such direction provides a local weighting scheme…"
Classification of Hyperspectral Remote Sensing Images with Support Vector Machines
IEEE Trans. Geosci. Remote Sens., 2004. Cited by 188 (5 self).
"This paper addresses the problem of the classification of hyperspectral remote sensing images by support vector machines (SVMs). First, we propose a theoretical discussion and experimental analysis aimed at understanding and assessing the potentialities of SVM classifiers in hyperdimensional…"
Lagrangian Support Vector Machines
2000. Cited by 110 (11 self).
"An implicit Lagrangian for the dual of a simple reformulation of the standard quadratic program of a linear support vector machine is proposed. This leads to the minimization of an unconstrained differentiable convex function in a space of dimensionality equal to the number of classified points…"
Learning Methods for Generic Object Recognition with Invariance to Pose and Lighting
In Proceedings of CVPR'04, 2004. Cited by 253 (18 self).
"We assess the applicability of several popular learning methods for the problem of recognizing generic visual categories with invariance to pose, lighting, and surrounding clutter. A large dataset comprising stereo image pairs of 50 uniform-colored toys under 36 angles, 9 azimuths, and 6 lighting conditions … of the objects with various amounts of variability and surrounding clutter were used for training and testing. Nearest Neighbor methods, Support Vector Machines, and Convolutional Networks, operating on raw pixels or on PCA-derived features, were tested. Test error rates for unseen object instances placed…"
Improvement of Nearest-Neighbor Classifier via Support Vector Machines
In Proceedings of the Fourteenth FLAIRS Conference, 2001. Cited by 1 (0 self).
"Theoretically well-founded, Support Vector Machines (SVMs) are well known to be suited for efficiently solving classification problems. Although improved generalization is the main goal of this new type of learning machine, recent works have tried to use them differently. For instance…"
Nearest Neighbor Classification Using Bottom-k Sketches
"Bottom-k sketches are an alternative to k×min-wise sketches when using hashing to estimate the similarity of documents represented by shingles (or set similarity in general) in large-scale machine learning. They are faster to compute and have nicer theoretical properties. In the case of k×min-wise … indicate that a nearest neighbors classifier with bottom-k sketches can be preferable to using a linear SVM and b-bit k×min-wise hashing if the amount of training data is low or the number of features is high."
Keywords: large-scale machine learning; hashing; nearest neighbor classification; set similarity
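The bottom-k construction this entry builds on can be sketched briefly: hash every shingle once and keep the k smallest hash values as the set's sketch; the Jaccard similarity of two sets is then estimated from how many of the k smallest values of the combined sketches belong to both. The code below is a minimal illustration under stated assumptions (SHA-1 as a stand-in for a random hash function; sets large enough that each sketch has k distinct values), not the paper's implementation.

```python
# Bottom-k sketching for Jaccard similarity estimation:
# sketch(S) = the k smallest hash values of S's elements;
# J(A, B) is estimated as the fraction of the k smallest values of
# sketch(A) ∪ sketch(B) that occur in both sketches.

import hashlib

def h(x):
    # Deterministic stand-in for one shared random hash function
    # (an assumption; any good hash over the shingle universe works).
    return int(hashlib.sha1(x.encode()).hexdigest(), 16)

def bottom_k(items, k):
    """Sketch of a set: its k smallest distinct hash values, sorted."""
    return sorted(set(h(x) for x in items))[:k]

def jaccard_estimate(sk_a, sk_b, k):
    """Estimate J(A, B) from the two bottom-k sketches alone."""
    union_bottom = sorted(set(sk_a) | set(sk_b))[:k]
    both = set(sk_a) & set(sk_b)
    return sum(1 for v in union_bottom if v in both) / k
```

Unlike k×min-wise sketching, which evaluates k independent hash functions per element, a bottom-k sketch needs only one hash evaluation per element, which is the speed advantage the abstract refers to.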