Results 1-10 of 389
Using the Nyström Method to Speed Up Kernel Machines
 Advances in Neural Information Processing Systems 13, 2001
"... A major problem for kernel-based predictors (such as Support Vector Machines and Gaussian processes) is that the amount of computation required to find the solution scales as O(n³), where n is the number of training examples. We show that an approximation to the eigendecomposition of the Gram matrix ..."
Cited by 434 (6 self)
Online learning for matrix factorization and sparse coding
, 2010
"... Sparse coding—that is, modelling data vectors as sparse linear combinations of basis elements—is widely used in machine learning, neuroscience, signal processing, and statistics. This paper focuses on the largescale matrix factorization problem that consists of learning the basis set in order to ad ..."
Abstract

Cited by 330 (31 self)
 Add to MetaCart
Sparse coding—that is, modelling data vectors as sparse linear combinations of basis elements—is widely used in machine learning, neuroscience, signal processing, and statistics. This paper focuses on the largescale matrix factorization problem that consists of learning the basis set in order
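As a rough illustration of the online flavour of such methods (a toy numpy sketch only, not the paper's algorithm: it streams one sample at a time, uses a ridge-regularized code in place of the paper's lasso step, and every name and parameter here is an assumption):

```python
import numpy as np

def online_mf(X, k, lr=0.05, lam=0.1, epochs=5, seed=0):
    # Toy online matrix factorization: visit samples one at a time,
    # code each against the current dictionary D, then nudge D.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    D = rng.normal(size=(d, k))
    D /= np.linalg.norm(D, axis=0)
    for _ in range(epochs):
        for i in rng.permutation(n):
            x = X[i]
            # Ridge-regularized code (the paper's method solves a lasso here).
            a = np.linalg.solve(D.T @ D + lam * np.eye(k), D.T @ x)
            # SGD step on the reconstruction error, then renormalize atoms
            # so no dictionary column can blow up.
            D += lr * np.outer(x - D @ a, a)
            D /= np.maximum(np.linalg.norm(D, axis=0), 1e-12)
    return D

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 5)) @ rng.normal(size=(5, 8))  # rank-5 data, d=8
D = online_mf(X, k=5)
```

Each update touches a single sample, which is what lets this style of method scale to datasets too large to factorize in one batch.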
Improving the Accuracy and Speed of Support Vector Machines
 Advances in Neural Information Processing Systems 9, 1997
"... Support Vector Learning Machines (SVM) are finding application in pattern recognition, regression estimation, and operator inversion for ill-posed problems. Against this very general backdrop, any methods for improving the generalization performance, or for improving the speed in test phase, of SVMs ..."
"... in the error rate on 10,000 NIST test digit images of 1.4% to 1.0%. The method for improving the speed (the "reduced set" method) does so by approximating the support vector decision surface. We apply this method to achieve a factor of fifty speedup in test phase over the virtual support vector ..."
Cited by 192 (23 self)
Kronecker factorization for speeding up kernel machines
 In SIAM International Conference on Data Mining (SDM), 2005
"... In kernel machines, such as kernel principal component analysis (KPCA), Gaussian Processes (GPs), and Support Vector Machines (SVMs), the computational complexity of finding a solution is O(n³), where n is the number of training instances. To reduce this expensive computational complexity, we propose using Kronecker factorization, which approximates a positive definite kernel matrix by the Kronecker product of two smaller positive definite matrices. This approximation can speed up the calculation of the kernel-matrix inverse or eigendecomposition involved in kernel machines. When the two ..."
Cited by 7 (2 self)
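The algebra that makes such a factorization useful is that inverses and eigendecompositions distribute over Kronecker products: (A ⊗ B)⁻¹ = A⁻¹ ⊗ B⁻¹, and the eigenvalues of A ⊗ B are the pairwise products of the eigenvalues of A and B. A small numpy check (illustrative, with made-up positive definite matrices):

```python
import numpy as np

rng = np.random.default_rng(0)
# Two small symmetric positive definite factors A (4x4) and B (5x5).
A = rng.normal(size=(4, 4)); A = A @ A.T + 4 * np.eye(4)
B = rng.normal(size=(5, 5)); B = B @ B.T + 5 * np.eye(5)

K = np.kron(A, B)  # the 20x20 "kernel" matrix being approximated

# Inverting the two small factors is enough to invert K.
K_inv = np.kron(np.linalg.inv(A), np.linalg.inv(B))
err = np.linalg.norm(K @ K_inv - np.eye(20))

# Eigenvalues of K are all pairwise products of the factors' eigenvalues.
evals = np.sort(np.outer(np.linalg.eigvalsh(A), np.linalg.eigvalsh(B)).ravel())
```

So the O(n³) inverse or eigendecomposition of the full kernel matrix is replaced by the same operations on two much smaller matrices.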
Incremental Support Vector Learning: Analysis, Implementation and Applications
 Journal of Machine Learning Research, 2006
"... Incremental Support Vector Machines (SVM) are instrumental in practical applications of online learning. This work focuses on the design and analysis of efficient incremental SVM learning, with the aim of providing a fast, numerically stable and robust implementation. A detailed analysis of converge ..."
Abstract

Cited by 43 (5 self)
 Add to MetaCart
Incremental Support Vector Machines (SVM) are instrumental in practical applications of online learning. This work focuses on the design and analysis of efficient incremental SVM learning, with the aim of providing a fast, numerically stable and robust implementation. A detailed analysis
On approximate solutions to support vector machines
 In SIAM International Conference on Data Mining (SDM), 2006
"... We propose to speed up the training process of support vector machines (SVM) by resorting to an approximate SVM, where a small number of representatives are extracted from the original training data set and used for training. Theoretical studies show that, in order for the approximate SVM to be simi ..."
Cited by 4 (2 self)
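The representative-based strategy can be mimicked in a few lines (an illustrative sketch, not the paper's algorithm: k-means centroids stand in for the extracted representatives and a Pegasos-style hinge-loss SGD stands in for a full SVM solver; all names, data, and parameters are assumptions):

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    # Plain Lloyd's algorithm; the centroids serve as the representatives.
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), k, replace=False)].copy()
    for _ in range(iters):
        lab = ((X[:, None] - C[None]) ** 2).sum(-1).argmin(1)
        for j in range(k):
            if (lab == j).any():
                C[j] = X[lab == j].mean(0)
    return C

def train_hinge(X, y, lam=0.01, epochs=50, seed=0):
    # Pegasos-style SGD on the hinge loss: a lightweight stand-in
    # for training a linear SVM on the representatives.
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1]); t = 0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            t += 1; eta = 1.0 / (lam * t)
            margin = y[i] * (X[i] @ w)
            w = (1 - eta * lam) * w + (eta * y[i] * X[i] if margin < 1 else 0)
    return w

rng = np.random.default_rng(1)
Xp = rng.normal(loc=+2, size=(300, 2)); Xn = rng.normal(loc=-2, size=(300, 2))
# Train on 10 representatives instead of all 600 points.
reps = np.vstack([kmeans(Xp, 5), kmeans(Xn, 5)])
ylab = np.array([1] * 5 + [-1] * 5)
w = train_hinge(reps, ylab)
X = np.vstack([Xp, Xn]); y = np.array([1] * 300 + [-1] * 300)
acc = (np.sign(X @ w) == y).mean()
```

The optimization problem shrinks from the full training set to the handful of representatives, which is the source of the speedup the abstract describes; the theoretical question is how close the resulting classifier stays to the exact SVM.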
A PAC Bound for Approximate Support Vector Machines
, 2007
"... We study a class of algorithms that speed up the training process of support vector machines (SVMs) by returning an approximate SVM. We focus on algorithms that reduce the size of the optimization problem by extracting from the original training dataset a small number of representatives and using th ..."
Abstract

Cited by 1 (1 self)
 Add to MetaCart
We study a class of algorithms that speed up the training process of support vector machines (SVMs) by returning an approximate SVM. We focus on algorithms that reduce the size of the optimization problem by extracting from the original training dataset a small number of representatives and using
Speeding up the solution of multi-label problems with Support Vector Machines
, 2004
"... The classical SVM approach to solve multilabel problems consists in training a single classifier for each class. We propose a compact model that considers the whole set of classifiers at once. Our strategy focuses on the shared use of the kernel matrix information between different classifiers in or ..."
Abstract

Cited by 2 (0 self)
 Add to MetaCart
The classical SVM approach to solve multilabel problems consists in training a single classifier for each class. We propose a compact model that considers the whole set of classifiers at once. Our strategy focuses on the shared use of the kernel matrix information between different classifiers