Results 1 - 10 of 384
Support Vector Machines for Classification and Regression
- University of Southampton, Technical Report, 1998
"... The problem of empirical data modelling is germane to many engineering applications.
In empirical data modelling a process of induction is used to build up a model of the
system, from which it is hoped to deduce responses of the system that have yet to be observed.
Ultimately the quantity and qualit ..."
Cited by 357 (5 self)
... for parameter selection and the statistical measures used to select the 'best' model. The foundations of Support Vector Machines (SVM) have been developed by Vapnik (1995) and are gaining popularity due to many attractive features and promising empirical performance. The formulation embodies the Structural ...
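Since this report is framed around parameter selection and choosing the 'best' model, a minimal sketch may help make that workflow concrete. The example below is not taken from the report; it assumes scikit-learn and a synthetic placeholder dataset, and selects the regularisation constant C and RBF kernel width by cross-validated grid search.

# Minimal sketch (not from the cited report): soft-margin SVM classification
# with cross-validated selection of the 'best' parameters.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=20, random_state=0)  # placeholder data
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

grid = GridSearchCV(
    SVC(kernel="rbf"),
    param_grid={"C": [0.1, 1, 10, 100], "gamma": [1e-3, 1e-2, 1e-1]},
    cv=5,  # 5-fold cross-validation as the statistical measure for model selection
)
grid.fit(X_train, y_train)
print(grid.best_params_, grid.score(X_test, y_test))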
Cutting-Plane Training of Structural SVMs
, 2007
"... Discriminative training approaches like structural SVMs have shown much promise for building highly complex and accurate models in areas like natural language processing, protein structure prediction, and information retrieval. However, current training algorithms are computationally expensive or i ..."
Cited by 321 (10 self)
... tagging, and CFG parsing. The experiments show that the cutting-plane algorithm is broadly applicable and fast in practice. On large datasets, it is typically several orders of magnitude faster than conventional training methods derived from decomposition methods like SVMlight, or conventional cutting ...
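As an illustration of the cutting-plane idea, here is a schematic sketch specialised to a plain binary SVM with hinge loss in the 1-slack formulation. It is not the authors' SVMstruct implementation: a general-purpose solver (SciPy's SLSQP) stands in for a dedicated QP solver, and the function name and parameters are placeholders.

# Schematic cutting-plane training loop for a binary hinge-loss SVM in the
# 1-slack formulation (illustrative sketch, not the authors' implementation).
# Assumes labels y are in {-1, +1}.
import numpy as np
from scipy.optimize import minimize

def cutting_plane_svm(X, y, C=1.0, eps=1e-3, max_iter=50):
    n, d = X.shape
    planes = []                        # working set of cutting planes (a, b)
    w, xi = np.zeros(d), 0.0
    for _ in range(max_iter):
        # Separation oracle: the most violated constraint under the hinge loss.
        viol = y * (X @ w) < 1.0
        a = (y[viol, None] * X[viol]).sum(axis=0) / n
        b = viol.mean()
        if b - w @ a <= xi + eps:      # no sufficiently violated constraint left
            break
        planes.append((a, b))
        # Re-solve the restricted QP over the current working set of planes.
        obj = lambda z: 0.5 * z[:d] @ z[:d] + C * z[d]
        cons = [{"type": "ineq", "fun": lambda z, a=a_k, b=b_k: z[:d] @ a - b + z[d]}
                for a_k, b_k in planes]
        cons.append({"type": "ineq", "fun": lambda z: z[d]})   # slack xi >= 0
        res = minimize(obj, np.r_[w, xi], constraints=cons, method="SLSQP")
        w, xi = res.x[:d], res.x[d]
    return w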
A support vector method for multivariate performance measures
- Proceedings of the 22nd International Conference on Machine Learning, 2005
"... This paper presents a Support Vector Method for optimizing multivariate nonlinear performance measures like the F1score. Taking a multivariate prediction approach, we give an algorithm with which such multivariate SVMs can be trained in polynomial time for large classes of potentially non-linear per ..."
Cited by 305 (6 self)
... non-linear performance measures, in particular ROCArea and all measures that can be computed from the contingency table. The conventional classification SVM arises as a special case of our method.
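To make the phrase "measures that can be computed from the contingency table" concrete, the small helper below computes F1 for an entire vector of predictions, which is the kind of multivariate loss the method optimises. It is an illustration only, assuming labels in {-1, +1}, and is not code from the paper.

# F1 computed from the contingency table of a whole prediction vector
# (illustrative helper; labels assumed to be -1/+1).
import numpy as np

def f1_from_contingency(y_true, y_pred):
    tp = int(np.sum((y_pred == 1) & (y_true == 1)))
    fp = int(np.sum((y_pred == 1) & (y_true == -1)))
    fn = int(np.sum((y_pred == -1) & (y_true == 1)))
    if 2 * tp + fp + fn == 0:
        return 0.0
    return 2 * tp / (2 * tp + fp + fn)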
Adapting ranking SVM to document retrieval
- In Proceedings of the 29th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, 2006
"... The paper is concerned with applying learning to rank to document retrieval. Ranking SVM is a typical method of learning to rank. We point out that there are two factors one must consider when applying Ranking SVM, in general a “learning to rank” method, to document retrieval. First, correctly ranki ..."
Cited by 124 (21 self)
... large number of relevant documents. Previously, when existing methods that include Ranking SVM were applied to document retrieval, neither of the two factors was taken into consideration. We show it is possible to make modifications to conventional Ranking SVM so that it can be better used for document ...
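For context, the conventional Ranking SVM the paper starts from can be sketched as a pairwise reduction: every pair of documents from the same query with different relevance grades becomes one classification example on the feature difference. The toy data and helper below are placeholders, and the authors' retrieval-specific modifications are not reproduced.

# Conventional Ranking SVM as a pairwise reduction (schematic sketch;
# the paper's modifications for document retrieval are not shown here).
import numpy as np
from sklearn.svm import LinearSVC

def pairwise_examples(X, rel, qid):
    diffs, signs = [], []
    for q in np.unique(qid):
        idx = np.where(qid == q)[0]
        for i in idx:
            for j in idx:
                if rel[i] > rel[j]:                  # document i should rank above j
                    diffs.append(X[i] - X[j]); signs.append(1)
                    diffs.append(X[j] - X[i]); signs.append(-1)
    return np.array(diffs), np.array(signs)

X = np.random.RandomState(0).rand(8, 5)              # toy document features
rel = np.array([2, 1, 0, 0, 2, 1, 1, 0])              # toy relevance grades
qid = np.array([1, 1, 1, 1, 2, 2, 2, 2])              # toy query ids

D, s = pairwise_examples(X, rel, qid)
w = LinearSVC(fit_intercept=False).fit(D, s).coef_.ravel()
scores = X @ w                                        # rank documents by this score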
A Simple SVM Algorithm
- In Proceedings of the 2002 International Joint Conference on Neural Networks (IJCNN '02), 2002
"... Abstract- We present a fast iterative algorithm for identifying the Support Vectors of a given set of points. Our algorithm works by maintaining a candidate Support Vector set. It uses a greedy approach to pick points for inclusion in the candidate set. When the addition of a point to the candidate ..."
Cited by 16 (2 self)
... to other conventional iterative algorithms like SMO and the NPA. We present results on a variety of real-life datasets to validate our claims. ...
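The candidate-set idea can be sketched roughly as follows: keep a small working set, fit an SVM on it, and greedily add the point that most violates the current margin. This is only a schematic of the greedy inclusion step, not the paper's actual algorithm; it assumes binary labels in {-1, +1} and uses scikit-learn's SVC for the inner solve.

# Schematic greedy candidate-support-vector loop (not the paper's exact
# algorithm). Assumes binary labels in {-1, +1}.
import numpy as np
from sklearn.svm import SVC

def greedy_candidate_svm(X, y, C=1.0, tol=1e-3, max_size=200):
    cand = [int(np.where(y == c)[0][0]) for c in np.unique(y)]  # one seed point per class
    clf = SVC(kernel="linear", C=C)
    while len(cand) < max_size:
        clf.fit(X[cand], y[cand])                   # solve on the candidate set only
        margins = y * clf.decision_function(X)      # signed margins on all points
        worst = int(np.argmin(margins))
        if margins[worst] >= 1 - tol or worst in cand:
            break                                   # no violating point left to add
        cand.append(worst)
    return clf, cand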
SVM-Based Speaker Verification
"... Abstract — We investigate an alternative formulation of phonetic feature representations for SVM-based speaker verification. The new features are based on conditional likelihood representations rather than the joint-likelihood or bag-of-ngram calculations traditionally used. Conditional likelihoods ..."
Evaluation of SVM Kernels and Conventional Machine Learning Algorithms for Speaker Identification
"... One of the central problems in the study of Support vector machine (SVM) is kernel selection, that’s based essentially on the problem of choosing a kernel function for a particular task and dataset. By contradiction to other machine learning algorithms, SVM focuses on maximizing the generalisation a ..."
Cited by 1 (0 self)
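A kernel comparison of the kind this paper describes can be sketched in a few lines; the snippet below uses a generic placeholder dataset (not speaker-identification features) and scores each standard kernel by cross-validation. It illustrates the experimental setup only and is not the paper's code.

# Illustrative kernel comparison by cross-validation (placeholder dataset,
# not speaker-identification features).
from sklearn.datasets import load_digits
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
for kernel in ["linear", "poly", "rbf", "sigmoid"]:
    scores = cross_val_score(SVC(kernel=kernel, gamma="scale"), X, y, cv=5)
    print(f"{kernel:8s} mean CV accuracy = {scores.mean():.3f}")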
Classification of hyperspectral remote sensing images with support vector machines
- IEEE Trans. Geosci. Remote Sens., 2004
"... Abstract—This paper addresses the problem of the classifica-tion of hyperspectral remote sensing images by support vector machines (SVMs). First, we propose a theoretical discussion and experimental analysis aimed at understanding and assessing the potentialities of SVM classifiers in hyperdimension ..."
Cited by 188 (5 self)
Explanation-augmented svm: an approach to incorporating domain knowledge into svm learning
- In Proceedings of the Twenty-Second International Conference on Machine Learning, 2005
"... We introduce a novel approach to incorporating domain knowledge into Support Vector Machines to improve their example efficiency. Domain knowledge is used in an Explanation Based Learning fashion to build justifications or explanations for why the training examples are assigned their given class lab ..."
Cited by 5 (2 self)
... labels. Explanations bias the large margin classifier through the interaction of training examples and domain knowledge. We develop a new learning algorithm for this Explanation-Augmented SVM (EA-SVM). It naturally extends to imperfect knowledge, a stumbling block to conventional EBL. Experimental ...
Support Vector Classifier with Asymmetric Kernel Functions
- In European Symposium on Artificial Neural Networks (ESANN), 1999
"... In support vector classifier, asymmetric kernel functions are not used so far, although they are frequently used in other kernel classifiers. The applicable kernels are limited to symmetric semipositive definite ones because of Mercer's theorem. In this paper, SVM is extended to be applicab ..."
Cited by 22 (0 self)
... to be applicable to asymmetric kernel functions. It is proven that, when a positive definite kernel is given, the extended SVM is identical to the conventional SVM. In the 3D object recognition experiment, the extended SVM with asymmetric kernels performed better than the conventional SVM.
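The Mercer constraint mentioned in the abstract (symmetric, positive semidefinite kernels) can be made concrete with a precomputed Gram matrix. The sketch below builds a conventional RBF Gram matrix, checks the two Mercer properties, and trains on it; the paper's extension to asymmetric kernels is not reproduced, and the data and kernel are placeholders.

# Conventional (symmetric, positive semidefinite) kernel setup that Mercer's
# theorem requires; the paper's asymmetric extension is not shown here.
import numpy as np
from sklearn.svm import SVC

rng = np.random.RandomState(0)
X = rng.rand(40, 3)                                    # placeholder data
y = np.where(X[:, 0] > X[:, 1], 1, -1)

sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)    # pairwise squared distances
K = np.exp(-sq)                                        # RBF Gram matrix

symmetric = np.allclose(K, K.T)
psd = np.linalg.eigvalsh(K).min() >= -1e-9
print("Mercer checks:", symmetric, psd)

clf = SVC(kernel="precomputed").fit(K, y)              # train directly on the Gram matrix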