Results 21–30 of 39
Support vector machines: Theory and applications
 In Machine Learning and Its Applications: Advanced Lectures, 2001
Abstract

Cited by 5 (0 self)
This paper presents a summary of the issues discussed during the one-day workshop on "Support Vector Machines (SVM): Theory and Applications," organized as part of the Advanced Course on Artificial Intelligence (ACAI 99) in Chania, Greece. The goal of the paper is twofold: to present an overview of the background theory and current understanding of SVM, and to discuss the papers presented as well as the issues that arose during the workshop.
Machine learning, machine vision and the brain
 AI Magazine, 1999
Ball Detection in Static Images with Support Vector Machines for Classification
 Image and Vision Computing, 2003
Abstract

Cited by 3 (1 self)
We present a general method for detecting balls in images, with the aim of automatically detecting goals during a soccer match. The detector learns the object to detect through a supervised learning scheme, Support Vector Machines, in which the examples are views of the object. Due to the attitude of the camera with respect to the football ground, the system can be thought of as an electronic linesman that helps the referee establish the occurrence of a goal during a soccer match. Several theoretical and practical issues are addressed in the paper. The first concerns the determination of negative examples relevant to the problem at hand and the training of a reference classifier when the numbers of positive and negative examples are unbalanced. The second focuses on reducing the computational complexity of the reference classifier during the test phase without increasing its generalization error. The third concerns parameter selection, which in our context is equivalent to selecting, among the classifiers the machine implements, the one whose performance is similar to that of the reference classifier. Experimental results on real images show the performance of the proposed detection scheme.
Variable Selection for SVM via Smoothing Spline ANOVA
Abstract

Cited by 2 (0 self)
It is well-known that the support vector machine paradigm is equivalent to solving a regularization problem in a reproducing kernel Hilbert space. The squared-norm penalty in the standard support vector machine controls the smoothness of the classification function. We propose, within the framework of smoothing spline ANOVA models, a new type of regularization to conduct simultaneous classification and variable selection in the SVM. The penalty functional used is the sum of functional component norms, which automatically applies soft-thresholding operations to the functional components and hence yields sparse solutions. We suggest an efficient algorithm that solves the proposed optimization problem by iteratively solving quadratic and linear programming problems. Numerical studies on both simulated data and real datasets show that the modified support vector machine gives very competitive performance compared to other popular classification algorithms, in terms of both classification accuracy and variable selection.
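The soft-thresholding operation the abstract refers to can be illustrated by the following minimal sketch (an illustration of the generic operator, not the paper's actual algorithm): components whose magnitude falls below the regularization parameter are set exactly to zero, which is the mechanism that produces sparse solutions.

```python
import numpy as np

def soft_threshold(x, lam):
    """Soft-thresholding operator: shrink each component of x toward zero
    by lam, setting components with |x| <= lam exactly to zero."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

# Hypothetical component norms: small ones vanish, large ones shrink by lam.
coefs = np.array([2.5, -0.3, 0.0, 1.1, -4.0])
sparse = soft_threshold(coefs, 1.0)
print(sparse)  # the -0.3 and 0.0 entries become exactly zero
```

Applied to the functional component norms, this is what turns an otherwise dense expansion into one with entire components removed, i.e. variable selection.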
A Note on the Generalization Performance of Kernel Classifiers With Margin, 1999
Abstract

Cited by 2 (2 self)
We present distribution-independent bounds on the generalization misclassification performance of a family of kernel classifiers with margin. Support Vector Machine classifiers (SVM) stem from this class of machines. The bounds are derived through computations of the V_γ dimension of a family of loss functions to which the SVM one belongs. Bounds that use functions of margin distributions (i.e. functions of the slack variables of SVM) are derived.
Bayesian Inference in Support Vector Regression
Abstract

Cited by 1 (0 self)
In this paper, we apply popular Bayesian techniques to support vector regression. We describe a Bayesian framework in a function-space view with a Gaussian process prior over functions. A unified non-quadratic loss function with the desirable property of differentiability, called the soft insensitive loss function, is used in the likelihood evaluation. In this framework, maximum a posteriori estimation of the functions results in an extended support vector regression problem. Bayesian methods are used to implement model adaptation while keeping the merits of support vector regression, such as quadratic programming and sparseness. Moreover, we provide confidence intervals for predictions. Experimental results on simulated and real-world datasets indicate that the approach works well even on large datasets.
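The function-space view with a Gaussian process prior can be sketched as follows. This is a generic GP regression posterior-mean computation with an illustrative RBF kernel and toy data, not the paper's soft insensitive loss framework; it shows only the prior-over-functions idea the abstract builds on.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0):
    # Squared-exponential covariance between two sets of 1-D inputs.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

# Illustrative training data (not from the paper).
x_train = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y_train = np.sin(x_train)
noise = 0.1  # observation-noise variance; plays the regularization role

# Posterior mean at a test point: k(x*, X) (K + noise*I)^{-1} y
K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
alpha = np.linalg.solve(K, y_train)
x_test = np.array([0.5])
mean = rbf_kernel(x_test, x_train) @ alpha
print(mean)  # close to sin(0.5), smoothed by the noise term
```

In the paper's setting the Gaussian likelihood above is replaced by one built from the soft insensitive loss, so that the MAP solution becomes an extended support vector regression problem.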
Learning with Kernel Machine Architectures, 2000
Abstract

Cited by 1 (0 self)
This thesis studies the problem of supervised learning using a family of machines, namely kernel learning machines. A number of standard learning methods belong to this family, such as Regularization Networks (RN) and Support Vector Machines (SVM). The thesis presents a theoretical justification of these machines within a unified framework based on the statistical learning theory of Vapnik. The generalization performance of RN and SVM is studied within this framework, and bounds on the generalization error of these machines are proved.
Action Scene Detection with Support Vector Machines
Abstract

Cited by 1 (0 self)
To entice the target audience into paying to see the full movie, the production of movie trailers is an integral part of the movie industry, and action scenes are the main component of a movie trailer. In this paper, we propose an automatic action scene detection algorithm based on the analysis of high-level video structure. The input video is first decomposed into a number of basic components called shots. Shots are then grouped into semantically related scenes by taking into account the visual characteristics and temporal dynamics of the video. Based on the film-making characteristics of action scenes, features of each scene are extracted and fed into a support vector machine for classification. Compared with related works that integrate visual and audio information, our visual-based approach is computationally simple yet effective. Index Terms — Video content analysis, support vector machine, scene classification
in Nonstandard Situations, 2000
Abstract

Cited by 1 (0 self)
The majority of classification algorithms are developed for the standard situation, in which it is assumed that the examples in the training set come from the same distribution as the target population and that the costs of misclassification into different classes are the same. However, these assumptions are often violated in real-world settings. For some classification methods this can be handled simply with a change of threshold; for others, additional effort is required. In this paper, we explain why the standard support vector machine is not suitable for the nonstandard situation and introduce a simple procedure for adapting the support vector machine methodology to it. Theoretical justification for the procedure is provided. A simulation study illustrates that the modified support vector machine significantly improves upon the standard support vector machine in the nonstandard situation. The computational load of the proposed procedure is the same as that of the standard support vector machine, and the procedure reduces to the standard support vector machine in the standard situation.
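The "change of threshold" mentioned above can be made concrete with the standard cost-sensitive decision rule (a textbook illustration, not the paper's SVM modification): with unequal misclassification costs, predict the positive class when p * cost_fn > (1 - p) * cost_fp, which moves the threshold on p(y=1|x) away from 0.5.

```python
def cost_sensitive_threshold(cost_fp, cost_fn):
    """Bayes-optimal threshold on p(y=1|x) under asymmetric costs:
    predict positive when p * cost_fn > (1 - p) * cost_fp,
    i.e. when p > cost_fp / (cost_fp + cost_fn)."""
    return cost_fp / (cost_fp + cost_fn)

# Equal costs recover the usual 0.5 threshold; making false negatives
# ten times costlier lowers the bar for predicting positive.
print(cost_sensitive_threshold(1.0, 1.0))   # 0.5
print(cost_sensitive_threshold(1.0, 10.0))  # ~0.0909
```

The paper's point is that the standard SVM outputs a sign, not a conditional probability, so this simple threshold shift is not directly available and the methodology itself must be adapted.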
On the V gamma dimension for regression in Reproducing Kernel Hilbert Spaces
 A.I. Memo, MIT Artificial Intelligence Lab, 1999
Abstract
This paper presents a computation of the V_γ dimension for regression in bounded subspaces of Reproducing Kernel Hilbert Spaces (RKHS) for the Support Vector Machine (SVM) regression ε-insensitive loss function L_ε, and general L_p loss functions. Finiteness of the V_γ dimension is shown, which also proves uniform convergence in probability for regression machines in RKHS subspaces that use the L_ε or general L_p loss functions. This paper presents a novel proof of this result. It also presents a computation of an upper bound on the V_γ dimension under some conditions, which leads to an approach for estimating the empirical V_γ dimension given a set of training data. 1 Introduction. The V_γ dimension, a variation of the VC-dimension [11], is important for the study of learning machines [1, 5]. In this paper we present a computation of the V_γ dimension of real-valued functions L(y, f(x)) = |y − f(x)|^p and (Vapnik's ε-insensitive loss function L_ε [11]) L(y, f(x)) = |y − ...
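The two loss families the abstract analyzes can be written down directly. A minimal sketch (generic definitions, not code from the paper): the ε-insensitive loss is zero inside a tube of half-width ε around the prediction and grows linearly outside it, while the general L_p loss grows as the p-th power of the residual.

```python
def eps_insensitive_loss(y, f, eps=0.1):
    """Vapnik's epsilon-insensitive loss L_eps: zero inside the
    eps-tube around the prediction, linear outside it."""
    return max(0.0, abs(y - f) - eps)

def lp_loss(y, f, p=2):
    """General L_p loss |y - f|**p, the other family treated here."""
    return abs(y - f) ** p

# A residual of 0.05 falls inside the 0.1-tube and costs nothing;
# a residual of 0.3 is charged only for the part beyond the tube.
print(eps_insensitive_loss(1.0, 1.05))  # 0.0
print(eps_insensitive_loss(1.0, 1.3))
print(lp_loss(2.0, 0.0, 2))  # 4.0
```

The flat region of L_ε is what produces sparse SVM regression solutions; the V_γ computation above covers both families.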