Results 1-10 of 14
A tutorial on support vector machines for pattern recognition
Data Mining and Knowledge Discovery, 1998
Cited by 3319 (12 self)
Abstract:
The tutorial starts with an overview of the concepts of VC dimension and structural risk minimization. We then describe linear Support Vector Machines (SVMs) for separable and nonseparable data, working through a nontrivial example in detail. We describe a mechanical analogy, and discuss when SVM solutions are unique and when they are global. We describe how support vector training can be practically implemented, and discuss in detail the kernel mapping technique which is used to construct SVM solutions which are nonlinear in the data. We show how Support Vector machines can have very large (even infinite) VC dimension by computing the VC dimension for homogeneous polynomial and Gaussian radial basis function kernels. While very high VC dimension would normally bode ill for generalization performance, and while at present there exists no theory which shows that good generalization performance is guaranteed for SVMs, there are several arguments which support the observed high accuracy of SVMs, which we review. Results of some experiments which were inspired by these arguments are also presented. We give numerous examples and proofs of most of the key theorems. There is new material, and I hope that the reader will find that even old material is cast in a fresh light.
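The kernel mapping discussed in this abstract can be made concrete with a short sketch. The snippet below is an illustration, not code from the tutorial, and assumes NumPy is available; it builds the Gaussian radial basis function kernel matrix whose VC dimension the paper analyzes:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """Gaussian RBF kernel: K[i, j] = exp(-gamma * ||X[i] - Y[j]||^2)."""
    # Squared distances via the expansion ||x - y||^2 = ||x||^2 - 2 x.y + ||y||^2
    sq = (X ** 2).sum(1)[:, None] - 2.0 * X @ Y.T + (Y ** 2).sum(1)[None, :]
    return np.exp(-gamma * sq)

X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]])
K = rbf_kernel(X, X)  # 3x3, symmetric, with ones on the diagonal
```

Each entry K[i, j] is an inner product of two mapped points in an infinite-dimensional feature space, which is why a kernel SVM never has to form the mapping explicitly.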
ATOMIC DECOMPOSITION BY BASIS PURSUIT
1995
Cited by 2731 (61 self)
Abstract:
The Time-Frequency and Time-Scale communities have recently developed a large number of overcomplete waveform dictionaries: stationary wavelets, wavelet packets, cosine packets, chirplets, and warplets, to name a few. Decomposition into overcomplete systems is not unique, and several methods for decomposition have been proposed, including the Method of Frames (MOF), Matching Pursuit (MP), and, for special dictionaries, the Best Orthogonal Basis (BOB). Basis Pursuit (BP) is a principle for decomposing a signal into an "optimal" superposition of dictionary elements, where optimal means having the smallest l1 norm of coefficients among all such decompositions. We give examples exhibiting several advantages over MOF, MP, and BOB, including better sparsity and super-resolution. BP has interesting relations to ideas in areas as diverse as ill-posed problems, abstract harmonic analysis, total variation denoising, and multi-scale edge denoising. Basis Pursuit in highly overcomplete dictionaries leads to large-scale optimization problems. With signals of length 8192 and a wavelet packet dictionary, one gets an equivalent linear program of size 8192 by 212,992. Such problems can be attacked successfully only because of recent advances in linear programming by interior-point methods. We obtain reasonable success with a primal-dual logarithmic barrier method and conjugate-gradient solver.
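The l1-minimal decomposition the abstract describes can be posed as the linear program min 1'(u+v) subject to A(u-v) = b with u, v >= 0, where x = u - v. A minimal sketch follows, assuming NumPy and SciPy are available; this is not the authors' primal-dual barrier implementation, which exploits far more problem structure:

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, b):
    """min ||x||_1 s.t. Ax = b, via the split x = u - v with u, v >= 0."""
    m, n = A.shape
    c = np.ones(2 * n)                 # objective: sum(u) + sum(v) = ||x||_1
    A_eq = np.hstack([A, -A])          # equality constraint: A u - A v = b
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n))
    u, v = res.x[:n], res.x[n:]
    return u - v

# Overcomplete dictionary: three atoms in R^2, so decompositions are non-unique.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
x = basis_pursuit(A, np.array([1.0, 1.0]))
```

On this toy dictionary the l1 objective selects the one-atom decomposition x = (0, 0, 1) over the denser x = (1, 1, 0), illustrating the sparsity advantage claimed over MOF.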
An introduction to kernel-based learning algorithms
IEEE TRANSACTIONS ON NEURAL NETWORKS, 2001
Cited by 589 (54 self)
Abstract:
This paper provides an introduction to support vector machines (SVMs), kernel Fisher discriminant analysis, and ...
LOQO: An interior point code for quadratic programming
1994
Cited by 191 (10 self)
Abstract:
This paper describes a software package, called LOQO, which implements a primal-dual interior-point method for general nonlinear programming. We focus in this paper mainly on the algorithm as it applies to linear and quadratic programming, with only brief mention of the extensions to convex and general nonlinear programming, since a detailed paper describing these extensions was recently published elsewhere. In particular, we emphasize the importance of establishing and maintaining symmetric quasi-definiteness of the reduced KKT system. We show that problems in the industry-standard MPS format can be formulated in such a way as to provide quasi-definiteness. Computational results are included for a variety of linear and quadratic programming problems.
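The reduced KKT system the abstract emphasizes has a simple shape that can be shown on an equality-constrained QP. This is a hedged sketch (NumPy assumed): LOQO itself also handles inequalities and adds barrier terms to the diagonal, both of which are omitted here:

```python
import numpy as np

def eq_qp(Q, c, A, b):
    """Solve min 1/2 x'Qx + c'x  s.t.  Ax = b via its KKT conditions:
        [Q  A'] [x]   [-c]
        [A  0 ] [y] = [ b]
    a symmetric linear system of the kind an interior-point QP code
    factors (after regularization to quasi-definiteness) at each iteration."""
    n, m = Q.shape[0], A.shape[0]
    K = np.block([[Q, A.T], [A, np.zeros((m, m))]])
    sol = np.linalg.solve(K, np.concatenate([-c, b]))
    return sol[:n]

# Toy problem: min 1/2 (x1^2 + x2^2)  s.t.  x1 + x2 = 2   ->   x = (1, 1)
x = eq_qp(np.eye(2), np.zeros(2), np.array([[1.0, 1.0]]), np.array([2.0]))
```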
A novel method of protein secondary structure prediction with high segment overlap measure: support vector machine approach
J MOL BIOL, 2001
Cited by 175 (3 self)
Abstract:
We have introduced a new method of protein secondary structure prediction which is based on the theory of the support vector machine (SVM). SVM represents a new approach to supervised pattern classification which has been successfully applied to a wide range of pattern recognition problems, including object recognition, speaker identification, and gene function prediction with microarray expression profiles. In these cases, the performance of SVM either matches or is significantly better than that of traditional machine learning approaches, including neural networks. The first use of the SVM approach to predict protein secondary structure is described here. Unlike previous studies, we first constructed several binary classifiers, then assembled a tertiary classifier for the three secondary structure states (helix, sheet, and coil) based on these binary classifiers. The SVM method achieved a good segment overlap accuracy of SOV = 76.2% through seven-fold cross-validation on a database of 513 non-homologous protein chains with multiple sequence alignments, which outperforms existing methods. Meanwhile, the three-state overall per-residue accuracy Q3 reached 73.5%, which is at least comparable to existing single prediction methods. Furthermore, a useful "reliability index" for the predictions was developed. In addition, SVM has many attractive features, including effective avoidance of overfitting, the ability to handle large feature spaces, and information condensing of the given data set. The SVM method is conveniently applied to many other pattern classification tasks in biology.
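The assembly of binary classifiers into a tertiary (three-state) classifier can be sketched generically. The toy below uses least-squares linear scorers in place of SVMs purely to keep the example self-contained (NumPy assumed); the data and class labels are invented stand-ins for helix/sheet/coil feature windows:

```python
import numpy as np

def train_one_vs_rest(X, y, n_classes):
    """Fit one binary scorer per class (least squares here, SVMs in the paper)
    with targets +1 for the class and -1 for the rest."""
    Xb = np.hstack([X, np.ones((len(X), 1))])   # append a bias column
    W = []
    for k in range(n_classes):
        t = np.where(y == k, 1.0, -1.0)
        w, *_ = np.linalg.lstsq(Xb, t, rcond=None)
        W.append(w)
    return np.array(W)

def predict(W, X):
    """Tertiary decision: the class whose binary scorer is most confident."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return (Xb @ W.T).argmax(axis=1)

# Three well-separated toy clusters standing in for the three structure states.
X = np.array([[0.0, 0.0], [0.1, 0.0], [2.0, 0.0], [2.1, 0.0], [1.0, 2.0], [1.1, 2.0]])
y = np.array([0, 0, 1, 1, 2, 2])
W = train_one_vs_rest(X, y, 3)
pred = predict(W, X)
```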
Support vector machine for regression and applications to financial forecasting
Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN 2000), 2000
Cited by 26 (0 self)
Abstract:
The main purpose of this paper is to compare the support vector machine (SVM) developed by Vapnik with other techniques such as Backpropagation and Radial Basis Function (RBF) Networks for financial forecasting applications. The theory of the SVM algorithm is based on statistical learning theory. Training of SVMs leads to a quadratic programming (QP) problem. Preliminary computational results for stock price prediction are also presented.
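SVM regression of the kind compared here is usually built on Vapnik's epsilon-insensitive loss, whose minimization (together with a margin term) yields the QP the abstract mentions. A minimal, dependency-free sketch, where the tube width and the sample values are arbitrary choices for illustration:

```python
def eps_insensitive_loss(y_true, y_pred, eps=0.1):
    """Vapnik's epsilon-insensitive loss: residuals inside the +/-eps tube
    cost nothing; outside it the cost grows linearly with the residual."""
    return [max(0.0, abs(t - p) - eps) for t, p in zip(y_true, y_pred)]

losses = eps_insensitive_loss([1.0, 2.0, 3.0], [1.05, 2.5, 2.0], eps=0.1)
# residuals 0.05, 0.5, 1.0  ->  losses 0.0, 0.4, 0.9
```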
Interior Point Algorithms for Integer Programming
1994
Cited by 6 (4 self)
Abstract:
Research on using interior point algorithms to solve integer programming problems is surveyed. This paper concentrates on branch and bound and cutting plane methods; a potential function method is also briefly mentioned. The principal difficulty with using an interior point algorithm in a branch and cut method to solve integer programming problems is in warm starting the algorithm efficiently. Methods for overcoming this difficulty are described and other features of the algorithms are given. This paper focuses on the techniques necessary to obtain an efficient computational implementation; there is a short discussion of theoretical issues.
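The branch-and-bound scheme the survey concentrates on can be sketched with an off-the-shelf LP solver. The toy below assumes SciPy is available and uses its HiGHS interior-point method for each relaxation; it solves every node from scratch and does no bound-based pruning, so it sidesteps exactly the warm-starting difficulty the survey discusses:

```python
import math
import numpy as np
from scipy.optimize import linprog

def branch_and_bound(c, A_ub, b_ub, bounds):
    """Tiny LP-based branch and bound for  max c'x  s.t.  A_ub x <= b_ub, x integer."""
    res = linprog(-np.asarray(c), A_ub=A_ub, b_ub=b_ub, bounds=bounds,
                  method="highs-ipm")            # interior-point LP relaxation
    if not res.success:
        return None, -math.inf                   # infeasible node: prune
    x = res.x
    frac = [i for i, v in enumerate(x) if abs(v - round(v)) > 1e-6]
    if not frac:                                 # integral solution: a leaf
        return np.round(x), float(np.dot(c, np.round(x)))
    i = frac[0]                                  # branch on a fractional variable
    lo, hi = bounds[i]
    down = bounds[:i] + [(lo, math.floor(x[i]))] + bounds[i + 1:]
    up = bounds[:i] + [(math.ceil(x[i]), hi)] + bounds[i + 1:]
    return max(branch_and_bound(c, A_ub, b_ub, down),
               branch_and_bound(c, A_ub, b_ub, up),
               key=lambda t: t[1])

# max 2x1 + 3x2  s.t.  x1 + 3x2 <= 5,  3x1 + x2 <= 5,  x >= 0 integer
x_opt, obj = branch_and_bound([2.0, 3.0],
                              [[1.0, 3.0], [3.0, 1.0]],
                              [5.0, 5.0],
                              [(0, None), (0, None)])
```

Here the root relaxation optimum (1.25, 1.25) is fractional, and branching recovers the integer optimum (1, 1) with objective value 5.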
An Interior-Point Method for General Large-Scale Quadratic Programming Problems
Annals of Operations Research, 1996
Cited by 3 (0 self)
Abstract:
In this paper we present an interior point algorithm for solving both convex and nonconvex quadratic programs. The method, which is an extension of our interior point work on linear programming problems, efficiently solves a wide class of large-scale problems and forms the basis for a sequential quadratic programming (SQP) solver for general large-scale nonlinear programs. The key to the algorithm is a 3-dimensional cost-improvement subproblem, which is solved at every iteration. We have developed an approximate recentering procedure and a novel, adaptive big-M Phase I procedure that are essential to its success. We describe the basic method along with the recentering and big-M Phase I procedures. Details of the implementation and computational results are also presented. Keywords: big-M Phase I procedure, convex quadratic programming, interior point methods, linear programming, method of centers, multidirectional search direction, nonconvex quadratic programming, recentering.
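The interior-point idea behind such methods can be illustrated with the classical logarithmic barrier on a one-variable problem. This is a generic sketch of the barrier/central-path mechanism only; the paper's recentering and big-M Phase I procedures are not modeled:

```python
import math

def barrier_path(mu0=1.0, shrink=0.5, n_steps=30):
    """Trace the central path for  min x^2  s.t.  x >= 1  by minimizing the
    barrier function  x^2 - mu * log(x - 1)  for a decreasing sequence of mu.
    In 1-D the stationarity condition 2x(x - 1) = mu has the closed form
    x(mu) = (1 + sqrt(1 + 2*mu)) / 2, so each subproblem is solved exactly;
    real codes take Newton steps on the KKT system instead."""
    mu, path = mu0, []
    for _ in range(n_steps):
        path.append((1.0 + math.sqrt(1.0 + 2.0 * mu)) / 2.0)
        mu *= shrink
    return path

path = barrier_path()  # strictly interior iterates converging to x = 1
```

As mu shrinks toward zero, the barrier minimizers approach the constrained optimum x = 1 from the interior of the feasible region, which is the mechanism every interior-point variant shares.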