A tutorial on support vector regression
, 2004
Abstract

Cited by 473 (2 self)
In this tutorial we give an overview of the basic ideas underlying Support Vector (SV) machines for function estimation. Furthermore, we include a summary of currently used algorithms for training SV machines, covering both the quadratic (or convex) programming part and advanced methods for dealing with large datasets. Finally, we mention some modifications and extensions that have been applied to the standard SV algorithm, and discuss the aspect of regularization from an SV perspective.
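For context, the ε-insensitive formulation this tutorial surveys is the standard SV regression primal (written here in standard notation, not reproduced from this listing):

```latex
\min_{w,\,b,\,\xi,\,\xi^*} \quad \tfrac{1}{2}\|w\|^2 + C \sum_{i=1}^{\ell} \left( \xi_i + \xi_i^* \right)
\quad \text{subject to} \quad
\begin{cases}
y_i - \langle w, x_i \rangle - b \le \varepsilon + \xi_i, \\
\langle w, x_i \rangle + b - y_i \le \varepsilon + \xi_i^*, \\
\xi_i,\ \xi_i^* \ge 0,
\end{cases}
```

where deviations smaller than ε go unpenalized and C trades off flatness of the estimate against constraint violations. The dual of this problem is the quadratic program referred to in the abstract.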
LAGRANGE MULTIPLIERS AND OPTIMALITY
, 1993
Abstract

Cited by 89 (7 self)
Lagrange multipliers used to be viewed as auxiliary variables introduced in a problem of constrained minimization in order to write first-order optimality conditions formally as a system of equations. Modern applications, with their emphasis on numerical methods and more complicated side conditions than equations, have demanded deeper understanding of the concept and how it fits into a larger theoretical picture. A major line of research has been the nonsmooth geometry of one-sided tangent and normal vectors to the set of points satisfying the given constraints. Another has been the game-theoretic role of multiplier vectors as solutions to a dual problem. Interpretations as generalized derivatives of the optimal value with respect to problem parameters have also been explored. Lagrange multipliers are now being seen as arising from a general rule for the subdifferentiation of a nonsmooth objective function which allows black-and-white constraints to be replaced by penalty expressions. This paper traces such themes in the current theory of Lagrange multipliers, providing along the way a free-standing exposition of basic nonsmooth analysis as motivated by and applied to this subject.
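The "first-order optimality conditions as a system of equations" view the paper starts from is the classical Karush-Kuhn-Tucker system. For the smooth problem of minimizing f(x) subject to g_i(x) ≤ 0, i = 1, ..., m, it reads (standard notation):

```latex
\nabla f(x^*) + \sum_{i=1}^{m} \lambda_i \nabla g_i(x^*) = 0, \qquad
g_i(x^*) \le 0, \qquad
\lambda_i \ge 0, \qquad
\lambda_i\, g_i(x^*) = 0 .
```

The last (complementarity) condition makes this a genuinely one-sided system, which is where the nonsmooth geometry discussed in the paper enters.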
Filter Pattern Search Algorithms for Mixed Variable Constrained Optimization Problems
 SIAM Journal on Optimization
, 2004
Abstract

Cited by 37 (8 self)
A new class of algorithms for solving nonlinearly constrained mixed variable optimization problems is presented. This class combines and extends the Audet-Dennis Generalized Pattern Search (GPS) algorithms for bound constrained mixed variable optimization and their GPS-filter algorithms for general nonlinear constraints. In generalizing existing algorithms, new theoretical convergence results are presented that reduce seamlessly to existing results for more specific classes of problems. While no local continuity or smoothness assumptions are required to apply the algorithm, a hierarchy of theoretical convergence results based on the Clarke calculus is given, in which local smoothness dictates what can be proved about certain limit points generated by the algorithm. To demonstrate its usefulness, the algorithm is applied to the design of a load-bearing thermal insulation system. We believe this is the first algorithm with provable convergence results to directly target this class of problems.
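As a rough illustration of the pattern-search idea, the sketch below is a bare-bones compass search on continuous variables only; the paper's algorithms additionally handle categorical variables and nonlinear constraints via a filter, and the test function here is made up:

```python
def pattern_search(f, x0, step=1.0, tol=1e-6, max_iter=10000):
    """Bare-bones compass/pattern search: poll f along +/- coordinate
    directions; when no poll point improves, halve the step (mesh)."""
    x = list(x0)
    fx = f(x)
    for _ in range(max_iter):
        if step < tol:
            break
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                y = x[:]
                y[i] += d
                fy = f(y)
                if fy < fx:          # accept the first improving poll point
                    x, fx, improved = y, fy, True
                    break
            if improved:
                break
        if not improved:             # unsuccessful poll: refine the mesh
            step *= 0.5
    return x, fx

# Minimize a smooth quadratic whose minimizer is (1, -2).
xmin, fmin = pattern_search(lambda v: (v[0] - 1.0) ** 2 + (v[1] + 2.0) ** 2,
                            [0.0, 0.0])
```

No derivatives of f are ever evaluated, which is what makes the GPS framework applicable when only local smoothness (or none at all) can be assumed.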
A Review of Kernel Methods in Machine Learning
, 2006
Abstract

Cited by 35 (3 self)
We review recent methods for learning with positive definite kernels. All these methods formulate learning and estimation problems as linear tasks in a reproducing kernel Hilbert space (RKHS) associated with a kernel. We cover a wide range of methods, ranging from simple classifiers to sophisticated methods for estimation with structured data.
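A minimal example of the "linear task in an RKHS" viewpoint is kernel ridge regression with a Gaussian kernel. The sketch below (plain NumPy, with illustrative data and parameter choices) fits and evaluates such a model via the representer theorem:

```python
import numpy as np

def rbf_kernel(X, Z, gamma=0.5):
    # K[i, j] = exp(-gamma * ||x_i - z_j||^2): a positive definite kernel.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

# Training data: noiseless samples of sin on a grid.
X = np.linspace(-3.0, 3.0, 15)[:, None]
y = np.sin(X[:, 0])

# Representer theorem: the estimate lies in span{k(x_i, .)}, so fitting
# reduces to the linear system (K + lam * I) alpha = y.
lam = 1e-4
K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)

# Prediction at new points is again linear in kernel evaluations.
X_new = np.array([[0.2]])
pred_new = rbf_kernel(X_new, X) @ alpha
pred_train = K @ alpha
```

The same pattern, with different loss functions, underlies SVMs and the other estimators the review covers.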
On Solving Block-Structured Indefinite Linear Systems
 SIAM J. Sci. Comput
, 2003
Abstract

Cited by 28 (7 self)
We consider 2 × 2 block indefinite linear systems whose (2,2) block is zero. Such systems arise in many applications. We focus on techniques that change the linear systems in a way that may make it easier to solve them. In particular, two techniques based on modifying the (1,1) block are considered. The main part of the paper discusses an augmented Lagrangian approach, which is a technique that modifies the (1,1) block without changing the system size. The choice of the parameter involved, the spectrum of the linear system, and its condition number are discussed, and some analytical observations are provided. A technique of deflating the (1,1) block is then introduced. Finally, numerical experiments which validate the analysis are presented.
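The augmented Lagrangian modification can be checked numerically on a small random saddle-point system: since Bx = g at the solution, adding γBᵀB to the (1,1) block and γBᵀg to the first right-hand side changes the system but not its solution. This is an illustrative sketch, not the paper's experiments:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 6, 2
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)        # symmetric positive definite (1,1) block
B = rng.standard_normal((m, n))    # full-row-rank constraint block
f = rng.standard_normal(n)
g = rng.standard_normal(m)

# Original indefinite system: [[A, B^T], [B, 0]] [x; p] = [f; g].
K = np.block([[A, B.T], [B, np.zeros((m, m))]])
sol = np.linalg.solve(K, np.concatenate([f, g]))

# Augmented Lagrangian variant: modify the (1,1) block and right-hand
# side without changing the system size; the solution is unchanged.
gamma = 10.0
K_aug = np.block([[A + gamma * (B.T @ B), B.T], [B, np.zeros((m, m))]])
sol_aug = np.linalg.solve(K_aug, np.concatenate([f + gamma * (B.T @ g), g]))
```

The point of the paper is that γ can be chosen to improve the spectrum and conditioning of the modified system.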
PETSc users manual
 Tech. Rep. ANL-95/11, Revision 2.1.5, Argonne National Laboratory
, 2004
Abstract

Cited by 26 (8 self)
This work was supported by the Mathematical, Information, and Computational Sciences
New methods for splice site recognition
, 2002
Abstract

Cited by 21 (4 self)
Splice sites are locations in DNA which separate protein-coding regions (exons) from noncoding regions (introns). Accurate splice site detectors thus form important components of computational gene finders. We pose splice site recognition as a classification problem with the classifier learnt from a labeled data set consisting of only local information around the potential splice site. Note that finding the correct position of splice sites without using global information is a rather hard task. We analyze the genomes of the nematode Caenorhabditis elegans and of humans using specially designed support vector kernels. One of the kernels is adapted from our previous work on detecting translation initiation sites in vertebrates and another uses an extension of the well-known Fisher kernel. We find excellent performance on both data sets.
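A toy example of a string kernel over local sequence windows is the k-mer spectrum kernel, a simpler relative of the specialized kernels used in the paper; the sequences below are made up:

```python
from collections import Counter

def spectrum_kernel(s, t, k=3):
    """k-mer spectrum kernel: inner product of the k-mer count vectors
    of two strings. Counter returns 0 for absent k-mers."""
    cs = Counter(s[i:i + k] for i in range(len(s) - k + 1))
    ct = Counter(t[i:i + k] for i in range(len(t) - k + 1))
    return sum(cs[w] * ct[w] for w in cs)

# Identical sequences share all windows; the second pair shares no 3-mer.
k_same = spectrum_kernel("GATTACAGATTACA", "GATTACAGATTACA")
k_diff = spectrum_kernel("GATTACAGATTACA", "CCCCGGGGTTTT")
```

Because the kernel is an inner product of explicit count vectors, it is positive definite and can be plugged directly into an SVM trained on windows around candidate splice sites.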
Weighted-support vector machines for predicting membrane protein types based on pseudo-amino acid composition. Protein
 Sel
, 2004
Abstract

Cited by 19 (1 self)
Membrane proteins are generally classified into the following five types: (1) type I membrane protein, (2) type II membrane protein, (3) multipass transmembrane proteins, (4) lipid chain-anchored membrane proteins, and (5) GPI-anchored membrane proteins. Prediction of membrane protein types has become one of the growing hot topics in bioinformatics. Currently, we are facing two critical challenges in this area. One is how to take into account the extremely complicated sequence-order effects; the other is how to deal with the highly uneven sizes of the subsets in a training dataset. In this paper, stimulated by the concept of using the pseudo-amino-acid composition (Chou, K.C.: PROTEINS: Structure, Function, and Genetics, 43: 246-255, 2001; ibid. 2001, 44, 60) to incorporate the sequence-order effects, the spectral analysis technique is introduced to represent the statistical sample of a protein. Based on such a framework, the weighted support vector machine (SVM) algorithm is applied. The new approach has a remarkable power in dealing with the bias caused by the situation when one subset in the training dataset
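A common way to write a weighted (class-dependent cost) SVM, in standard notation rather than the paper's exact formulation, is:

```latex
\min_{w,\,b,\,\xi} \quad \tfrac{1}{2}\|w\|^2
+ C^{+} \!\! \sum_{i:\, y_i = +1} \!\! \xi_i
+ C^{-} \!\! \sum_{i:\, y_i = -1} \!\! \xi_i
\quad \text{subject to} \quad
y_i \left( \langle w, x_i \rangle + b \right) \ge 1 - \xi_i, \qquad \xi_i \ge 0 ,
```

so that assigning a larger penalty C to the smaller class counteracts the bias toward the majority class that the abstract describes.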
Complementarity Problems in GAMS and the PATH Solver
 Journal of Economic Dynamics and Control
, 1998
Abstract

Cited by 17 (6 self)
A fundamental mathematical problem is to find a solution to a square system of nonlinear equations. There are many methods to approach this problem, the most famous of which is Newton's method. In this paper, we describe a generalization of this problem, the complementarity problem. We show how such problems are modeled within the GAMS modeling language and provide details about the PATH solver, a generalization of Newton's method, for finding a solution. While the modeling format is applicable in many disciplines, we draw the examples in this paper from an economic background. Finally, some extensions of the modeling format and the solver are described.

Keywords: complementarity problems, variational inequalities, algorithms
AMS Classification: 90C33, 65K10

This paper is an extended version of a talk presented at CEFES '98 (Computation in Economics, Finance and Engineering: Economic Systems) in Cambridge, England, in July 1998. This material is based on research supported by Nationa...
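The complementarity problem the paper describes can be stated, for F: Rⁿ → Rⁿ, in the standard nonlinear complementarity form:

```latex
\text{find } z \in \mathbb{R}^n \text{ such that } \quad
z \ge 0, \qquad F(z) \ge 0, \qquad z^{\mathsf{T}} F(z) = 0 ,
```

which reduces to the square system F(z) = 0 when the variables are unconstrained; the mixed complementarity form, with general lower and upper bounds on z, is the one handled by GAMS and PATH.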
Support Vector Machines for Predicting Membrane Protein Types by Incorporating Quasi–Sequence–Order Effect, Internet
 Biophys. J
, 2003
Abstract

Cited by 16 (3 self)
ABSTRACT Membrane proteins are generally classified into the following five types: (1) type I membrane protein; (2) type II membrane protein; (3) multipass transmembrane proteins; (4) lipid chain-anchored membrane proteins; and (5) GPI-anchored membrane proteins. In this article, based on the concept of using the functional domain composition to define a protein, the Support Vector Machine algorithm is developed for predicting the membrane protein type. High success rates are obtained by both the self-consistency and jackknife tests. The current approach, complemented with the powerful covariant discriminant algorithm based on the pseudo-amino acid composition that has incorporated quasi-sequence-order effect as recently proposed by K. C. Chou (2001), may become a very useful high-throughput tool in the area of bioinformatics and proteomics.