Results 1–10 of 11
Multiclass multiple kernel learning
 In ICML. ACM
Abstract

Cited by 56 (3 self)
In many applications it is desirable to learn from several kernels. “Multiple kernel learning” (MKL) allows the practitioner to optimize over linear combinations of kernels. By enforcing sparse coefficients, it also generalizes feature selection to kernel selection. We propose MKL for joint feature maps. This provides a convenient and principled way to apply MKL to multiclass problems. In addition, we can exploit the joint feature map to learn kernels on output spaces. We show the equivalence of several different primal formulations, including different regularizers. We present several optimization methods, and compare a convex quadratically constrained quadratic program (QCQP) and two semi-infinite linear programs (SILPs) on toy data, showing that the SILPs are faster than the QCQP. We then demonstrate the utility of our method by applying the SILP to three real-world datasets.
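The central object in MKL is a nonnegative, typically sparse weighting of base kernel matrices. A minimal numerical sketch (the weights `beta` below are fixed by hand purely for illustration; the paper learns them via the QCQP/SILP formulations above):

```python
import numpy as np

# Toy illustration of the MKL parametrization: a learned kernel is a
# convex combination K = sum_m beta_m * K_m of base kernels, beta >= 0.
# Data and weights here are hypothetical stand-ins.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))

def linear_kernel(X):
    return X @ X.T

def rbf_kernel(X, gamma=0.5):
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-gamma * sq)

kernels = [linear_kernel(X), rbf_kernel(X)]
beta = np.array([0.7, 0.3])          # sparse, nonnegative, sums to 1
K = sum(b * Km for b, Km in zip(beta, kernels))

assert np.allclose(K, K.T)           # combined kernel stays symmetric
print(K.shape)                        # (5, 5)
```

Any convex combination of valid kernel matrices is again a symmetric positive semidefinite kernel, which is what makes this linear parametrization convenient to optimize over.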
Theory and Design of Signal-Adapted FIR Paraunitary Filter Banks
 IEEE TRANS. SIGNAL PROCESSING
, 1998
Abstract

Cited by 38 (6 self)
We study the design of signal-adapted FIR paraunitary filter banks, using energy compaction as the adaptation criterion. We present some important properties that globally optimal solutions to this optimization problem satisfy. In particular, we show that the optimal filters in the first channel of the filter bank are spectral factors of the solution to a linear semi-infinite programming (SIP) problem. The remaining filters are related to the first through a matrix eigenvector decomposition. We discuss
An SQP Algorithm For Finely Discretized Continuous Minimax Problems And Other Minimax Problems With Many Objective Functions
, 1996
Abstract

Cited by 20 (2 self)
A common strategy for achieving global convergence in the solution of semi-infinite programming (SIP) problems, and in particular of continuous minimax problems, is to (approximately) solve a sequence of discretized problems with progressively finer discretization meshes. Finely discretized minimax and SIP problems, as well as other problems with many more objectives/constraints than variables, call for algorithms in which successive search directions are computed based on a small but significant subset of the objectives/constraints, with ensuing reduced computing cost per iteration and decreased risk of numerical difficulties. In this paper, an SQP-type algorithm is proposed that incorporates this idea in the particular case of minimax problems. The general case will be considered in a separate paper. The quadratic programming subproblem that yields the search direction involves only a small subset of the objective functions. This subset is updated at each iteration in such a wa...
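The "finely discretized continuous minimax" setting can be made concrete with a toy instance (sketched here as an epigraph LP over the full discretization, which is only an illustrative stand-in; the paper's SQP method instead works with a small working subset of the objectives per iteration):

```python
import numpy as np
from scipy.optimize import linprog

# Discretized continuous minimax: fit a line a + b*t to sin(t) on
# [0, pi/2] minimizing the maximum absolute error over a fine grid.
# Epigraph form: variables [a, b, z], minimize z subject to
# |a + b*t - sin(t)| <= z at every grid point t.
T = np.linspace(0.0, np.pi / 2, 200)
c = np.array([0.0, 0.0, 1.0])
A = np.vstack([
    np.column_stack([np.ones_like(T), T, -np.ones_like(T)]),    #  a + b*t - z <= sin t
    np.column_stack([-np.ones_like(T), -T, -np.ones_like(T)]),  # -a - b*t - z <= -sin t
])
b = np.concatenate([np.sin(T), -np.sin(T)])
res = linprog(c, A_ub=A, b_ub=b,
              bounds=[(None, None), (None, None), (0, None)])
print(round(res.fun, 4))  # best uniform error of a line approximating sin
```

With 400 constraints and 3 variables, this instance already has "many more objectives/constraints than variables", which is exactly the regime the paper's working-set idea targets.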
An automated combination of kernels for predicting protein subcellular localization.
, 2007
Abstract

Cited by 11 (5 self)
Protein subcellular localization is a crucial ingredient to many important inferences about cellular processes, including prediction of protein function and protein interactions. While many predictive computational tools have been proposed, they tend to have complicated architectures and require many design decisions from the developer. Here we utilize the multiclass support vector machine (mSVM) method to directly solve protein subcellular localization without resorting to the common approach of splitting the problem into several binary classification problems. We further propose a general class of protein sequence kernels which considers all motifs, including motifs with gaps. Instead of heuristically selecting one or a few kernels from this family, we utilize a recent extension of SVMs that optimizes over multiple kernels simultaneously. This way, we automatically search over families of possible amino acid motifs. We compare our automated approach to three other predictors on four different datasets, and show that we perform better than the current state of the art. Further, our method provides some insights as to which sequence motifs are most useful for determining subcellular localization, which are in agreement with biological reasoning. Data files, kernel matrices and open source software are available at
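A minimal sketch of the simplest member of such a motif-kernel family is the ungapped k-mer "spectrum" kernel, which counts shared length-k motifs (the kernels proposed in the paper additionally handle motifs with gaps; this toy counts only exact k-mers):

```python
from collections import Counter

def spectrum_counts(seq, k=2):
    """Count all length-k substrings (motifs) of a sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def spectrum_kernel(s, t, k=2):
    """Inner product of the two sequences' k-mer count vectors."""
    cs, ct = spectrum_counts(s, k), spectrum_counts(t, k)
    return sum(cs[m] * ct[m] for m in cs)

# Hypothetical amino acid fragments: they share the 2-mers "MK" and "KV".
print(spectrum_kernel("MKVL", "MKKV"))  # -> 2
```

Each k (and each gap pattern, in the paper's generalization) yields one base kernel, and the MKL machinery then searches over the family automatically rather than requiring the developer to pick motifs by hand.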
Relaxed Cutting Plane Method for Solving Linear Semi-Infinite Programming Problems
, 1998
Abstract

Cited by 4 (3 self)
One of the major computational tasks in using the traditional cutting plane approach to solve linear semi-infinite programming problems lies in finding a global optimizer of a nonlinear and nonconvex program. This paper generalizes Gustafson and Kortanek's scheme to relax this requirement. In each iteration, the proposed method chooses a point at which the infinite constraints are violated to a degree, rather than one at which the violation is maximized. A convergence proof of the proposed scheme is provided. Some computational results are included. An explicit algorithm which allows unnecessary constraints to be dropped in each iteration is also introduced to reduce the size of the computed programs. Key Words: Linear semi-infinite programming, cutting plane method.

1 Introduction. Consider the following linear semi-infinite programming problem (LSIP):

    min  Σ_{j=1}^n c_j x_j
    s.t. Σ_{j=1}^n x_j f_j(t) ≥ g(t),  for all t ∈ T    (1)
         x_j ≥ 0,  j = 1, ..., n                        (2)

where T is a com...
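A toy instance of (LSIP) shows the cutting-plane loop in action (this sketch uses the classical most-violated-point rule over a grid; the relaxed scheme of the paper only requires a sufficiently violated point, which is exactly the requirement being weakened):

```python
import numpy as np
from scipy.optimize import linprog

# LSIP instance: minimize x1 + x2/2 subject to
#   x1 + x2*t >= t^2  for all t in [0, 1],  x >= 0.
# Cutting plane: solve an LP over a finite constraint set, find a
# violated t, add it as a cut, repeat. Grid search over T is an
# assumption made for illustration.
c = np.array([1.0, 0.5])
f = lambda t: np.array([1.0, t])     # constraint coefficients f_j(t)
g = lambda t: t ** 2                 # right-hand side g(t)
T = np.linspace(0.0, 1.0, 201)

ts = [0.5]                           # initial finite constraint set
for _ in range(20):
    A = -np.array([f(t) for t in ts])        # -f(t).x <= -g(t)
    b = -np.array([g(t) for t in ts])
    res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None)] * 2)
    x = res.x
    viol = g(T) - (x[0] + x[1] * T)          # violation over the grid
    if viol.max() <= 1e-8:                   # all constraints satisfied
        break
    ts.append(T[np.argmax(viol)])            # add the new cut

print(np.round(x, 6), round(res.fun, 6))     # expect x ~ [0, 1], value 0.5
```

Since x1 + x2*t - t^2 is concave in t, the affine function majorizes t^2 on [0, 1] exactly when the endpoint constraints hold, so the loop terminates after adding only one or two cuts on this instance.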
Testing of Monotonicity in Regression Models
 Mimeograph Series, Operations Research, Statistics
, 1990
Abstract

Cited by 3 (1 self)
In data analysis concerning the investigation of the relationship between a dependent variable Y and an independent variable X, one may wish to determine whether this relationship is monotone or not. This determination may be of interest in itself, or it may form part of a (nonparametric) regression analysis which relies on monotonicity of the true regression function. In this paper we generalize the test of positive correlation by proposing a test statistic for monotonicity based on fitting a parametric model, say a higher-order polynomial, to the data with and without the monotonicity constraint. The statistic has an asymptotic chi-bar-squared distribution under the null hypothesis that the true regression function is on the boundary of the space of monotone functions. Based on the theoretical results, an algorithm is developed for testing the significance of the statistic, and it is shown to perform well in several null and non-null settings. Extensions to fitting regression splines ...
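The ingredients of such a statistic can be sketched numerically (this assumes a quadratic model and a grid-based monotonicity constraint, chosen for illustration; the paper's chi-bar-squared calibration of the statistic is not reproduced here):

```python
import numpy as np
from scipy.optimize import minimize, LinearConstraint

# Fit a quadratic to (X, Y) with and without the constraint that it is
# nondecreasing on [0, 1], and compare residual sums of squares.
rng = np.random.default_rng(1)
x = np.linspace(0, 1, 50)
y = x + 0.1 * rng.normal(size=x.size)        # truly monotone toy data

B = np.vander(x, 3, increasing=True)          # basis: 1, x, x^2
beta_free, *_ = np.linalg.lstsq(B, y, rcond=None)
rss_free = np.sum((y - B @ beta_free) ** 2)   # unconstrained fit

# Nondecreasing on [0,1] iff derivative b1 + 2*b2*t >= 0 on a grid.
tg = np.linspace(0, 1, 21)
D = np.column_stack([np.zeros_like(tg), np.ones_like(tg), 2 * tg])
con = LinearConstraint(D, lb=0.0)
res = minimize(lambda b: np.sum((y - B @ b) ** 2), beta_free,
               constraints=[con])
rss_mono = res.fun                            # constrained fit

stat = rss_mono - rss_free                    # >= 0; large values reject
print(round(stat, 6))
```

For monotone data the constraint is inactive and the statistic is near zero; under a non-monotone truth the constrained fit degrades and the statistic grows, which is the behavior the asymptotic chi-bar-squared null distribution calibrates.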
An automated combination of sequence motif kernels for predicting protein subcellular localization
, 2006
Abstract

Cited by 2 (0 self)
Protein subcellular localization is a crucial ingredient to many important inferences about cellular processes, including prediction of protein function and protein interactions. While many predictive computational tools have been proposed, they tend to have complicated architectures and require many design decisions from the developer. We propose an elegant and fully automated approach to building a prediction system for protein subcellular localization. We propose a new class of protein sequence kernels which considers all motifs including motifs with gaps. This class of kernels allows the inclusion of pairwise amino acid distances into their computation. We further propose a multiclass support vector machine method which directly solves protein subcellular localization without resorting to the common approach of splitting the problem into several binary classification problems. To automatically search over families of possible amino acid motifs, we generalize our method to optimize over multiple kernels at the same time. We compare our automated approach to four other predictors on three different datasets.