Results 1–10 of 13
Constructing Boosting Algorithms from SVMs: An Application to One-class Classification
, 2002
Abstract

Cited by 19 (7 self)
Boosting algorithms like AdaBoost and Arc-GV are iterative strategies to minimize a constrained objective function, equivalent to barrier algorithms.
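A hedged sketch of the stated equivalence, in standard notation (not quoted from the paper): boosting can be viewed as maximizing the smallest margin over convex combinations of base hypotheses, while the exponential loss minimized by AdaBoost behaves like a barrier/penalty term for those margin constraints.

```latex
% Margin maximization over convex combinations of base hypotheses h_t:
\max_{\rho,\ \alpha \ge 0,\ \|\alpha\|_1 = 1} \rho
\quad \text{s.t.} \quad
y_i \sum_{t} \alpha_t h_t(x_i) \;\ge\; \rho, \qquad i = 1, \dots, N.

% AdaBoost instead minimizes the exponential loss of the margins,
\min_{\alpha \ge 0}\ \sum_{i=1}^{N} \exp\Bigl(-\,y_i \sum_{t} \alpha_t h_t(x_i)\Bigr),
% which penalizes small or violated margins increasingly steeply,
% playing the role of a barrier/penalty for the constraints above.
```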
Sparse Regression Ensembles in Infinite and Finite Hypothesis Spaces
, 2000
Abstract

Cited by 18 (9 self)
We examine methods for constructing regression ensembles based on a linear program (LP). The ensemble regression function consists of linear combinations of base hypotheses generated by some boosting-type base learning algorithm. Unlike the classification case, for regression the set of possible hypotheses producible by the base learning algorithm may be infinite. We explicitly tackle the issue of how to define and solve ensemble regression when the hypothesis space is infinite. Our approach is based on a semi-infinite linear program that has an infinite number of constraints and a finite number of variables. We show that the regression problem is well posed for infinite hypothesis spaces in both the primal and dual spaces. Most importantly, we prove that there exists an optimal solution to the infinite-hypothesis-space problem consisting of a finite number of hypotheses. We propose two algorithms for solving the infinite and finite hypothesis problems. One uses a column-generation simplex-type algorithm and the other adopts an exponential barrier approach. Furthermore, we give sufficient conditions for the base learning algorithm and the hypothesis set to be used for infinite regression ensembles. Computational results show that these methods are extremely promising.
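A hedged sketch of the semi-infinite LP structure described here (generic notation, assumed rather than taken from the paper): with one variable per training example and one constraint per base hypothesis, the problem has finitely many variables but, when the hypothesis set H is infinite, infinitely many constraints.

```latex
% Semi-infinite LP: finitely many variables u_1,...,u_N (one per example),
% but one constraint per hypothesis h in the -- possibly infinite -- set H.
\max_{u}\ \sum_{i=1}^{N} c_i\, u_i
\quad \text{s.t.} \quad
\sum_{i=1}^{N} u_i\, h(x_i) \;\le\; 1
\quad \text{for all } h \in H,
\qquad 0 \le u_i \le C.

% Column generation solves a restricted problem over a finite H' \subset H
% and repeatedly asks the base learner for the hypothesis whose constraint
% is most violated; the exponential barrier approach instead folds the
% constraints into a smooth barrier term.
```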
A spectral quadratic-SDP method with applications to fixed-order H2 and H∞ synthesis. Asian Control Conference
, 2004
Abstract

Cited by 13 (7 self)
In this paper, we discuss a spectral quadratic-SDP method for the iterative resolution of fixed-order H2 and H∞ design problems. These problems can be cast as regular SDP programs with additional nonlinear equality constraints. When the inequalities are absorbed into a Lagrangian function, the problem reduces to solving a sequence of SDPs with a quadratic objective function, for which a spectral SDP method has been developed. Along with a description of the spectral SDP method used to solve the tangent subproblems, we report a number of computational results for validation purposes.
Smoothing Method of Multipliers for Sum-Max Problems
[Online]. Available: http://iew3.technion.ac.il/~mcib
, 2002
Abstract

Cited by 7 (6 self)
We study a nonsmooth unconstrained optimization problem which includes a sum of pairwise maxima of smooth functions. Minimum l1-norm approximation is a particular case of this problem. Combining the idea of Lagrange multipliers with a smooth approximation of the max-type function, we obtain a new kind of non-quadratic augmented Lagrangian. Our approach does not require artificial variables, and preserves the sparse structure of the Hessian in many practical cases. We present the corresponding method of multipliers, and its convergence analysis for a dual counterpart, resulting in a proximal point maximization algorithm. The practical efficiency of the algorithm is supported by computational results for large-scale problems arising in structural optimization.
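The smoothing idea can be illustrated with the standard log-sum-exp approximation of a pairwise max. This is a generic sketch, not the paper's exact smoothing function; the parameter `mu` below is an illustrative smoothing level.

```python
import math

def smooth_max(a: float, b: float, mu: float) -> float:
    """Smooth approximation of max(a, b) via log-sum-exp.

    Satisfies max(a, b) <= smooth_max(a, b, mu) <= max(a, b) + mu*log(2),
    so the approximation error vanishes as mu -> 0, while the function
    stays differentiable everywhere (unlike the true max).
    """
    m = max(a, b)  # subtract the max for numerical stability
    return m + mu * math.log(math.exp((a - m) / mu) + math.exp((b - m) / mu))

# The gap to the true max is bounded by mu*log(2):
for mu in (1.0, 0.1, 0.01):
    print(f"mu={mu}: error={smooth_max(3.0, 5.0, mu) - 5.0:.6f}")
```

Replacing each pairwise max by such a smooth surrogate is what makes Newton-type methods applicable to the otherwise nonsmooth objective.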
SVM and Boosting: One Class
Abstract

Cited by 6 (1 self)
We show, via an equivalence of mathematical programs, that a Support Vector (SV) algorithm can be translated into an equivalent boosting-like algorithm and vice versa. We exemplify this translation procedure for a new algorithm, one-class Leveraging, starting from the one-class Support Vector Machine (1-SVM). This is a first step towards unsupervised learning in a boosting framework.
Pattern Recognition via Support Vector Machine with Computationally Efficient Nonlinear Transform
, 1998
Abstract

Cited by 2 (0 self)
We introduce a new "sparse" nonlinear transformation of the SVM feature space, which permits the use of efficient optimization techniques for finding the separating hyperplane. The corresponding quadratic program can be solved in the primal formulation, so that the complexity of the solution grows only linearly with the number of training examples. The Penalty/Barrier Multiplier method (a kind of non-quadratic augmented Lagrangian) was successfully applied to the above problem. It provides a very high accuracy of 10–12 digits under a moderate value of the penalty parameter. 10–20 Newton steps are normally sufficient to solve a problem.
1 Problem Formulation and the Main Approach
In this paper we deal with the so-called Support Vector Machine [7] for pattern recognition via separation by an optimal hyperplane in an extended feature space. Consider a pattern recognition problem in the following setting. Let f : R^n → R separate two classes, represented by sets A, B ⊂ R^n: f(x) = 1, x ∈ A ...
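To make the primal idea concrete, here is a toy sketch that trains a linear classifier in the primal with a smooth squared-hinge penalty and plain gradient descent. This is a generic illustration, not the Penalty/Barrier Multiplier method described above; the constants, data, and function names are all illustrative.

```python
# Toy primal training of a linear separator, minimizing
#   0.5*||w||^2 + C * sum_i max(0, 1 - y_i*(w.x_i + b))^2.
# Generic sketch only -- not the Penalty/Barrier Multiplier method;
# C, the learning rate, and the data below are illustrative choices.

def train_linear_svm(xs, ys, C=1.0, lr=0.01, steps=5000):
    dim = len(xs[0])
    w = [0.0] * dim
    b = 0.0
    for _ in range(steps):
        gw = list(w)  # gradient of the 0.5*||w||^2 regularizer is w itself
        gb = 0.0
        for x, y in zip(xs, ys):
            margin = y * (sum(wj * xj for wj, xj in zip(w, x)) + b)
            if margin < 1.0:  # squared hinge is active on this example
                coef = -2.0 * C * (1.0 - margin) * y
                gw = [gj + coef * xj for gj, xj in zip(gw, x)]
                gb += coef
        w = [wj - lr * gj for wj, gj in zip(w, gw)]
        b -= lr * gb
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Tiny linearly separable toy set: class +1 at x1 > 0, class -1 at x1 < 0.
xs = [(2.0, 1.0), (1.5, -0.5), (-2.0, 0.5), (-1.0, -1.0)]
ys = [1, 1, -1, -1]
w, b = train_linear_svm(xs, ys)
print([predict(w, b, x) for x in xs])
```

Because the squared hinge is differentiable, smooth unconstrained methods (gradient descent here; Newton steps in the paper's setting) apply directly in the primal, and each iteration costs time linear in the number of training examples.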
Relative Newton and Smoothing Multiplier Optimization Methods for Blind Source Separation
Abstract

Cited by 1 (0 self)
Abstract. We study a relative optimization framework for quasi-maximum likelihood blind source separation, and the relative Newton method as its particular instance. The structure of the Hessian allows its fast approximate inversion. In the second part we present the Smoothing Method of Multipliers (SMOM) for minimization of a sum of pairwise maxima of smooth functions, in particular a sum of absolute-value terms. Incorporating Lagrange multipliers into a smooth approximation of the max-type function, we obtain an extended notion of a non-quadratic augmented Lagrangian. Our approach does not require artificial variables, and preserves the sparse structure of the Hessian. Convergence of the method is further accelerated by the Frozen Hessian strategy. We demonstrate the efficiency of this approach on an example of blind separation of sparse sources. The nonlinearity in this case is based on the absolute value function, which provides super-efficient source separation. In this chapter we study quasi-maximum likelihood blind source separation (quasi-ML BSS) [1,2] in batch mode, without orthogonality constraints. ...