Results 1–10 of 10
Regularization Paths with Guarantees for Convex Semidefinite Optimization
Abstract
Cited by 6 (4 self)
We devise a simple algorithm for computing an approximate solution path for parameterized semidefinite convex optimization problems that is guaranteed to be ε-close to the exact solution path. As a consequence, we can compute the entire regularization path for many regularized matrix completion and factorization approaches, as well as nuclear norm or weighted nuclear norm regularized convex optimization problems. This also includes robust PCA and variants of sparse PCA. On the theoretical side, we show that the approximate solution path has low complexity. This implies that the whole solution path can be computed efficiently. Our experiments demonstrate the practicality of the approach for large matrix completion problems.
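The scheme this abstract describes — compute a good solution at one parameter value and keep it for as long as it remains ε-close, then re-solve — can be illustrated generically. Below is a minimal sketch, not the paper's algorithm: `solve` and `gap` are hypothetical stand-ins for its approximate semidefinite solver and optimality-gap oracle, demonstrated on a toy scalar problem f_t(x) = (x − t)² whose exact path is x*(t) = t.

```python
def approx_path(solve, gap, t_lo, t_hi, eps, step=1e-3):
    """Piecewise-constant eps-approximate solution path.

    solve(t)  -> an approximate minimizer at parameter value t
    gap(x, t) -> optimality gap of candidate x at parameter t
    Returns a list of (t_start, x) segments; each x remains
    eps-approximate until the next segment begins.
    """
    path = []
    t = t_lo
    while t <= t_hi:
        x = solve(t)              # fresh solution at the segment start
        path.append((t, x))
        t += step
        while t <= t_hi and gap(x, t) <= eps:
            t += step             # keep x while it is still eps-close
    return path

# Toy instance: f_t(x) = (x - t)^2, exact minimizer x*(t) = t.
# Each solution stays valid while |x - t| <= sqrt(eps) = 0.1,
# so the path over [0, 1] consists of roughly ten segments.
segments = approx_path(solve=lambda t: t,
                       gap=lambda x, t: (x - t) ** 2,
                       t_lo=0.0, t_hi=1.0, eps=0.01)
```

The number of segments here is governed by how fast the gap grows with t; the papers above prove bounds of this kind (low path complexity) for their specific problem classes.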
Optimizing over the growing spectrahedron
ESA 2012: 20th Annual European Symposium on Algorithms, 2012
Abstract
Cited by 4 (1 self)
We devise a framework for computing an approximate solution path for an important class of parameterized semidefinite problems that is guaranteed to be ε-close to the exact solution path. The problem of computing the entire regularization path for matrix factorization problems such as maximum-margin matrix factorization fits into this framework, as well as many other nuclear norm regularized convex optimization problems from machine learning. We show that the combinatorial complexity of the approximate path is independent of the size of the matrix. Furthermore, the whole solution path can be computed in near-linear time in the size of the input matrix. The framework employs an approximate semidefinite program solver for a fixed parameter value. Here we use an algorithm that was recently introduced by Hazan. We present a refined analysis of Hazan’s algorithm that results in improved running time bounds for a single solution as well as for the whole solution path as a function of the approximation guarantee.
Safe sample screening for Support Vector Machine. arXiv:1401.6740, 2014
Outlier Path: A Homotopy Algorithm for Robust SVM
Abstract
Cited by 1 (1 self)
In recent applications with massive but less reliable data (e.g., labels obtained by a semi-supervised learning method or crowdsourcing), the non-robustness of the support vector machine (SVM) often causes considerable performance deterioration. Although improving the robustness of SVM has been investigated for a long time, robust SVM (RSVM) learning still poses two major challenges: obtaining a good (local) solution from a non-convex optimization problem and optimally controlling the robustness-efficiency trade-off. In this paper, we address these two issues simultaneously in an integrated way by introducing a novel homotopy approach to RSVM learning. Based on a theoretical investigation of the geometry of RSVM solutions, we show that a path of local RSVM solutions can be computed efficiently when the influence of outliers is gradually suppressed, as in simulated annealing. We experimentally demonstrate that our algorithm tends to produce better local solutions than the alternative approach based on the concave-convex procedure, with the ability of stable and efficient model selection for controlling the influence of outliers.
Suboptimal Solution Path Algorithm for Support Vector Machine
Abstract
Cited by 1 (0 self)
We consider a suboptimal solution path algorithm for the Support Vector Machine. The solution path algorithm is an effective tool for solving a sequence of parametrized optimization problems in machine learning. The solutions provided by this algorithm are very accurate and satisfy the optimality conditions more strictly than those of other SVM optimization algorithms. In many machine learning applications, however, this strict optimality is often unnecessary, and it adversely affects the computational efficiency. Our algorithm can generate the path of suboptimal solutions within an arbitrary user-specified tolerance level. It allows us to control the trade-off between the accuracy of the solution and the computational cost. Moreover, we also show that our suboptimal solutions can be interpreted as exact solutions of optimization problems perturbed from the original one. We provide some theoretical analyses of our algorithm based on this novel interpretation. The experimental results also demonstrate the effectiveness of our algorithm.
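The perturbation interpretation mentioned in this abstract can be made concrete in one dimension: for a smooth convex f, a suboptimal point x̃ is the exact minimizer of the linearly perturbed objective f(x) − f′(x̃)·x, since the perturbation cancels the gradient at x̃. The example below is purely illustrative (the function and the chosen suboptimal point are assumptions, not taken from the paper):

```python
def perturbation(f_grad, x_tilde):
    """Linear perturbation g such that x_tilde exactly minimizes
    f(x) - g * x: first-order optimality forces g = f'(x_tilde)."""
    return f_grad(x_tilde)

# Example: f(x) = x^2 with exact minimizer 0; take the
# suboptimal solution x_tilde = 0.1.
f = lambda x: x * x
f_grad = lambda x: 2 * x
x_tilde = 0.1

g = perturbation(f_grad, x_tilde)      # g = f'(0.1) = 0.2

# x_tilde is a stationary point of the perturbed objective
# h(x) = f(x) - g*x, hence its global minimizer by convexity.
perturbed_grad = f_grad(x_tilde) - g   # 0.0
```

A small optimality gap at x̃ corresponds to a small perturbation g, which is the sense in which a tolerance on suboptimality translates into a tolerance on the problem itself.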
Sören Laue
Abstract
We consider an abstract class of optimization problems that are parameterized concavely in a single parameter, and show that the solution path along the parameter can always be approximated with accuracy ε > 0 by a set of size O(1/ε). A lower bound of size Ω(1/ε) shows that the upper bound is tight up to a constant factor. We also devise an algorithm that calls a step-size oracle and computes an approximate path of size O(1/ε). Finally, we provide an implementation of the oracle for soft-margin support vector machines, and a parameterized semidefinite program for matrix completion.
An Exponential Lower Bound on the Complexity of Regularization Paths
Journal of Computational Geometry (jocg.org)
Abstract
For a variety of regularized optimization problems in machine learning, algorithms computing the entire solution path have been developed recently. Most of these problems are quadratic programs that are parameterized by a single parameter, as for example the Support Vector Machine (SVM). Solution path algorithms compute not only the solution for one particular value of the regularization parameter but the entire path of solutions, making the selection of an optimal parameter much easier. It has been assumed that these piecewise linear solution paths have only linear complexity, i.e. linearly many bends. We prove that for the support vector machine this complexity can be exponential in the number of training points in the worst case. More strongly, we construct a single instance of n input points in d dimensions for an SVM such that at least Θ(2^{n/2}) = Θ(2^d) many distinct subsets of support vectors occur as the regularization parameter changes.