Results 1 - 10 of 681
Coordinate descent
"... Minimize for x ∈ RN the composite function F min x∈RN {F (x) = f(x) +ψ(x)} • f: RN → R, convex, differentiable, not strongly convex • ψ: RN → R ∪ {+∞}, convex, separable ψ(x) = n∑ i=1 ψi(x ..."
Abstract
- Add to MetaCart
Minimize for x ∈ RN the composite function F min x∈RN {F (x) = f(x) +ψ(x)} • f: RN → R, convex, differentiable, not strongly convex • ψ: RN → R ∪ {+∞}, convex, separable ψ(x) = n∑ i=1 ψi(x
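Below is a minimal Python sketch of coordinate descent for this composite setting, assuming a concrete instance: f(x) = 0.5·||Ax - b||^2 for the smooth part and ψ_i(x_i) = λ|x_i| for the separable part (both are illustrative choices, not taken from the entry); each coordinate is updated with a proximal gradient step using its own coordinate-wise Lipschitz constant.

import numpy as np

def prox_abs(v, t):
    """Proximal operator of t * |.| (soft-thresholding)."""
    return np.sign(v) * max(abs(v) - t, 0.0)

def coordinate_descent(A, b, lam, n_epochs=100):
    n_samples, n_features = A.shape
    x = np.zeros(n_features)
    residual = A @ x - b                     # kept up to date incrementally
    L = (A ** 2).sum(axis=0)                 # per-coordinate Lipschitz constants of f
    for _ in range(n_epochs):
        for i in range(n_features):
            if L[i] == 0.0:
                continue
            grad_i = A[:, i] @ residual      # i-th partial derivative of f
            x_new = prox_abs(x[i] - grad_i / L[i], lam / L[i])
            if x_new != x[i]:
                residual += (x_new - x[i]) * A[:, i]
                x[i] = x_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 20))
    b = rng.standard_normal(50)
    x = coordinate_descent(A, b, lam=0.1)
    print(0.5 * np.sum((A @ x - b) ** 2) + 0.1 * np.sum(np.abs(x)))   # final objective value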
Regularization paths for generalized linear models via coordinate descent
, 2009
"... We develop fast algorithms for estimation of generalized linear models with convex penalties. The models include linear regression, twoclass logistic regression, and multinomial regression problems while the penalties include ℓ1 (the lasso), ℓ2 (ridge regression) and mixtures of the two (the elastic ..."
Abstract
-
Cited by 724 (15 self)
- Add to MetaCart
elastic net). The algorithms use cyclical coordinate descent, computed along a regularization path. The methods can handle large problems and can also deal efficiently with sparse features. In comparative timings we find that the new algorithms are considerably faster than competing methods.
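The snippet below sketches the regularization-path idea in Python: cyclical coordinate descent with soft-thresholding for the plain lasso, warm-started along a decreasing grid of lambda values. The quadratic loss scaling, the grid construction, and the fixed sweep count are assumptions of this simplified version, not details of the glmnet implementation.

import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, x0, n_epochs=50):
    """Cyclical coordinate descent for (1/(2n))*||y - Xx||^2 + lam*||x||_1."""
    n, p = X.shape
    x = x0.copy()
    col_sq = (X ** 2).sum(axis=0) / n
    r = y - X @ x                                   # residual, updated in place
    for _ in range(n_epochs):
        for j in range(p):
            if col_sq[j] == 0.0:
                continue
            rho = X[:, j] @ r / n + col_sq[j] * x[j]    # partial-residual correlation
            x_new = soft_threshold(rho, lam) / col_sq[j]
            if x_new != x[j]:
                r -= (x_new - x[j]) * X[:, j]
                x[j] = x_new
    return x

def lasso_path(X, y, n_lambdas=20):
    """Solve the lasso on a log-spaced grid of lambdas, warm-starting each solve."""
    n, p = X.shape
    lam_max = np.max(np.abs(X.T @ y)) / n           # smallest lambda with an all-zero solution
    lambdas = np.geomspace(lam_max, lam_max * 1e-3, n_lambdas)
    x = np.zeros(p)
    path = []
    for lam in lambdas:
        x = lasso_cd(X, y, lam, x)                  # warm start from the previous solution
        path.append(x.copy())
    return lambdas, np.array(path)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.standard_normal((100, 10))
    y = X[:, 0] - 2.0 * X[:, 3] + 0.1 * rng.standard_normal(100)
    lambdas, path = lasso_path(X, y)
    print(path[-1].round(2))                        # roughly [1, 0, 0, -2, 0, ...]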
Coordinate Descent Algorithms
, 2014
"... Coordinate descent algorithms solve optimization problems by successively performing approximate minimization along coordinate directions or coordinate hyperplanes. They have been used in applications for many years, and their popularity continues to grow because of their usefulness in data analysi ..."
Abstract
-
Cited by 2 (0 self)
- Add to MetaCart
Coordinate descent algorithms solve optimization problems by successively performing approximate minimization along coordinate directions or coordinate hyperplanes. They have been used in applications for many years, and their popularity continues to grow because of their usefulness in data
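As a concrete illustration of "minimization along coordinate directions", here is a short Python sketch on an assumed convex quadratic f(x) = 0.5·x'Qx - c'x, for which the one-dimensional minimization along each coordinate has a closed form (a Gauss-Seidel-style update); the quadratic and the fixed sweep count are illustrative choices.

import numpy as np

def cyclic_coordinate_descent(Q, c, n_epochs=200):
    n = len(c)
    x = np.zeros(n)
    for _ in range(n_epochs):
        for i in range(n):
            # Exact minimizer of f along coordinate i with all other coordinates fixed.
            x[i] = (c[i] - Q[i] @ x + Q[i, i] * x[i]) / Q[i, i]
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    M = rng.standard_normal((5, 5))
    Q = M @ M.T + 5 * np.eye(5)                 # symmetric positive definite
    c = rng.standard_normal(5)
    x = cyclic_coordinate_descent(Q, c)
    print(np.allclose(Q @ x, c, atol=1e-6))     # the minimizer solves Qx = c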
ROBUST BLOCK COORDINATE DESCENT
, 2014
"... Abstract. In this paper we present a novel randomized block coordinate descent method for the minimization of a convex composite objective function. The method uses (approximate) partial second-order (curvature) information, so that the algorithm performance is more robust when applied to highly non ..."
Abstract
- Add to MetaCart
Abstract. In this paper we present a novel randomized block coordinate descent method for the minimization of a convex composite objective function. The method uses (approximate) partial second-order (curvature) information, so that the algorithm performance is more robust when applied to highly
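The sketch below illustrates the general flavor described here: randomized block coordinate descent where each block update uses the corresponding block of the Hessian (curvature information) rather than a plain gradient step. The least-squares objective, the block size, and the small damping term are assumptions of this illustration, not the paper's method.

import numpy as np

def block_newton_cd(A, b, block_size=5, n_iters=3000, mu=1e-8, seed=0):
    """Minimize f(x) = 0.5*||Ax - b||^2 by Newton-type steps on random coordinate blocks."""
    rng = np.random.default_rng(seed)
    n_samples, n_features = A.shape
    x = np.zeros(n_features)
    r = A @ x - b                                   # residual, updated incrementally
    for _ in range(n_iters):
        S = rng.choice(n_features, size=block_size, replace=False)
        A_S = A[:, S]
        g_S = A_S.T @ r                             # block gradient
        H_S = A_S.T @ A_S + mu * np.eye(len(S))     # block Hessian, lightly damped
        d_S = np.linalg.solve(H_S, g_S)             # Newton direction on the block
        x[S] -= d_S
        r -= A_S @ d_S
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    A = rng.standard_normal((80, 30))
    b = rng.standard_normal(80)
    x = block_newton_cd(A, b)
    x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(np.linalg.norm(x - x_ls))                 # should be close to zero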
Adaptive Coordinate Descent
- GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE (GECCO 2011)
, 2011
"... Independence from the coordinate system is one source of efficiency and robustness for the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). The recently proposed AdaptiveEncoding(AE)proceduregeneralizes CMA-ESadaptive mechanism, and can be used together with any optimization algorithm. Adap ..."
Abstract
- Add to MetaCart
of the simplest of all, that uses a dichotomy procedure on each coordinate in turn. The resulting algorithm, termed Adaptive Coordinate Descent (ACiD), is analyzed on the Sphere function, and experimentally validated on BBOB testbench where it is shown to outperform the standard (1 +1)-CMA-ES, and is found
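Below is a Python sketch of the "dichotomy procedure on each coordinate in turn" idea, without the Adaptive Encoding rotation that ACiD adds. The bracketing interval and the interval-shrinking search are simplifying assumptions, and the objective is assumed unimodal along each coordinate, as the Sphere function used here is.

import numpy as np

def dichotomy_1d(f, lo, hi, n_halvings=40):
    """Shrink [lo, hi] around the minimizer of a unimodal 1-D function f."""
    for _ in range(n_halvings):
        m1 = lo + (hi - lo) / 3.0
        m2 = hi - (hi - lo) / 3.0
        if f(m1) < f(m2):
            hi = m2
        else:
            lo = m1
    return 0.5 * (lo + hi)

def coordinate_dichotomy_descent(f, x0, lo=-5.0, hi=5.0, n_sweeps=20):
    x = np.array(x0, dtype=float)
    for _ in range(n_sweeps):
        for i in range(len(x)):
            def slice_i(t, i=i):
                y = x.copy()
                y[i] = t
                return f(y)                          # objective restricted to coordinate i
            x[i] = dichotomy_1d(slice_i, lo, hi)
    return x

if __name__ == "__main__":
    sphere = lambda z: float(np.sum((z - 1.5) ** 2))  # Sphere function, optimum at 1.5
    x = coordinate_dichotomy_descent(sphere, np.zeros(4))
    print(x.round(4))                                 # close to [1.5, 1.5, 1.5, 1.5]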
Convergence of a block coordinate descent method for nondifferentiable minimization
- J. OPTIM THEORY APPL
, 2001
"... We study the convergence properties of a (block) coordinate descent method applied to minimize a nondifferentiable (nonconvex) function f(x1,...,xN) with certain separability and regularity properties. Assuming that f is continuous on a compact level set, the subsequence convergence of the iterate ..."
Abstract
-
Cited by 298 (3 self)
- Add to MetaCart
We study the convergence properties of a (block) coordinate descent method applied to minimize a nondifferentiable (nonconvex) function f(x1,...,xN) with certain separability and regularity properties. Assuming that f is continuous on a compact level set, the subsequence convergence
Accelerated, parallel and proximal coordinate descent
, 2014
"... We propose a new stochastic coordinate descent method for minimizing the sum of convex functions each of which depends on a small number of coordinates only. Our method (APPROX) is simultaneously Accelerated, Parallel and PROXimal; this is the first time such a method is proposed. In the special cas ..."
Abstract
-
Cited by 31 (6 self)
- Add to MetaCart
We propose a new stochastic coordinate descent method for minimizing the sum of convex functions each of which depends on a small number of coordinates only. Our method (APPROX) is simultaneously Accelerated, Parallel and PROXimal; this is the first time such a method is proposed. In the special
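The following Python sketch shows a simplified parallel (mini-batch) randomized proximal coordinate descent step in the spirit of this entry, without the acceleration that APPROX adds. The least-squares-plus-ℓ1 objective and the conservative stepsize scaling (dividing each step by the batch size tau instead of using a sharper ESO-type constant) are assumptions of the illustration, not the paper's parameters.

import numpy as np

def prox_l1(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def parallel_prox_cd(A, b, lam, tau=4, n_iters=2000, seed=0):
    """Minimize 0.5*||Ax-b||^2 + lam*||x||_1, updating tau random coordinates per iteration."""
    rng = np.random.default_rng(seed)
    n_samples, n_features = A.shape
    x = np.zeros(n_features)
    r = A @ x - b
    L = (A ** 2).sum(axis=0)                   # coordinate-wise Lipschitz constants of f
    for _ in range(n_iters):
        S = rng.choice(n_features, size=tau, replace=False)
        step = 1.0 / (tau * L[S])              # conservative scaling, safe for any coupling in A
        grad_S = A[:, S].T @ r                 # partial gradients, all from the same point
        x_new_S = prox_l1(x[S] - step * grad_S, step * lam)
        r += A[:, S] @ (x_new_S - x[S])        # all tau coordinates updated "in parallel"
        x[S] = x_new_S
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    A = rng.standard_normal((60, 16))
    b = rng.standard_normal(60)
    x = parallel_prox_cd(A, b, lam=0.1)
    print(0.5 * np.sum((A @ x - b) ** 2) + 0.1 * np.sum(np.abs(x)))   # final objective value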
Adaptive coordinate descent
- In Proceedings of the 13th annual conference on Genetic and evolutionary computation. ACM
, 2011
"... HAL is a multi-disciplinary open access archive for the deposit and dissemination of sci-entific research documents, whether they are pub-lished or not. The documents may come from teaching and research institutions in France or abroad, or from public or private research centers. L’archive ouverte p ..."
Abstract
-
Cited by 5 (1 self)
- Add to MetaCart
HAL is a multi-disciplinary open access archive for the deposit and dissemination of sci-entific research documents, whether they are pub-lished or not. The documents may come from teaching and research institutions in France or abroad, or from public or private research centers. L’archive ouverte pluridisciplinaire HAL, est destinée au dépôt et a ̀ la diffusion de documents scientifiques de niveau recherche, publiés ou non, émanant des établissements d’enseignement et de recherche français ou étrangers, des laboratoires publics ou privés.
Inexact Coordinate Descent: Complexity and Preconditioning
, 2013
"... In this paper we consider the problem of minimizing a convex function using a randomized block coordinate descent method. One of the key steps at each iteration of the algorithm is determining the update to a block of variables. Existing algorithms assume that in order to compute the update, a parti ..."
Abstract
-
Cited by 16 (4 self)
- Add to MetaCart
In this paper we consider the problem of minimizing a convex function using a randomized block coordinate descent method. One of the key steps at each iteration of the algorithm is determining the update to a block of variables. Existing algorithms assume that in order to compute the update, a
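Below is a Python sketch of the inexactness idea: randomized block coordinate descent in which each block update is obtained by solving the block subproblem only approximately, here with a few conjugate-gradient iterations. The least-squares objective and the fixed inner iteration count are illustrative assumptions, not the paper's inexactness criterion.

import numpy as np

def cg(H, g, n_steps):
    """Approximately solve H d = g with a few conjugate-gradient steps (starting from zero)."""
    d = np.zeros_like(g)
    r = g.copy()
    p = r.copy()
    for _ in range(n_steps):
        Hp = H @ p
        alpha = (r @ r) / (p @ Hp)
        d += alpha * p
        r_new = r - alpha * Hp
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
    return d

def inexact_block_cd(A, b, block_size=5, inner_steps=3, n_iters=1000, seed=0):
    rng = np.random.default_rng(seed)
    n_samples, n_features = A.shape
    x = np.zeros(n_features)
    res = A @ x - b
    for _ in range(n_iters):
        S = rng.choice(n_features, size=block_size, replace=False)
        A_S = A[:, S]
        # Block subproblem: min_d 0.5*||A_S d + res||^2, i.e. (A_S' A_S) d = -A_S' res,
        # solved only approximately by a handful of CG steps.
        d = cg(A_S.T @ A_S, -A_S.T @ res, inner_steps)
        x[S] += d
        res += A_S @ d
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    A = rng.standard_normal((70, 25))
    b = rng.standard_normal(70)
    x = inexact_block_cd(A, b)
    x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(np.linalg.norm(x - x_ls))    # small: inexact block updates still drive convergence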