Results 1 - 10 of 45

On the projected subgradient method for nonsmooth convex optimization in a Hilbert space

by Ya. I. Alber, A. N. Iusem, M. V. Solodov - Mathematical Programming , 1998
"... We consider the method for constrained convex optimization in a Hilbert space, consisting of a step in the direction opposite to an εk-subgradient of the objective at a current iterate, followed by an orthogonal projection onto the feasible set. The normalized stepsizes εk are exogenously given, sat ..."
Abstract - Cited by 33 (4 self)
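The method described in this abstract has a very compact structure; a minimal sketch, on a toy one-dimensional instance (the function, feasible set, and stepsize schedule below are illustrative choices, not taken from the paper), might look like:

```python
import numpy as np

def projected_subgradient(subgrad, project, x0, stepsizes):
    """Minimize a nonsmooth convex function over a convex set C:
    each iteration steps opposite a subgradient of the objective,
    then orthogonally projects back onto C."""
    x = np.asarray(x0, dtype=float)
    for t in stepsizes:
        x = project(x - t * subgrad(x))
    return x

# Toy instance: minimize f(x) = |x - 3| over C = [0, 1]; the minimizer is x = 1.
subgrad = lambda x: np.sign(x - 3.0)         # a subgradient of |x - 3|
project = lambda x: np.clip(x, 0.0, 1.0)     # orthogonal projection onto [0, 1]
steps = [1.0 / (k + 1) for k in range(100)]  # diminishing, non-summable stepsizes
x_final = projected_subgradient(subgrad, project, 0.0, steps)
```

The diminishing stepsize schedule mirrors the usual condition for subgradient methods (steps tend to zero but their sum diverges); the papers in this list analyze exactly which such schedules yield convergence in Hilbert spaces.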

Globally Convergent Parallel MAP LP Relaxation Solver using the Frank-Wolfe Algorithm

by Alexander G. Schwing, Marc Pollefeys, Raquel Urtasun
"... Estimating the most likely configuration (MAP) is one of the fundamental tasks in probabilistic models. While MAP inference is typically intractable for many real-world applications, linear programming relaxations have been proven very effective. Dual block-coordinate descent methods are among th ..."
Abstract - Cited by 1 (1 self)
have been proposed. In this paper we suggest to decouple the quadratic program based on the Frank-Wolfe approach. This allows us to obtain an efficient and easy to parallelize algorithm while retaining the global convergence properties. Our method proves superior when compared to existing algorithms

Incremental Constraint Projection-Proximal Methods for Nonsmooth Convex Optimization

by Mengdi Wang, Dimitri P. Bertsekas
"... We consider convex optimization problems with structures that are suitable for stochastic sampling. In particular, we focus on problems where the objective function is an expected value or is a sum of a large number of component functions, and the constraint set is the intersection of a large number of simpler sets. We propose an algorithmic framework for projection-proximal methods using random subgradient/function updates and random constraint updates, which contain as special cases several known algorithms as well as new algorithms. To analyze the convergence of these algorithms in a ..."
Abstract - Cited by 3 (2 self)

An Infeasible-Point Subgradient Method Using Adaptive Approximate Projections

by Dirk A. Lorenz, Marc E. Pfetsch, Andreas M. Tillmann
"... We propose a new subgradient method for the minimization of nonsmooth convex functions over a convex set. To speed up computations we use adaptive approximate projections, only requiring to move within a certain distance of the exact projections (which decreases in the course of the algorit ..."
Abstract - Cited by 2 (1 self)

A projected subgradient method for scalable multi-task learning

by Ariadna Quattoni, Xavier Carreras, Trevor Darrell , 2008
"... Recent approaches to multi-task learning have investigated the use of a variety of matrix norm regularization schemes for promoting feature sharing across tasks. In essence, these approaches aim at extending the l1 framework for sparse single task approximation to the multi-task setti ..."
Abstract - Cited by 2 (0 self)
on setting jointly regularized loss minimization as a convex constrained optimization problem for which we develop an efficient projected gradient algorithm. The main contribution of this paper is the derivation of a gradient projection method with ℓ1,∞ constraints that can be performed efficiently

An extended level method for efficient multiple kernel learning

by Zenglin Xu, Rong Jin, Irwin King, Michael R. Lyu - Advances in Neural Information Processing Systems 21 , 2009
"... We consider the problem of multiple kernel learning (MKL), which can be formulated as a convex-concave problem. In the past, two efficient methods, i.e., Semi-Infinite Linear Programming (SILP) and Subgradient Descent (SD), have been proposed for large-scale multiple kernel learning. Despite their ..."
Abstract - Cited by 60 (10 self)
for optimizing non-smooth objective functions, to convex-concave optimization, and apply it to multiple kernel learning. The extended level method overcomes the drawbacks of SILP and SD by exploiting all the gradients computed in past iterations and by regularizing the solution via a projection to a level set

Synthesis of Cutting and Separating Planes in a Nonsmooth Optimization Method

by E. A. Vorontsova, E. A. Nurminski
"... A solution algorithm for nondifferentiable optimization problems is proposed, from a family of separating plane methods with additional cuts generated by the solution of an auxiliary cutting plane problem. The convergence of this algorithm is proved, and the results of c ..."
Abstract

A strongly convergent method for nonsmooth convex minimization in Hilbert spaces

by J. Y. Bello Cruz, A. N. Iusem - Numerical Functional Analysis and Optimization 32 , 2011
"... In this paper we propose a strongly convergent variant on the projected subgradient method for constrained convex minimization problems in Hilbert spaces. The advantage of the proposed method is that it converges strongly when the problem has solutions, without additional assumptions. The ..."
Abstract - Cited by 5 (0 self)

A linearly convergent conditional gradient algorithm with applications to online and stochastic optimization

by Dan Garber, Elad Hazan , 2013
"... Linear optimization is often algorithmically simpler than non-linear convex optimization. Linear optimization over matroid polytopes, matching polytopes and path polytopes are examples of problems for which we have simple and efficient combinatorial algorithms, but whose non-linear convex count ..."
Abstract - Cited by 11 (2 self)
question of Kalai and Vempala, and Hazan and Kale. Our online algorithms also imply conditional gradient algorithms for non-smooth and stochastic convex optimization with the same convergence rates as projected (sub)gradient methods. Key words: Frank-Wolfe algorithm; conditional gradient methods; linear
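The contrast this abstract draws — a linear minimization oracle in place of a projection — is the defining feature of conditional gradient (Frank-Wolfe) methods. A minimal sketch on a toy instance (the quadratic objective, simplex constraint, and stepsize rule below are illustrative standard choices, not taken from the paper):

```python
import numpy as np

def frank_wolfe(grad, lmo, x0, n_iter=2000):
    """Conditional gradient method: each step solves a *linear* problem
    over the feasible set via the linear minimization oracle (lmo),
    then moves toward that solution; no projection is ever needed."""
    x = np.asarray(x0, dtype=float)
    for k in range(n_iter):
        s = lmo(grad(x))             # argmin over C of <grad f(x), s>
        gamma = 2.0 / (k + 2.0)      # standard open-loop stepsize
        x = (1.0 - gamma) * x + gamma * s
    return x

# Toy instance: minimize ||x - c||^2 over the probability simplex.
# Since c lies inside the simplex, the optimum is c itself.
c = np.array([0.2, 0.5, 0.3])
grad = lambda x: 2.0 * (x - c)
lmo = lambda g: np.eye(len(g))[np.argmin(g)]  # best simplex vertex for a linear objective
x_final = frank_wolfe(grad, lmo, np.ones(3) / 3.0)
```

Over the simplex the oracle is just an argmin over coordinates, which is why such methods scale well on structured polytopes (matroid, matching, path) where linear optimization is combinatorially easy but projection is not.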

On A Nonsmooth Newton Method For Nonlinear Complementarity Problems In Function Space With Applications To Optimal Control

by Michael Ulbrich , 2000
"... Many applications in mathematical modeling and optimal control lead to problems that are posed in function spaces and contain pointwise complementarity conditions. In this paper, a projected Newton method for nonlinear complementarity problems in the infinite-dimensional function space L^p is propos ..."
Abstract - Cited by 1 (0 self)

Developed at and hosted by The College of Information Sciences and Technology

© 2007-2016 The Pennsylvania State University