Results 1–10 of 26
On the curvature of the central path of linear programming theory
Foundations of Computational Mathematics, 2003
Cited by 13 (2 self)
Abstract. We prove a linear bound on the average total curvature of the central path of linear programming theory in terms of the number of variables. 1 Introduction. In this paper we study the curvature of the central path of linear programming theory. We establish that for a linear programming problem defined on a compact polytope contained in R^n, the total curvature of the central path is less than or …
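The central path studied above is the curve of minimizers of the log-barrier problem as the barrier parameter μ tends to zero. A minimal sketch (function names and the toy problem are hypothetical, not from the paper) for a two-variable LP on the simplex, where the barrier subproblem reduces to a one-dimensional root find:

```python
def central_path_point(c1, c2, mu, tol=1e-12):
    """Point x(mu) on the central path of: min c1*x1 + c2*x2
    s.t. x1 + x2 = 1, x >= 0.
    With x2 = 1 - x1, the barrier optimality condition is
    g(t) = (c1 - c2) - mu/t + mu/(1 - t) = 0, and g is strictly
    increasing on (0, 1), so bisection finds the unique root."""
    lo, hi = tol, 1.0 - tol

    def g(t):
        return (c1 - c2) - mu / t + mu / (1.0 - t)

    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if g(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    t = 0.5 * (lo + hi)
    return t, 1.0 - t
```

For large μ the point sits near the analytic center of the simplex; as μ → 0 it approaches the optimal vertex, tracing out the path whose total curvature the paper bounds.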
Complexity of Bézout’s theorem VI: Geodesics in the condition metric
Found. Comput. Math.
Cited by 12 (3 self)
Abstract. We introduce a new complexity measure of a path of (problems, solutions) pairs in terms of the length of the path in the condition metric, which we define in the article. The measure gives an upper bound for the number of Newton steps sufficient to approximate the path discretely starting from one end and thus produce an approximate zero for the endpoint. This motivates the study of short paths, or geodesics, in the condition metric.
Proximal point methods for quasiconvex and convex functions with Bregman distances on Hadamard manifolds
J. Convex Anal., 2009
Cited by 9 (3 self)
This paper generalizes the proximal point method using Bregman distances to solve convex and quasiconvex optimization problems on noncompact Hadamard manifolds. We prove that the sequence generated by our method is well defined and converges to an optimal solution of the problem. We also obtain the same convergence properties for the classical proximal method applied to a class of quasiconvex problems. Finally, we give some examples of Bregman distances in non-Euclidean spaces.
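The Bregman/Hadamard setting above generalizes the classical Euclidean proximal point method, which iterates x_{k+1} = argmin_x f(x) + (1/(2λ))(x − x_k)². A minimal sketch of that Euclidean base case, using a hypothetical quadratic objective f(x) = (x − 3)² whose proximal step has a closed form:

```python
def proximal_point(prox, x0, iters):
    """Classical proximal point iteration: repeatedly apply the
    proximal operator x_{k+1} = prox(x_k)."""
    x = x0
    for _ in range(iters):
        x = prox(x)
    return x

# For f(x) = (x - 3)^2 and step lam, setting the derivative of
# (x - 3)^2 + (x - xk)^2 / (2*lam) to zero gives the closed form below.
lam = 1.0
prox = lambda xk: (6.0 * lam + xk) / (2.0 * lam + 1.0)
x_star = proximal_point(prox, x0=10.0, iters=50)
```

Each step contracts the error toward the minimizer x = 3; the paper's contribution is carrying this convergence guarantee over to Bregman distances on Hadamard manifolds and to quasiconvex objectives.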
A full-Newton step O(n) infeasible interior-point algorithm for linear optimization
2005
Cited by 7 (4 self)
We present a primal-dual infeasible interior-point algorithm. As usual, the algorithm decreases the duality gap and the feasibility residuals at the same rate. Assuming that an optimal solution exists, it is shown that at most O(n) iterations suffice to reduce the duality gap and the residuals by the factor 1/e. This implies an O(n log(n/ε)) iteration bound for getting an ε-solution of the problem at hand, which coincides with the best known bound for infeasible interior-point algorithms. The algorithm constructs strictly feasible iterates for a sequence of perturbations of the given problem and its dual problem. A special feature of the algorithm is that it uses only full-Newton steps. Two types of full-Newton steps are used: so-called feasibility steps and usual (centering) steps. Starting at strictly feasible iterates of a perturbed pair, (very) close to its central path, feasibility steps serve to generate strictly feasible iterates for the next perturbed pair. By accomplishing a few centering steps for the new perturbed pair we obtain strictly feasible iterates close enough to the central path of the new perturbed pair. The algorithm finds an optimal solution or detects infeasibility or unboundedness of the given problem.
Consensus in noncommutative spaces
Cited by 2 (1 self)
Abstract — Convergence analysis of consensus algorithms is revisited in the light of the Hilbert distance. The Lyapunov function used in the early analysis by Tsitsiklis is shown to be the Hilbert distance to consensus in log coordinates. Birkhoff's theorem, which proves contraction of the Hilbert metric for any positive homogeneous monotone map, provides an early yet general convergence result for consensus algorithms. Because Birkhoff's theorem holds in arbitrary cones, we extend consensus algorithms to the cone of positive definite matrices. The proposed generalization finds applications in the convergence analysis of quantum stochastic maps, which are a generalization of stochastic maps to noncommutative probability spaces.
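The Birkhoff-style contraction invoked above is easiest to see in the commutative (vector) case, where the positive monotone map is multiplication by a positive stochastic matrix. A minimal sketch with a hypothetical 3-agent doubly stochastic weight matrix, showing the spread of agent values contracting to consensus while the average is preserved:

```python
def consensus_step(W, x):
    """One consensus iteration x <- W x with a row-stochastic matrix W:
    each agent moves to a convex combination of all agents' values."""
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

# Hypothetical example: W is positive and doubly stochastic, so the
# average of x is invariant while max(x) - min(x) shrinks geometrically.
W = [[0.50, 0.25, 0.25],
     [0.25, 0.50, 0.25],
     [0.25, 0.25, 0.50]]
x = [1.0, 2.0, 4.0]
for _ in range(60):
    x = consensus_step(W, x)
```

The quantity max(x) − min(x) plays the role of the Hilbert-distance Lyapunov function in log coordinates; the paper's point is that the same contraction argument extends to maps on the cone of positive definite matrices.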
“Cone-free” primal-dual path-following and potential reduction polynomial-time interior-point methods
Math. Prog., 2005
Cited by 1 (1 self)
Abstract. We present a framework for designing and analyzing primal-dual interior-point methods for convex optimization. We assume that a self-concordant barrier for the convex domain of interest and the Legendre transformation of the barrier are both available to us. We directly apply the theory and techniques of interior-point methods to the given good formulation of the problem (as is, without a conic reformulation), using the very usual primal central path concept and a less usual version of a dual path concept. We show that many of the advantages of the primal-dual interior-point techniques are available to us in this framework and therefore are not intrinsically tied to the conic reformulation and the logarithmic homogeneity of the underlying barrier function.
A Riemannian geometry with complete geodesics for the set of positive semidefinite matrices of fixed rank
2010
An Information Geometric Approach to Polynomial-time Interior-point Algorithms — Complexity Bound via Curvature Integral —
2007
Cited by 1 (0 self)
In this paper, we study polynomial-time interior-point algorithms in view of information geometry. Information geometry is a differential geometric framework which has been successfully applied to statistics, learning theory, signal processing, etc. We consider the information geometric structure for conic linear programs introduced by self-concordant barrier functions, and develop a precise iteration-complexity estimate of the polynomial-time interior-point algorithm based on an integral of the (embedding) curvature of the central trajectory in a rigorous differential geometric sense. We further study the implications of the theory applied to classical linear programming, and establish a surprising link to the strong “primal-dual curvature” integral bound established by Monteiro and Tsuchiya, which is based on the work of Vavasis and Ye on the layered-step interior-point algorithm. Using these results, we show that the total embedding curvature of the central trajectory, i.e., the aforementioned integral over the whole central trajectory, is bounded by O(n^3.5 log(χ̄*_A + n)), where χ̄*_A is a condition number of the coefficient matrix A and n is the number of nonnegative variables. In particular, the integral is bounded by O(n^4.5 m) for combinatorial linear programs, including network flow problems, where m is the number of constraints. We also provide a complete differential-geometric characterization of the primal-dual curvature in the primal-dual algorithm. Finally, in view of this integral bound, we observe that the primal (or dual) interior-point algorithm requires fewer iterations than the primal-dual interior-point algorithm, at least in the case of linear programming.