Results 1–10 of 31
On the limited memory BFGS method for large scale optimization
Mathematical Programming, 1989
"... this paper has appeared in ..."
Inverse Kinematics Positioning Using Nonlinear Programming for Highly Articulated Figures
ACM Transactions on Graphics, 1994
"... An articulated figure is often modeled as a set of rigid segments connected with joints. Its configuration can be altered by varying the joint angles. Although it is straightforward to compute figure configurations given joint angles (forward kinematics), it is not so to find the joint angles for ..."
Abstract

Cited by 101 (9 self)
 Add to MetaCart
An articulated figure is often modeled as a set of rigid segments connected with joints. Its configuration can be altered by varying the joint angles. Although it is straightforward to compute figure configurations given joint angles (forward kinematics), it is not so to find the joint angles for a desired configuration (inverse kinematics). Since the inverse kinematics problem is of special importance to an animator wishing to set a figure to a posture satisfying a set of positioning constraints, researchers have proposed many approaches. But when we try to follow these approaches in an interactive animation system where the object to operate on is as highly articulated as a realistic human figure, they fail in either generality or performance, and so a new approach is fostered. Our approach is based on nonlinear programming techniques. It has been used for several years in the spatial constraint system in the Jack TM human figure simulation software developed at the Compute...
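The forward/inverse kinematics distinction described above can be illustrated on a toy planar two-link arm. The following is a minimal sketch, not the Jack system's actual formulation: it treats inverse kinematics as a nonlinear least-squares problem on the squared positioning error and solves it with damped Gauss-Newton (Levenberg-style) steps. All function names are illustrative.

```python
import numpy as np

def forward_kinematics(theta, l1=1.0, l2=1.0):
    """Forward kinematics: end-effector position of a planar 2-link arm."""
    x = l1 * np.cos(theta[0]) + l2 * np.cos(theta[0] + theta[1])
    y = l1 * np.sin(theta[0]) + l2 * np.sin(theta[0] + theta[1])
    return np.array([x, y])

def inverse_kinematics(target, theta0, iters=200, damping=0.1):
    """Inverse kinematics as nonlinear least squares: damped Gauss-Newton
    steps on 0.5 * ||fk(theta) - target||^2."""
    theta = np.asarray(theta0, dtype=float)
    for _ in range(iters):
        err = forward_kinematics(theta) - target
        s1, s12 = np.sin(theta[0]), np.sin(theta[0] + theta[1])
        c1, c12 = np.cos(theta[0]), np.cos(theta[0] + theta[1])
        J = np.array([[-s1 - s12, -s12],      # Jacobian of the forward map
                      [ c1 + c12,  c12]])
        theta -= np.linalg.solve(J.T @ J + damping * np.eye(2), J.T @ err)
    return theta

theta = inverse_kinematics(np.array([1.0, 1.0]), [0.3, 0.3])
print(np.allclose(forward_kinematics(theta), [1.0, 1.0], atol=1e-3))  # True
```

The damping term keeps steps bounded near singular arm configurations; production systems add joint limits and other constraints, which is where the nonlinear-programming machinery of the paper comes in.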
Line Search Algorithms With Guaranteed Sufficient Decrease
ACM Trans. Math. Software, 1992
"... The problem of finding a point that satisfies the sufficient decrease and curvature condition is formulated in terms of finding a point in a set T (). We describe a search algorithms for this problem that produces a sequence of iterates that converge to a point in T () and that, except for pathologi ..."
Abstract

Cited by 86 (0 self)
 Add to MetaCart
The problem of finding a point that satisfies the sufficient decrease and curvature conditions is formulated in terms of finding a point in a set T(μ). We describe a search algorithm for this problem that produces a sequence of iterates that converge to a point in T(μ) and that, except for pathological cases, terminates in a finite number of steps. Numerical results for an implementation of the search algorithm on a set of test functions show that the algorithm terminates within a small number of iterations.

LINE SEARCH ALGORITHMS WITH GUARANTEED SUFFICIENT DECREASE
Jorge J. Moré and David J. Thuente

1 Introduction. Given a continuously differentiable function φ : ℝ → ℝ defined on [0, ∞) with φ′(0) < 0, and constants μ and η in (0, 1), we are interested in finding an α > 0 such that

    φ(α) ≤ φ(0) + μ φ′(0) α    (1.1)

and

    |φ′(α)| ≤ η |φ′(0)|.    (1.2)

The development of a search procedure that satisfies these conditions is a crucial ingredient in a line search meth...
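The two conditions above (sufficient decrease and curvature) are cheap to test for a candidate step length once the one-dimensional restriction φ(α) = f(x + α d) and its derivative are available. A minimal sketch; the function names and the 1-D example are illustrative, not from the paper:

```python
def satisfies_conditions(phi, dphi, alpha, mu=1e-4, eta=0.9):
    """Check the sufficient-decrease and curvature conditions at step
    length alpha, for phi(a) = f(x + a*d) with dphi its derivative."""
    sufficient_decrease = phi(alpha) <= phi(0.0) + mu * dphi(0.0) * alpha
    curvature = abs(dphi(alpha)) <= eta * abs(dphi(0.0))
    return sufficient_decrease and curvature

# 1-D example: f(x) = x^2 along d = -f'(x0) at x0 = 1, so
# phi(a) = (1 - 2a)^2 and phi'(a) = -4(1 - 2a).
phi  = lambda a: (1.0 - 2.0 * a) ** 2
dphi = lambda a: -4.0 * (1.0 - 2.0 * a)

print(satisfies_conditions(phi, dphi, 0.5))   # True: exact minimizer
print(satisfies_conditions(phi, dphi, 0.01))  # False: step too short, curvature fails
```

The hard part, and the paper's contribution, is the bracketing procedure that locates such an α in finitely many trials; the check itself is only the acceptance test inside that loop.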
Theory of Algorithms for Unconstrained Optimization
1992
"... this article I will attempt to review the most recent advances in the theory of unconstrained optimization, and will also describe some important open questions. Before doing so, I should point out that the value of the theory of optimization is not limited to its capacity for explaining the behavio ..."
Abstract

Cited by 84 (1 self)
 Add to MetaCart
In this article I will attempt to review the most recent advances in the theory of unconstrained optimization, and will also describe some important open questions. Before doing so, I should point out that the value of the theory of optimization is not limited to its capacity for explaining the behavior of the most widely used techniques. The question ...
Variable Metric Bundle Methods: from Conceptual to Implementable Forms
1996
"... To minimize a convex function, we combine MoreauYosida regularizations, quasiNewton matrices and bundling mechanisms. First we develop conceptual forms using "reversal " quasiNewton formulae and we state their global and local convergence. Then, to produce implementable versions, we inco ..."
Abstract

Cited by 40 (8 self)
 Add to MetaCart
To minimize a convex function, we combine Moreau–Yosida regularizations, quasi-Newton matrices and bundling mechanisms. First we develop conceptual forms using "reversal" quasi-Newton formulae and we state their global and local convergence. Then, to produce implementable versions, we incorporate a bundle strategy together with a "curve-search". No convergence results are given for the implementable versions; however some numerical illustrations show their good behaviour even for large-scale problems.
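The Moreau–Yosida regularization mentioned above replaces a nonsmooth convex f by the smooth function F(x) = min_y { f(y) + ||y − x||² / (2λ) }, which has the same minimizers. A small sketch for f(y) = |y|, whose proximal point has a closed form (soft-thresholding); the function names are illustrative:

```python
import numpy as np

def prox_abs(x, lam):
    """Proximal point of f(y) = |y|: soft-thresholding (closed form)."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def moreau_yosida(x, lam):
    """Moreau-Yosida regularization F(x) = min_y |y| + (y - x)^2 / (2*lam),
    evaluated at the proximal point. For f = |.| this is the Huber function."""
    p = prox_abs(x, lam)
    return abs(p) + (p - x) ** 2 / (2.0 * lam)

# F is differentiable everywhere, unlike |x| at 0.
print(moreau_yosida(3.0, 1.0))  # 2.5  (= |x| - lam/2 in the linear region)
print(moreau_yosida(0.5, 1.0))  # 0.125  (= x^2/(2*lam) in the quadratic region)
```

The methods in the paper apply quasi-Newton updates to F rather than to f, which is what makes the combination with bundling delicate: F and its gradient are only available approximately.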
Limited-Memory Matrix Methods with Applications
1997
"... Abstract. The focus of this dissertation is on matrix decompositions that use a limited amount of computer memory � thereby allowing problems with a very large number of variables to be solved. Speci�cally � we will focus on two applications areas � optimization and information retrieval. We introdu ..."
Abstract

Cited by 30 (6 self)
 Add to MetaCart
The focus of this dissertation is on matrix decompositions that use a limited amount of computer memory, thereby allowing problems with a very large number of variables to be solved. Specifically, we will focus on two application areas: optimization and information retrieval. We introduce a general algebraic form for the matrix update in limited-memory quasi-Newton methods. Many well-known methods, such as limited-memory Broyden Family methods, satisfy the general form. We are able to prove several results about methods which satisfy the general form. In particular, we show that the only limited-memory Broyden Family method (using exact line searches) that is guaranteed to terminate within n iterations on an n-dimensional strictly convex quadratic is the limited-memory BFGS method. Furthermore, we are able to introduce several new variations on the limited-memory BFGS method that retain the quadratic termination property. We also have a new result that shows that full-memory Broyden Family methods (using exact line searches) that skip p updates to the quasi-Newton matrix will terminate in no more than n+p steps on an n-dimensional strictly convex quadratic. We propose several new variations on the limited-memory BFGS method ...
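The limited-memory BFGS method discussed above is usually implemented with the standard two-loop recursion, which applies the implicit inverse-Hessian approximation to a gradient using only the m most recent pairs s_k = x_{k+1} − x_k, y_k = g_{k+1} − g_k. A generic sketch of that recursion (not code from the thesis):

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Two-loop recursion: apply the L-BFGS inverse-Hessian approximation
    (built from the stored (s, y) pairs, oldest first) to grad and negate."""
    q = np.asarray(grad, dtype=float).copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * np.dot(s, q)
        alphas.append(a)
        q -= a * y
    # Initial scaling H0 = gamma * I (a standard choice).
    s, y = s_list[-1], y_list[-1]
    gamma = np.dot(s, y) / np.dot(y, y)
    r = gamma * q
    # Second loop: oldest pair to newest.
    for s, y, rho, a in zip(s_list, y_list, rhos, reversed(alphas)):
        b = rho * np.dot(y, r)
        r += (a - b) * s
    return -r  # quasi-Newton search direction

# Sanity check: with y = s the curvature looks like the identity,
# so the direction reduces to steepest descent.
g = np.array([1.0, 2.0])
pair = [np.array([0.5, 0.5])]
print(np.allclose(lbfgs_direction(g, pair, pair), -g))  # True
```

The memory cost is O(mn) and each application costs O(mn) flops, which is what makes the method viable for the very large n the dissertation targets.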
Mean-shift analysis using quasi-Newton methods
Proceedings of the International Conference on Image Processing 3 (2003), 447–450
"... Meanshift analysis is a general nonparametric clustering technique based on density estimation for the analysis of complex feature spaces. The algorithm consists of a simple iterative procedure that shifts each of the feature points to the nearest stationary point along the gradient directions of t ..."
Abstract

Cited by 19 (1 self)
 Add to MetaCart
Mean-shift analysis is a general nonparametric clustering technique based on density estimation for the analysis of complex feature spaces. The algorithm consists of a simple iterative procedure that shifts each of the feature points to the nearest stationary point along the gradient directions of the estimated density function. It has been successfully applied to many applications such as segmentation and tracking. However, despite its promising performance, there are applications for which the algorithm converges too slowly to be practical. We propose and implement an improved version of the mean-shift algorithm using quasi-Newton methods to achieve higher convergence rates. Another benefit of our algorithm is its ability to achieve clustering even for very complex and irregular feature-space topography. Experimental results demonstrate the efficiency and effectiveness of our algorithm.
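The iterative shift described above can be sketched for a single query point with a Gaussian kernel. The bandwidth, names, and data are illustrative, and the paper's quasi-Newton acceleration is not shown, only the plain fixed-point iteration it speeds up:

```python
import numpy as np

def mean_shift_point(x, data, bandwidth=1.0, tol=1e-6, max_iter=500):
    """Shift one point toward the nearest density mode by repeatedly moving
    it to the Gaussian-kernel weighted mean of the data."""
    x = np.asarray(x, dtype=float)
    for _ in range(max_iter):
        w = np.exp(-np.sum((data - x) ** 2, axis=1) / (2 * bandwidth ** 2))
        x_new = (w[:, None] * data).sum(axis=0) / w.sum()
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Two well-separated 1-D clusters; a point at 1.0 drifts to the mode near 0.
data = np.array([[0.0], [0.1], [-0.1], [10.0], [10.1], [9.9]])
mode = mean_shift_point(np.array([1.0]), data)  # lands near 0
```

Each iteration is a gradient-ascent-like step on the kernel density estimate; near flat density ridges these steps shrink, which is exactly the slow-convergence regime the paper attacks with quasi-Newton updates.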
Proximal Quasi-Newton Methods for Nondifferentiable Convex Optimization
Mathematical Programming, 1998
"... This paper proposes an implementable proximal quasiNewton method for minimizing a nondifferentiable convex function f in ! n . The method is based on Rockafellar's proximal point algorithm and a cuttingplane technique. At each step, we use an approximate proximal point p a (x k ) of x k to def ..."
Abstract

Cited by 17 (2 self)
 Add to MetaCart
This paper proposes an implementable proximal quasi-Newton method for minimizing a nondifferentiable convex function f on ℝⁿ. The method is based on Rockafellar's proximal point algorithm and a cutting-plane technique. At each step, we use an approximate proximal point p_a(x_k) of x_k to define a v_k ∈ ∂_{ε_k} f(p_a(x_k)) with ε_k ≤ α‖v_k‖, where α is a constant. The method monitors the reduction in the value of ‖v_k‖ to identify when a line search on f should be used. The quasi-Newton step is used to reduce the value of ‖v_k‖. Without the differentiability of f, the method converges globally and the rate of convergence is Q-linear. Superlinear convergence is also discussed to extend the characterization result of Dennis and Moré. Numerical results show the good performance of the method.
Key words: nondifferentiable convex optimization, proximal point, quasi-Newton method, cutting-plane method, bundle methods.
AMS subject classifications: 65K05, 90C30
Combining Trust Region and Line Search Techniques
"... We propose an algorithm for nonlinear optimization that employs both trust region techniques and line searches. Unlike traditional trust region methods, our algorithm does not resolve the subproblem if the trial step results in an increase in the objective function, but instead performs a backtr ..."
Abstract

Cited by 17 (3 self)
 Add to MetaCart
We propose an algorithm for nonlinear optimization that employs both trust region techniques and line searches. Unlike traditional trust region methods, our algorithm does not re-solve the subproblem if the trial step results in an increase in the objective function, but instead performs a backtracking line search from the failed point. Backtracking can be done along a straight line or along a curved path. We show that the new algorithm preserves the strong convergence properties of trust region methods. Numerical results are also presented.
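The backtracking idea above, reusing the failed trial step instead of re-solving the subproblem, can be sketched along a straight line. Curved-path backtracking and the full trust-region machinery are omitted, and all names are illustrative:

```python
import numpy as np

def accept_or_backtrack(f, x, step, beta=0.5, max_halvings=20):
    """If the trial step increases f, backtrack along the step direction
    (geometric halving) rather than re-solving the trust-region subproblem."""
    fx = f(x)
    t = 1.0
    for _ in range(max_halvings):
        trial = x + t * step
        if f(trial) < fx:
            return trial
        t *= beta
    return x  # no decrease found; the caller would shrink the trust region

f = lambda x: float(np.dot(x, x))
x = np.array([1.0, 1.0])
step = np.array([-3.0, -3.0])        # overshoots the minimizer at the origin
x_new = accept_or_backtrack(f, x, step)
print(f(x_new) < f(x))  # True: backtracking recovered a decrease
```

A practical version would accept on a sufficient-decrease test rather than a raw decrease, which is what lets the paper retain the strong trust-region convergence theory.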
A feasible BFGS interior point algorithm for solving strongly convex minimization problems
SIAM J. Optim., 2000
"... We propose a BFGS primaldual interior point method for minimizing a convex function on a convex set defined by equality and inequality constraints. The algorithm generates feasible iterates and consists in computing approximate solutions of the optimality conditions perturbed by a sequence of posit ..."
Abstract

Cited by 13 (1 self)
 Add to MetaCart
We propose a BFGS primal-dual interior point method for minimizing a convex function on a convex set defined by equality and inequality constraints. The algorithm generates feasible iterates and consists of computing approximate solutions of the optimality conditions perturbed by a sequence of positive parameters µ converging to zero. We prove that it converges q-superlinearly for each fixed µ. We also show that it is globally convergent to the analytic center of the primal-dual optimal set when µ tends to 0 and strict complementarity holds.