Results 11–15 of 15
Evaluating Gradients in Optimal Control: Continuous Adjoints versus Automatic Differentiation
J. Optim. Theory Appl.
, 2002
Abstract

Cited by 2 (0 self)
This paper deals with the numerical solution of optimal control problems for ODEs. The methods considered here rely on some standard optimization code to solve a discretized version of the control problem under consideration. We aim at providing the optimization software not only with the discrete objective functional, but also with its gradient. The objective gradient can be computed either from forward (sensitivity) or backward (adjoint) information.
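As a hedged illustration of the forward-versus-adjoint distinction this abstract draws (a toy sketch with assumed names and dynamics, not the paper's method): for an objective J(u) = x_N^2 on explicit-Euler-discretized dynamics x_{k+1} = x_k + h(-x_k + u_k), the same gradient can be obtained by propagating sensitivities forward or by one backward adjoint sweep.

```python
# Hypothetical example, not from the paper: gradient of a discretized
# optimal-control objective via forward sensitivities and via the adjoint.
# Dynamics: x_{k+1} = x_k + h * (-x_k + u_k),  objective J(u) = x_N**2.

def simulate(u, x0, h):
    x = [x0]
    for uk in u:
        x.append(x[-1] + h * (-x[-1] + uk))
    return x

def grad_forward(u, x0, h):
    # Forward (sensitivity) mode: carry s[k][j] = dx_k/du_j along the state.
    N = len(u)
    x = simulate(u, x0, h)
    s = [[0.0] * N]
    for k in range(N):
        s.append([(1 - h) * s[k][j] + (h if j == k else 0.0)
                  for j in range(N)])
    return [2 * x[-1] * s[-1][j] for j in range(N)]

def grad_adjoint(u, x0, h):
    # Backward (adjoint) mode: one sweep with lam = dJ/dx_k.
    N = len(u)
    x = simulate(u, x0, h)
    lam = 2 * x[-1]              # dJ/dx_N
    g = [0.0] * N
    for k in reversed(range(N)):
        g[k] = lam * h           # dJ/du_k enters through x_{k+1}
        lam = lam * (1 - h)      # pull the adjoint back through the dynamics
    return g
```

The forward pass costs one sensitivity per control parameter, while the adjoint pass delivers the whole gradient in a single backward sweep, which is exactly the trade-off an optimization code cares about.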
A Modeling Interface to NonLinear Programming Solvers – An Instance: xMPS, the extended MPS format
, 2000
Abstract

Cited by 2 (0 self)
We present a Modeler-Optimizer Interface (MOI) for general closed-form NonLinear Programs (NLP), which can be used to transfer NLPs in a clear and simple manner between optimization components in a distributed environment. We demonstrate how this interface allows first-order derivative information to be easily calculated on the optimizer's side, using automatic differentiation, hence removing the bottleneck of communicating derivative information between the modeler and the optimizer. We also show how this interface directly corresponds to a file format for NLPs, the extended MPS format (xMPS). This format directly extends the standard MPS file format for linear and mixed integer programs to include NLPs and permits a standardized way of transferring benchmark problems. The format spares the modeler the tedious task of calculating derivative information with minimal extra work required by the optimizer and thus increases efficiency. This work was originally done at Maximal Sof...
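A minimal sketch of how first-order derivatives can be computed on the optimizer's side by automatic differentiation, as the abstract describes; the `Dual` class and `gradient` helper are illustrative assumptions, not part of MOI or xMPS.

```python
# Forward-mode automatic differentiation via dual numbers: each value
# carries (val, dot), where dot is the derivative along one seed direction.

class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val,
                    self.dot * o.val + self.val * o.dot)  # product rule
    __rmul__ = __mul__

def gradient(f, x):
    # One forward pass per input: seed direction i, read off df/dx_i.
    g = []
    for i in range(len(x)):
        args = [Dual(v, 1.0 if j == i else 0.0) for j, v in enumerate(x)]
        g.append(f(args).dot)
    return g

# A closed-form objective as a modeler might ship it:
f = lambda x: x[0] * x[0] + 3.0 * x[0] * x[1]
```

Because the optimizer differentiates the transferred closed-form expression itself, no derivative values need to cross the modeler-optimizer boundary.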
Shared Computations for Efficient Interval Function Evaluation
Abstract

Cited by 2 (1 self)
In global optimization using interval analysis, functions, their gradients and other expressions are evaluated again and again, often after small changes in argument values. This suggests that intermediate results obtained earlier in the algorithm could be reused later in order to speed up computations. This paper develops such techniques and shows that substantial improvements in speed are indeed possible. These methods can be used with many other interval algorithms as well.
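One way to realize the reuse the abstract describes is a toy sketch like the following (the `Expr` DAG and stamp-based invalidation are assumptions for illustration, not the paper's algorithm): each node caches its last interval and recomputes only when some leaf interval below it has changed.

```python
# Interval expression DAG with cached evaluation: a subtree whose
# argument intervals did not change returns its cached interval.

class Expr:
    def __init__(self, op, *children):
        self.op, self.children = op, children
        self.box = None       # (lo, hi) for leaf nodes
        self.stamp = 0        # bumped whenever this leaf's box changes
        self.cache = None     # (result, input_stamp) of the last eval

    @staticmethod
    def leaf(lo, hi):
        e = Expr(None)
        e.box = (lo, hi)
        return e

    def set_box(self, lo, hi):
        self.box = (lo, hi)
        self.stamp += 1

    def _input_stamp(self):
        # Total change counter of all leaves below this node.
        if self.op is None:
            return self.stamp
        return sum(c._input_stamp() for c in self.children)

    def eval(self):
        s = self._input_stamp()
        if self.cache is not None and self.cache[1] == s:
            return self.cache[0]          # reuse the earlier result
        if self.op is None:
            r = self.box
        else:
            (a, b), (c, d) = (ch.eval() for ch in self.children)
            if self.op == "+":
                r = (a + c, b + d)
            else:                         # "*"
                ps = (a * c, a * d, b * c, b * d)
                r = (min(ps), max(ps))
        self.cache = (r, s)
        return r
```

After a small change to one argument interval, only the nodes on the path from that leaf to the root are recomputed; every other subtree answers from its cache.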
Adjoint Codes in Functional Framework
, 2000
Abstract

Cited by 1 (0 self)
We show how to implement functionally the reverse, or adjoint, strategy within the domain of Computational Differentiation techniques: tools permitting one to compute numerically, but exactly (i.e. up to the machine precision), the derivatives of coded functions. The imperative coding of the reverse techniques is awkward. It requires the reversal of the control thread of the program, and it recomputes the derivatives through the adjoint statements beginning with the definition of the final result, and ending at the independent variables. Usually a special external data structure, the "tape", is used to store the adjoint statements. It is created during the "forward" stage of the program, and then interpreted backwards. We show how to construct purely functionally the equivalent of such a tape, but we also present a more interesting model based on a variant of Wadler's backward-propagating State Transformer monad. Our package, written in Haskell, uses the overloading of standard arithmetic opera...
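For readers unfamiliar with the imperative "tape" the abstract contrasts itself against, here is a minimal sketch of it (in Python rather than the paper's Haskell; `Var`, `grad`, and the parent lists are illustrative assumptions): the forward sweep records each operation with its local derivatives, and the tape is then interpreted backwards to accumulate adjoints.

```python
# Reverse-mode differentiation with an explicit tape: overloaded
# arithmetic records (parent, local_derivative) pairs during the forward
# sweep; a single backward pass over the tape accumulates adjoints.

class Var:
    def __init__(self, val, tape, parents=()):
        self.val, self.adj = val, 0.0
        self.parents = parents          # (parent Var, local derivative)
        self.tape = tape
        tape.append(self)
    def __add__(self, o):
        return Var(self.val + o.val, self.tape, [(self, 1.0), (o, 1.0)])
    def __mul__(self, o):
        return Var(self.val * o.val, self.tape,
                   [(self, o.val), (o, self.val)])

def grad(f, xs):
    tape = []
    vs = [Var(x, tape) for x in xs]
    out = f(vs)
    out.adj = 1.0                       # seed dJ/dJ = 1
    for v in reversed(tape):            # interpret the tape backwards
        for p, d in v.parents:
            p.adj += v.adj * d
    return [v.adj for v in vs]
```

The tape reverses the control thread exactly as the abstract describes: the adjoint statements run from the final result back to the independent variables.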
Computational Divided Differencing and Divided-Difference Arithmetics
, 2000
Abstract

Cited by 1 (0 self)
Tools for computational differentiation transform a program that computes a numerical function F(x) into a related program that computes F'(x) (the derivative of F). This paper describes how techniques similar to those used in computational-differentiation tools can be used to implement other program transformations, in particular a variety of transformations for computational divided differencing. The specific technical contributions of the paper are as follows: It presents a program transformation that, given a numerical function F(x) defined by a program, creates a program that computes F[x0, x1], the first divided difference of F(x), where F[x0, x1] := (F(x0) - F(x1)) / (x0 - x1) if x0 ≠ x1, and (d/dz) F(z) evaluated at z = x0 if x0 = x1. It shows how computational first divided differencing generalizes computational differentiation. It presents a second program transformation that permits the creation of higher-order divided differences of a numerical function de ...
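A hedged sketch of the first divided difference just defined (the `DD` class is an illustrative assumption, not the paper's transformation): overloading arithmetic so each intermediate carries (value at x0, value at x1, divided difference) computes F[x0, x1] operation by operation, and at x0 = x1 it degenerates to the derivative, exactly as the definition requires.

```python
# Divided-difference arithmetic: each quantity is a triple
# (f(x0), f(x1), f[x0, x1]) propagated through overloaded + and *.

class DD:
    def __init__(self, v0, v1, dd):
        self.v0, self.v1, self.dd = v0, v1, dd

    @staticmethod
    def var(x0, x1):
        # The identity F(x) = x has F[x0, x1] = 1 (also when x0 == x1).
        return DD(x0, x1, 1.0)

    @staticmethod
    def const(c):
        return DD(c, c, 0.0)

    def __add__(self, o):
        return DD(self.v0 + o.v0, self.v1 + o.v1, self.dd + o.dd)

    def __mul__(self, o):
        # Product rule for divided differences:
        # (fg)[x0, x1] = f(x0) * g[x0, x1] + f[x0, x1] * g(x1)
        return DD(self.v0 * o.v0, self.v1 * o.v1,
                  self.v0 * o.dd + self.dd * o.v1)
```

Computing F[x0, x1] this way, rather than by the naive quotient (F(x0) - F(x1)) / (x0 - x1), avoids catastrophic cancellation when x0 and x1 are close, and the confluent case x0 = x1 needs no special-casing: the triple then carries the derivative.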