Results 1–10 of 58
Recipes for Adjoint Code Construction
Abstract

Cited by 222 (22 self)
this paper, is the Adjoint Model Compiler (AMC).
Optimization by Direct Search: New Perspectives on Some Classical and Modern Methods
 SIAM REVIEW VOL. 45, NO. 3, PP. 385–482
, 2003
Abstract

Cited by 198 (14 self)
Direct search methods are best known as unconstrained optimization techniques that do not explicitly use derivatives. Direct search methods were formally proposed and widely applied in the 1960s but fell out of favor with the mathematical optimization community by the early 1970s because they lacked coherent mathematical analysis. Nonetheless, users remained loyal to these methods, most of which were easy to program, some of which were reliable. In the past fifteen years, these methods have seen a revival due, in part, to the appearance of mathematical analysis, as well as to interest in parallel and distributed computing. This review begins by briefly summarizing the history of direct search methods and considering the special properties of problems for which they are well suited. Our focus then turns to a broad class of methods for which we provide a unifying framework that lends itself to a variety of convergence results. The underlying principles allow generalization to handle bound constraints and linear constraints. We also discuss extensions to problems with nonlinear constraints.
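A minimal sketch of the kind of method this review surveys: compass (coordinate) search, which polls trial points along coordinate directions and halves its step when no poll improves, using no derivatives at all. The function, step schedule, and quadratic test problem below are illustrative assumptions, not taken from the paper.

```python
def compass_search(f, x0, step=1.0, tol=1e-6, max_iter=10000):
    """Compass (coordinate) search: poll f at x +/- step*e_i along each
    coordinate direction; if no trial point improves, halve the step.
    No derivatives are used at any point."""
    x = list(x0)
    fx = f(x)
    for _ in range(max_iter):
        if step < tol:
            break
        improved = False
        for i in range(len(x)):
            for sign in (1.0, -1.0):
                trial = list(x)
                trial[i] += sign * step
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5  # refine the mesh, as in pattern-search analysis

    return x, fx

# Illustrative quadratic with minimizer (1, -2); not from the paper.
xmin, fmin = compass_search(lambda v: (v[0] - 1.0) ** 2 + (v[1] + 2.0) ** 2,
                            [0.0, 0.0])
```

The step-halving rule is what the review's convergence framework analyzes: the poll directions span the space, so a failed poll certifies that the current mesh is too coarse.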
NEOS and CONDOR: Solving Optimization Problems over the Internet
 ACM TRANSACTIONS ON MATHEMATICAL SOFTWARE
, 1998
Abstract

Cited by 43 (1 self)
We discuss the use of Condor, a distributed resource management system, as a provider of computational resources for NEOS, an environment for solving optimization problems over the Internet. We also describe how problems are submitted and processed by NEOS, and then scheduled and solved by Condor on available (idle) workstations.
Moré, Computing gradients in large-scale optimization using automatic differentiation
 INFORMS J. Computing
, 1997
Abstract

Cited by 31 (11 self)
The accurate and efficient computation of gradients for partially separable functions is central to the solution of large-scale optimization problems, since these functions are ubiquitous in large-scale problems. We describe two approaches for computing gradients of partially separable functions via automatic differentiation. In our experiments we employ the ADIFOR (Automatic Differentiation of Fortran) tool and the SparsLinC (Sparse Linear Combination) library. We use applications from the MINPACK-2 test problem collection to compare the numerical reliability and computational efficiency of these approaches with hand-coded derivatives and approximations based on differences of function values. Our conclusion is that automatic differentiation is the method of choice, providing code for the efficient computation of the gradient without the need for tedious hand-coding. The solution of nonlinear optimization problems often requires the computation of the gradient ∇f of a mapping f
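The partial-separability idea can be sketched as follows: each element function depends on only a few variables, so its small gradient can be computed cheaply and scattered into the full gradient. Here a toy forward-mode dual-number class stands in for a tool like ADIFOR; all names and the chained Rosenbrock-style test function are illustrative assumptions, not from the paper.

```python
class Dual:
    """Toy forward-mode AD scalar: value plus one directional derivative."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __sub__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val - o.val, self.dot - o.dot)
    def __rsub__(self, o):
        return Dual(o - self.val, -self.dot)
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__
    def __pow__(self, p):
        return Dual(self.val ** p, p * self.val ** (p - 1) * self.dot)

def gradient_partially_separable(elements, x):
    """Gradient of f(x) = sum_i f_i, where each element function f_i
    depends on only a few variables (its index set idx).  Each small
    element gradient is computed by forward-mode AD and scattered into
    the full gradient, so per-element cost depends on |idx|, not on n."""
    grad = [0.0] * len(x)
    for fi, idx in elements:
        for j in idx:
            # seed the derivative direction for variable j only
            args = [Dual(x[k], 1.0 if k == j else 0.0) for k in idx]
            grad[j] += fi(*args).dot
    return grad

# Chained Rosenbrock-style test function (illustrative, n = 4):
#   f = sum_i 100*(x[i+1] - x[i]**2)**2 + (1 - x[i])**2
elems = [(lambda a, b: 100 * (b - a ** 2) ** 2 + (1 - a) ** 2, (i, i + 1))
         for i in range(3)]
grad_at_ones = gradient_partially_separable(elems, [1.0, 1.0, 1.0, 1.0])
```

At the point of all ones every element term vanishes, so the gradient is exactly zero; a sparse accumulation library such as SparsLinC plays the role of the scatter step at scale.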
Optimization With Variable-Fidelity Models Applied To Wing Design
, 2000
Abstract

Cited by 31 (2 self)
This work discusses an approach, the Approximation Management Framework (AMF), for solving optimization problems that involve computationally expensive simulations. AMF aims to maximize the use of lower-fidelity, cheaper models in iterative procedures with occasional, but systematic, recourse to higher-fidelity, more expensive models for monitoring the progress of the algorithm. The method is globally convergent to a solution of the original, high-fidelity problem. Three versions of AMF, based on three nonlinear programming algorithms, are demonstrated on a 3D aerodynamic wing optimization problem and a 2D airfoil optimization problem. In both cases Euler analysis solved on meshes of various refinement provides a suite of variable-fidelity models. Preliminary results indicate threefold savings in terms of high-fidelity analyses in case of the 3D problem and twofold savings for the 2D problem. Key Words: Approximation concepts, approximation management, model management, surrogate optimi...
More AD of Nonlinear AMPL Models: Computing Hessian Information and Exploiting Partial Separability
 in Computational Differentiation: Applications, Techniques, and
, 1996
Abstract

Cited by 24 (12 self)
We describe computational experience with automatic differentiation of mathematical programming problems expressed in the modeling language AMPL. Nonlinear expressions are translated to loop-free code, which makes it easy to compute gradients and Jacobians by backward automatic differentiation. The nonlinear expressions may be interpreted or, to gain some evaluation speed at the cost of increased preparation time, converted to Fortran or C. We have extended the interpretive scheme to evaluate Hessian (of Lagrangian) times vector. Detecting partially separable structure (sums of terms, each depending, perhaps after a linear transformation, on only a few variables) is of independent interest, as some solvers exploit this structure. It can be detected automatically by suitable "tree walks". Exploiting this structure permits an AD computation of the entire Hessian matrix by accumulating Hessian times vector computations for each term, and can lead to a much faster computation...
“To be recorded” analysis in reverse-mode automatic differentiation
 Future Generation Computer Systems
Abstract

Cited by 22 (9 self)
The automatic generation of adjoints of mathematical models that are implemented as computer programs is receiving increased attention in the scientific and engineering communities. Reverse-mode automatic differentiation is of particular interest for large-scale optimization problems. It allows the computation of gradients at a small constant multiple of the cost for evaluating the objective function itself, independent of the number of input parameters. Source-to-source transformation tools apply simple differentiation rules to generate adjoint codes based on the adjoint version of every statement. In order to guarantee correctness, certain values that are computed and overwritten in the original program must be made available in the adjoint program. For their determination we introduce a static data-flow analysis called “to be recorded” analysis. Possible overestimation of this set must be kept minimal to get efficient adjoint codes. This efficiency is essential for the applicability of source-to-source transformation tools to real-world applications.

1 Automatically Generated Adjoints

We consider a computer program P evaluating a vector function y = F(x), where F: ℝ^n → ℝ^m. Usually, P implements the mathematical model of some underlying real-world application and it is referred to as the original code. The goal of automatic differentiation (AD) [3,7,15] by source transformation is to build automatically a new source program P′ evaluating some derivatives of F. This is arrow AD in Figure 1. We consider a simplified mathematical model, symbolized by arrow R in Figure 1: every particular run of P on a particular set of inputs is equivalent to a simple sequence of p scalar assignments v_j = ϕ_j((v_k)_{k≺j}), j = 1, ..., q. (1)
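A hand-written sketch of what statement-level adjoint generation produces, including the recording of an overwritten value, which is exactly the kind of value a “to be recorded” analysis identifies. The tiny straight-line program and all variable names are illustrative assumptions, not the paper's notation.

```python
def f_and_gradient(x1, x2):
    """Hand-written adjoint of a tiny straight-line program, mimicking
    statement-level source transformation.  The variable t is overwritten,
    so its pre-overwrite value must be recorded (pushed) during the forward
    sweep and restored (popped) in the reverse sweep."""
    stack = []
    # forward sweep: original statements, recording values before overwrite
    t = x1 * x2              # t = x1 * x2
    stack.append(t)          # t is about to be overwritten: record it
    t = t * x1               # t = t * x1   (overwrites t)
    y = t * t                # y = t^2
    # reverse sweep: adjoint of each statement, in reverse order
    y_bar = 1.0
    t_bar = 2.0 * t * y_bar  # adjoint of y = t * t
    t_old = stack.pop()      # restore the recorded value of t ...
    x1_bar = t_old * t_bar   # ... the adjoint of t = t * x1 needs the
    t_bar = x1 * t_bar       #     OLD t, hence the recording
    x1_bar += x2 * t_bar     # adjoint of t = x1 * x2
    x2_bar = x1 * t_bar
    return y, x1_bar, x2_bar
```

Here y = (x1^2 * x2)^2, and one reverse sweep yields both partial derivatives at a small constant multiple of the cost of the forward sweep, independent of the number of inputs. A TBR analysis would mark only t for recording; overestimating that set inflates the stack and slows the adjoint code.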
Sensitivity Analysis For Atmospheric Chemistry Models Via Automatic Differentiation
 Environ
, 1997
Abstract

Cited by 21 (4 self)
Automatic differentiation techniques are used in the sensitivity analysis of a comprehensive atmospheric chemical mechanism. Specifically, ADIFOR software is used to calculate the sensitivity of ozone with respect to all initial concentrations (of 84 species) and all reaction rate constants (178 chemical reactions) for six different chemical regions. Numerical aspects of the application of ADIFOR are also presented. Automatic differentiation is shown to be a powerful tool for sensitivity analysis.

Key words. Atmospheric chemistry, Automatic Differentiation.

1. Introduction. Comprehensive sensitivity analysis of air pollution models remains the exception rather than the rule. Most sensitivity analysis applied to air pollution modeling studies has focused on the calculation of the local sensitivities (first-order derivatives of output variables with respect to model parameters) in box model studies of gas phase chemistry (see [Cho86]). The most common form of sensitivity studies with ...
Efficient Derivative Codes Through Automatic Differentiation And Interface Contraction: An Application In Biostatistics
, 1997
Abstract

Cited by 17 (10 self)
Developing code for computing the first and higher-order derivatives of a function by hand can be very time-consuming and is prone to errors. Automatic differentiation has proven capable of producing derivative codes with very little effort on the part of the user. Automatic differentiation avoids the truncation errors characteristic of divided difference approximations. However, the derivative code produced by automatic differentiation can be significantly less efficient than one produced by hand. This shortcoming may be overcome by utilizing insight into the high-level structure of a computation. This paper focuses on how to take advantage of the fact that the number of variables passed between subroutines frequently is small compared with the number of variables with respect to which one wishes to differentiate. Such an "interface contraction," coupled with the associativity of the chain rule for differentiation, allows one to apply automatic differentiation in a more judicious f...
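The interface-contraction idea can be sketched under the simplest assumption: a single scalar s crosses the subroutine interface, so by the associativity of the chain rule the n-component gradient factors into an inner gradient times a 1×1 interface Jacobian, instead of propagating n derivative components through the outer routine. The example function and all names below are hypothetical.

```python
import math

def gradient_via_interface_contraction(h_prime, grad_g, g, x):
    """Gradient of f(x) = h(s) with s = g(x), where only ONE scalar
    crosses the interface between g and h even though x has n components.
    Chain rule: grad f(x) = h'(s) * grad g(x) -- n derivatives are carried
    only up to the interface, then combined with a single scalar factor."""
    s = g(x)                  # forward call through the interface
    inner = grad_g(x)         # n derivatives, accumulated up to the interface
    scale = h_prime(s)        # 1x1 "Jacobian" across the contracted interface
    return [scale * gi for gi in inner]

# Hypothetical example: f(x) = exp(sum of squares), interface s = sum(xi^2).
g = lambda x: sum(xi * xi for xi in x)
grad_g = lambda x: [2.0 * xi for xi in x]
grad_f = gradient_via_interface_contraction(math.exp, grad_g, g, [1.0, 2.0])
```

With k values crossing the interface instead of n directions propagated end to end, the AD cost of the outer routine scales with k, which is the saving the paper exploits.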
A differentiationenabled Fortran 95 compiler
 ACM Transactions on Mathematical Software. CODEN ACMSCU. ISSN 0098-3500 (print), 1557-7295 (electronic).
, 2005