Results 1-10 of 18
The ADIFOR 2.0 System for the Automatic Differentiation of Fortran 77 Programs
 RICE UNIVERSITY
, 1994
"... Automatic Differentiation is a technique for augmenting computer programs with statements for the computation of derivatives based on the chain rule of differential calculus. The ADIFOR 2.0 system provides automatic differentiation of Fortran 77 programs for firstorder derivatives. The ADIFOR 2.0 s ..."
Abstract

Cited by 55 (17 self)
Automatic Differentiation is a technique for augmenting computer programs with statements for the computation of derivatives based on the chain rule of differential calculus. The ADIFOR 2.0 system provides automatic differentiation of Fortran 77 programs for first-order derivatives. The ADIFOR 2.0 system consists of three main components: the ADIFOR 2.0 preprocessor, the ADIntrinsics Fortran 77 exception-handling system, and the SparsLinC library. The combination of these tools provides the ability to deal with arbitrary Fortran 77 syntax, to handle codes containing single- and double-precision real- or complex-valued data, to fully support and easily customize the translation of Fortran 77 intrinsics, and to transparently exploit sparsity in derivative computations. ADIFOR 2.0 has been successfully applied to a 60,000-line code, which we believe to be a new record in automatic differentiation.
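The chain-rule augmentation this abstract describes can be illustrated with a minimal forward-mode sketch (in Python rather than the Fortran 77 that ADIFOR targets): each arithmetic operation on a value also propagates its derivative. The `Dual` class and `f` below are illustrative, not part of ADIFOR.

```python
# Minimal forward-mode AD sketch via operator overloading (illustrative,
# not ADIFOR's source-transformation approach).
class Dual:
    """Carries a value and its derivative through arithmetic (chain rule)."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1   # f'(x) = 6x + 2

y = f(Dual(2.0, 1.0))  # seed dx/dx = 1
print(y.val, y.der)    # 17.0 14.0
```

ADIFOR instead rewrites the source code itself to compute such derivative statements, which is why it can handle arbitrary Fortran 77 syntax without a special numeric type.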
Parallel calculation of sensitivity derivatives for aircraft design using automatic differentiation
 AIAA/NASA/USAF/ISSMO SYMPOSIUM ON MULTIDISCIPLINARY ANALYSIS AND OPTIMIZATION, AIAA 94-4261
, 1994
"... Sensitivity derivative (SD) calculation via automatic differentiation typical of that required for the aerodynamic design of a transporttype aircraft is considered. Two ways of computing SDs via code generated by the ADIFOR automatic differentiation tool are compared for efficiency and applicabilit ..."
Abstract

Cited by 24 (16 self)
Sensitivity derivative (SD) calculation via automatic differentiation typical of that required for the aerodynamic design of a transport-type aircraft is considered. Two ways of computing SDs via code generated by the ADIFOR automatic differentiation tool are compared for efficiency and applicability to problems involving large numbers of design variables. A vector implementation on a CRAY Y-MP computer is compared with a coarse-grained parallel implementation on an IBM SP1 computer, employing a Fortran M wrapper. The SDs are computed for a swept transport wing in turbulent, transonic flow; the number of geometric design variables varies from 1 to 60, with coupling between a wing grid generation program and a state-of-the-art, 3D computational fluid dynamics program, both augmented for derivative computation via AD. For a small number of design variables, the CRAY Y-MP implementation is much faster. As the number of design variables grows, however, the SP1 becomes an attractive alternative in terms of compute speed, job turnaround time, and total memory available for solutions with large numbers of design variables. The coarse-grained parallel implementation also can be moved easily to a network of workstations.
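The coarse-grained strategy above works because each design variable contributes one independent column of the sensitivity matrix, so columns can be farmed out to separate workers. A hypothetical stand-in sketch (threads here; the paper used separate IBM SP1 nodes via Fortran M, and AD-generated code rather than the divided differences used below for brevity):

```python
# Hypothetical sketch: one Jacobian column per design variable, computed
# in parallel. `model` is a toy stand-in for the CFD analysis code.
from concurrent.futures import ThreadPoolExecutor

def column(f, x, i, h=1e-6):
    """One sensitivity column by divided differences (stand-in for an
    AD-generated solver run)."""
    xp = list(x)
    xp[i] += h
    return [(fp - f0) / h for fp, f0 in zip(f(xp), f(x))]

def model(x):  # toy stand-in: two outputs, two design variables
    return [x[0] * x[1], x[0] + 2 * x[1]]

x0 = [3.0, 4.0]
with ThreadPoolExecutor() as pool:
    cols = list(pool.map(lambda i: column(model, x0, i), range(len(x0))))
print(cols)  # Jacobian columns, roughly [[4, 1], [3, 2]]
```

The trade-off the abstract reports follows from this structure: with few columns the vector machine wins on raw speed, while with many columns the per-column parallelism scales out.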
Multidisciplinary Design Optimization Techniques: Implications and Opportunities for Fluid Dynamics Research
 JAROSLAW SOBIESZCZANSKI-SOBIESKI AND RAPHAEL T. HAFTKA, "MULTIDISCIPLINARY AEROSPACE DESIGN OPTIMIZATION: SURVEY OF RECENT DEVELOPMENTS," 34TH AIAA AEROSPACE SCIENCES MEETING AND EXHIBIT
, 1999
"... A challenge for the fluid dynamics community is to adapt to and exploit the trend towards greater multidisciplinary focus in research and technology. The past decade has witnessed substantial growth in the research field of Multidisciplinary Design Optimization (MDO). MDO is a methodology for the de ..."
Abstract

Cited by 20 (0 self)
A challenge for the fluid dynamics community is to adapt to and exploit the trend towards greater multidisciplinary focus in research and technology. The past decade has witnessed substantial growth in the research field of Multidisciplinary Design Optimization (MDO). MDO is a methodology for the design of complex engineering systems and subsystems that coherently exploits the synergism of mutually interacting phenomena. As evidenced by the papers that appear in the biannual AIAA/USAF/NASA/ISSMO Symposia on Multidisciplinary Analysis and Optimization, the MDO technical community focuses on vehicle and system design issues. This paper provides an overview of the MDO technology field from a fluid dynamics perspective, giving emphasis to suggestions of specific applications of recent MDO technologies that can enhance fluid dynamics research itself across the spectrum, from basic flow physics to full configuration aerodynamics.
Derivative Convergence for Iterative Equation Solvers
, 1993
"... this paper, we consider two approaches to computing the desired implicitly defined derivative x ..."
Abstract

Cited by 20 (13 self)
this paper, we consider two approaches to computing the desired implicitly defined derivative x
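One of the questions behind this abstract is whether derivatives carried through an iterative solver converge along with the iterates. A hedged illustration, assuming the simplest case of a Newton (Babylonian) square-root iteration differentiated with respect to its input:

```python
# Carrying derivatives through an iterative solver: the iteration
# x_{k+1} = (x_k + a/x_k)/2 converges to sqrt(a); its differentiated
# iteration converges to d sqrt(a)/da = 1/(2*sqrt(a)).
def sqrt_with_derivative(a, iters=50):
    x, dx = a, 1.0          # dx tracks dx_k/da, seeded with da/da = 1
    for _ in range(iters):
        # Differentiate x_{k+1} = (x_k + a/x_k)/2 w.r.t. a:
        # d(a/x)/da = (x - a*dx) / x^2
        dx = (dx + (x - a * dx) / (x * x)) / 2.0
        x = (x + a / x) / 2.0
    return x, dx

x, dx = sqrt_with_derivative(4.0)
print(x, dx)  # ~2.0 and ~0.25 = 1/(2*sqrt(4))
```

Here the derivative iterates settle to the derivative of the implicitly defined solution; the paper analyzes when and how fast such convergence occurs in general.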
Efficient Derivative Codes Through Automatic Differentiation And Interface Contraction: An Application In Biostatistics
, 1997
"... . Developing code for computing the first and higherorder derivatives of a function by hand can be very timeconsuming and is prone to errors. Automatic differentiation has proven capable of producing derivative codes with very little effort on the part of the user. Automatic differentiation avoid ..."
Abstract

Cited by 16 (10 self)
Developing code for computing the first- and higher-order derivatives of a function by hand can be very time-consuming and is prone to errors. Automatic differentiation has proven capable of producing derivative codes with very little effort on the part of the user. Automatic differentiation avoids the truncation errors characteristic of divided-difference approximations. However, the derivative code produced by automatic differentiation can be significantly less efficient than one produced by hand. This shortcoming may be overcome by utilizing insight into the high-level structure of a computation. This paper focuses on how to take advantage of the fact that the number of variables passed between subroutines frequently is small compared with the number of variables with respect to which one wishes to differentiate. Such an "interface contraction," coupled with the associativity of the chain rule for differentiation, allows one to apply automatic differentiation in a more judicious f...
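The interface-contraction idea can be sketched with toy Jacobians: if f = g(h(x)) and h maps n inputs through a narrow interface of k values (k much smaller than n), one needs only k derivative directions through g, then chains the two Jacobians. The matrices below are illustrative, not from the paper.

```python
# Hypothetical interface-contraction sketch: chain a wide Jacobian dh/dx
# through a narrow interface (k = 1 value) into dg/dh, instead of
# propagating all n = 4 input directions through g.
def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

Jh = [[1.0, 2.0, 3.0, 4.0]]      # 1 x 4: dh/dx at the interface
Jg = [[5.0], [6.0]]              # 2 x 1: dg/dh, only k = 1 direction needed
Jf = matmul(Jg, Jh)              # 2 x 4 full Jacobian via the chain rule
print(Jf)  # [[5.0, 10.0, 15.0, 20.0], [6.0, 12.0, 18.0, 24.0]]
```

The saving is in differentiating g: with a contraction from n to k interface variables, the derivative work downstream of the interface scales with k rather than n.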
Applications of Automatic Differentiation in CFD
, 1994
"... Automated multidisciplinary design of aircraft requires the optimization of complex performance objectives with respect to a number of design parameters and constraints. The effect of these independent design variables on the system performance criteria can be quantified in terms of sensitivity deri ..."
Abstract

Cited by 11 (7 self)
Automated multidisciplinary design of aircraft requires the optimization of complex performance objectives with respect to a number of design parameters and constraints. The effect of these independent design variables on the system performance criteria can be quantified in terms of sensitivity derivatives for the individual discipline simulation codes. Typical advanced CFD codes do not provide such derivatives as part of a flow solution. These derivatives are expensive to obtain by divided differences from perturbed solutions, and may be unreliable, particularly for noisy functions. In this paper, automatic differentiation has been investigated as a means of extending iterative CFD codes with sensitivity derivatives. In particular, the ADIFOR automatic differentiator has been applied to the 3D, thin-layer Navier-Stokes, multigrid flow solver called TLNS3D coupled with the WTCO wing grid generator. Results of a sequence of efforts in which TLNS3D has been successfully augmented to ...
Application Of Automatic Differentiation To Groundwater Transport Models
, 1994
"... This paper describes the application of automatic differentiation to obtain codes that evaluate derivatives of complex computer models efficiently, exactly, and with a minimum of human effort. Automatic differentiation is a method that produces a derivative code, given the model code and a list of p ..."
Abstract

Cited by 10 (9 self)
This paper describes the application of automatic differentiation to obtain codes that evaluate derivatives of complex computer models efficiently, exactly, and with a minimum of human effort. Automatic differentiation is a method that produces a derivative code, given the model code and a list of parameters that are considered dependent and independent with respect to differentiation. The method produces a code that will evaluate derivatives exactly (up to machine precision), usually in much less time than the approximate finite-differences method. There are no inherent limits on program size or complexity. We applied the automatic differentiation tool ADIFOR (Automatic Differentiation in Fortran) to a two-dimensional and a three-dimensional groundwater flow and contaminant transport finite-element model to demonstrate the method. The CPU times for automatic differentiation were much faster than for the divided-differences method for both models, and somewhat slower than hand-written optimized analytic derivative code (written by
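The "exact up to machine precision" claim contrasts with divided differences, whose one-sided formula carries an O(h) truncation error and, for very small h, a growing cancellation error. A quick numerical illustration (generic, not from the groundwater models):

```python
# Why AD beats divided differences on accuracy: the one-sided difference
# (f(x+h) - f(x))/h has O(h) truncation error, and rounding error that
# grows as h shrinks; the chain-rule derivative is exact up to rounding.
import math

f = math.exp
x0 = 1.0
exact = math.exp(x0)                 # d/dx exp(x) = exp(x)
for h in (1e-2, 1e-4, 1e-8, 1e-12):
    approx = (f(x0 + h) - f(x0)) / h
    print(h, abs(approx - exact))    # error first shrinks, then grows
```

There is no step size h to tune in an AD-generated code, which is one reason the paper reports it as both faster and more reliable than divided differences.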
Efficient Computation of Gradients and Jacobians by Dynamic Exploitation of Sparsity in Automatic Differentiation
, 1996
"... . Automatic differentiation (AD) is a technique that augments computer codes with statements for the computation of derivatives. The computational workhorse of ADgenerated codes for firstorder derivatives is the linear combination of vectors. For many largescale problems, the vectors involved in ..."
Abstract

Cited by 6 (1 self)
Automatic differentiation (AD) is a technique that augments computer codes with statements for the computation of derivatives. The computational workhorse of AD-generated codes for first-order derivatives is the linear combination of vectors. For many large-scale problems, the vectors involved in this operation are inherently sparse. If the underlying function is a partially separable one (e.g., if its Hessian is sparse), many of the intermediate gradient vectors computed by AD will also be sparse, even though the final gradient is likely to be dense. For large Jacobian computations, every intermediate derivative vector is usually at least as sparse as the least sparse row of the final Jacobian. In this paper, we show that dynamic exploitation of the sparsity inherent in derivative computation can result in dramatic gains in runtime and memory savings. For a set of gradient problems exhibiting implicit sparsity, we report on the runtime and memory requirements of computing the gradi...
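The "linear combination of vectors" kernel described above can be sketched with a sparse representation: storing each derivative vector as an index-to-value map means w = a*u + b*v touches only the nonzeros, never a dense n-vector. This is a hedged stand-in for SparsLinC-style storage, not its actual data structure.

```python
# Sparse linear combination of two gradient vectors stored as
# {index: value} maps; only nonzeros are visited.
def sparse_axpy(a, u, b, v):
    w = {}
    for i, ui in u.items():
        w[i] = w.get(i, 0.0) + a * ui
    for i, vi in v.items():
        w[i] = w.get(i, 0.0) + b * vi
    return {i: x for i, x in w.items() if x != 0.0}

# Two intermediate gradients in a nominally huge variable space:
u = {7: 1.0, 100000: 2.0}
v = {7: 3.0}
print(sparse_axpy(2.0, u, 1.0, v))  # {7: 5.0, 100000: 4.0}
```

When most intermediate gradients have only a handful of nonzeros, this turns the per-statement derivative cost from O(n) into O(nonzeros), which is the source of the runtime and memory gains the paper measures.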
Efficient Computation of Gradients and Jacobians by Transparent Exploitation of Sparsity in Automatic Differentiation
 Optimization Methods and Software
, 1996
"... . Automatic differentiation (AD) is a technique that augments computer codes with statements for the computation of derivatives. The computational workhorse of ADgenerated codes for firstorder derivatives is the linear combination of vectors. For many largescale problems, the vectors involved in ..."
Abstract

Cited by 5 (4 self)
Automatic differentiation (AD) is a technique that augments computer codes with statements for the computation of derivatives. The computational workhorse of AD-generated codes for first-order derivatives is the linear combination of vectors. For many large-scale problems, the vectors involved in this operation are inherently sparse. If the underlying function is a partially separable one (e.g., if its Hessian is sparse), many of the intermediate gradient vectors computed by AD will also be sparse, even though the final gradient is likely to be dense. For large Jacobian computations, every intermediate derivative vector is usually at least as sparse as the least sparse row of the final Jacobian. In this paper, we show that transparent exploitation of the sparsity inherent in derivative computation can result in dramatic gains in runtime and memory savings. For a set of gradient problems exhibiting implicit sparsity, we report on the runtime and memory requirements of computing the g...
Using TAMC to generate efficient adjoint code: Comparison of automatically generated code for evaluation of first- and second-order derivatives to hand-written code from the MINPACK-2 collection
"... this document). Other tools operating in reverse mode are employing operator overloading capabilities of C ++ or Fortran90 (ADOLC, AD01, ADOLF, IMAS, OPTIMA90) [3]. TAMC (Tangent linear and Adjoint Model Compiler, [7]) is a sourcetosource translator for Fortran programs to generate derivative ..."
Abstract

Cited by 3 (1 self)
this document). Other tools operating in reverse mode employ the operator overloading capabilities of C++ or Fortran 90 (ADOL-C, AD01, ADOL-F, IMAS, OPTIMA90) [3]. TAMC (Tangent linear and Adjoint Model Compiler, [7]) is a source-to-source translator for Fortran programs to generate derivative-computing code operating in forward or reverse mode. The internal algorithms are based on a few principles suggested, e.g., by Talagrand [18]. These principles can be derived from the chain rule of differentiation [8]. TAMC applies a number of analyses and code normalizations similar to those applied by optimizing compilers (constant propagation, index variable substitution, data dependence analysis). In addition, given the top-level routine to be differentiated and the independent and dependent variables, TAMC applies a forward/reverse data flow analysis to detect all variables that depend on the independent variables and influence the dependent variables (active variables). This is in contrast to operator overloading based tools, where the user has to determine the active variables and declare them to be of a specific data type. TAMC can handle all but very few relevant Fortran 77 statements and an increasing number of Fortran 90 extensions; see the latest manual on the TAMC home page [6] for the current state of development.
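The structure of the adjoint code such a tool emits can be sketched by hand for one small function: a forward sweep records intermediates, then a reverse sweep propagates adjoints backwards, yielding all input sensitivities in a single pass. This mirrors the shape of reverse-mode output in general, not TAMC's actual generated source.

```python
# Hand-written reverse-mode (adjoint) sketch for y = x1*x2 + sin(x1):
# forward sweep stores intermediates, reverse sweep accumulates adjoints.
import math

def f_and_grad(x1, x2):
    # Forward sweep
    v1 = x1 * x2
    v2 = math.sin(x1)
    y = v1 + v2
    # Reverse sweep: seed dy/dy = 1, apply the chain rule backwards
    y_bar = 1.0
    v1_bar = y_bar                    # dy/dv1 = 1
    v2_bar = y_bar                    # dy/dv2 = 1
    x1_bar = v1_bar * x2 + v2_bar * math.cos(x1)
    x2_bar = v1_bar * x1
    return y, (x1_bar, x2_bar)

y, g = f_and_grad(2.0, 3.0)
print(y, g)  # f(2,3) = 6 + sin(2); grad = (3 + cos(2), 2)
```

The appeal of reverse mode for gradients, and hence of the comparison this abstract sets up, is that the whole gradient costs a small constant multiple of one function evaluation, regardless of the number of inputs.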