Results 1 - 4 of 4
On the implementation of automatic differentiation tools
Higher-Order and Symbolic Computation, 2004
Cited by 13 (2 self)
Abstract. Automatic differentiation is a semantic transformation that applies the rules of differential calculus to source code. It thus transforms a computer program that computes a mathematical function into a program that computes the function and its derivatives. Derivatives play an important role in a wide variety of scientific computing applications, including numerical optimization, solution of nonlinear equations, sensitivity analysis, and nonlinear inverse problems. We describe the forward and reverse modes of automatic differentiation and provide a survey of implementation strategies. We describe some of the challenges in the implementation of automatic differentiation tools, with a focus on tools based on source transformation. We conclude with an overview of current research and future opportunities.
Development and first applications of TAC++
in Utke (Eds.), Advances in Automatic Differentiation, 2008
Cited by 1 (0 self)
Summary. The paper describes the development of the software tool Transformation of Algorithms in C++ (TAC++) for automatic differentiation (AD) of C(++) codes by source-to-source translation. We have transferred to TAC++ a subset of the algorithms from its well-established Fortran equivalent, Transformation of Algorithms in Fortran (TAF). TAC++ features forward and reverse as well as scalar and vector modes of AD. Efficient higher-order derivative code is generated by multiple application of TAC++. High performance of the generated derivative code is demonstrated for five examples from application fields covering remote sensing, computer vision, computational finance, and aeronautics. For instance, the run time of the adjoints for simultaneous evaluation of the function and its gradient is between 1.9 and 3.9 times that of the respective function codes. Options for further enhancement are discussed.
Towards a tool for forward and reverse mode source to source transformation in C++
The paper presents a feasibility study for TAC++, the C++ equivalent of Transformation of Algorithms in Fortran (TAF). The goal of this study is to design an AD tool capable of generating the adjoint of a simple but nontrivial C code. The purpose of this new tool is threefold: First, we demonstrate the feasibility of reverse mode differentiation in C. Second, the tool serves as a starting point for development of TAC++. Third, the tool is valuable in itself, as it can differentiate simple C codes and can support hand-coders of large adjoint C/C++ codes. We have transferred a subset of TAF algorithms to the new tool, including the Efficient Recomputation Algorithm (ERA). Our test code is a C version of the Roe solver in the CFD code EULSOLDO. For a gradient evaluation, the automatically generated adjoint takes the CPU time of about 3.6 to 4.3 function evaluations.
An Enhanced Markowitz Rule for Accumulating Jacobian Matrices Efficiently, 2000
Jacobian matrices can be accumulated using either the forward or reverse mode of Automatic Differentiation. Alternatively, derivative code can be generated to compute the Jacobian matrix directly at the current argument. The minimization of the corresponding number of arithmetic operations leads to a computationally hard combinatorial optimization problem. A new powerful heuristic for its approximate solution is presented. The resulting codes lead to a speedup of three or more for most problems.