Results 1–10 of 13
Hooking Your Solver to AMPL
, 1997
Abstract

Cited by 28 (5 self)
This report tells how to make solvers work with AMPL's solve command. It describes an interface library, amplsolver.a, whose source is available from netlib. Examples include programs for listing LPs, automatic conversion to the LP dual (a shell script as solver), solvers for various nonlinear problems (with first and sometimes second derivatives computed by automatic differentiation), and getting C or Fortran 77 for nonlinear constraints, objectives, and their first derivatives. Drivers for various well-known linear, mixed-integer, and nonlinear solvers provide more examples.
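The LP-dual conversion mentioned here is a purely mechanical transformation; the sketch below (plain Python with a hypothetical helper name, not the amplsolver.a API) forms the dual of a primal in the form min cᵀx subject to Ax ≥ b, x ≥ 0.

```python
# Illustrative only: the LP-dual transformation behind the report's
# "shell script as solver" example.  Not the amplsolver.a API.
# Primal:  min c^T x   s.t.  A x >= b,  x >= 0
# Dual:    max b^T y   s.t.  A^T y <= c,  y >= 0

def lp_dual(c, A, b):
    """Return (dual objective, dual constraint matrix, dual rhs)."""
    At = [list(col) for col in zip(*A)]  # transpose of A
    return b, At, c

c = [3.0, 2.0]
A = [[1.0, 1.0],
     [2.0, 1.0]]
b = [4.0, 5.0]
dual_obj, dual_A, dual_rhs = lp_dual(c, A, b)
```

By weak duality, any feasible dual point gives a lower bound on the primal optimum, which is what makes this conversion useful as a quick sanity check on a solver.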
Second Order Information in Data Assimilation
, 2000
Abstract

Cited by 8 (6 self)
In variational data assimilation (VDA) for meteorological and/or oceanic models, the assimilated fields are deduced by combining the model and the gradient of a cost functional measuring the discrepancy between model solution and observations, via a first-order optimality system. However, existence and uniqueness of the VDA problem, along with convergence of the algorithms for its implementation, depend on the convexity of the cost function. Properties of local convexity can be deduced by studying the Hessian of the cost function in the vicinity of the optimum; hence the necessity of second-order information to ensure a unique solution to the VDA problem. In this paper we present a comprehensive review of issues related to second-order analysis of the VDA problem, along with many important issues closely connected to it. In particular, we study issues of existence, uniqueness, and regularization through second-order properties. We then focus on second-order information related to statistical properties, and on issues related to preconditioning, optimization methods, and second-order VDA analysis. Predictability and its relation to the structure of the Hessian of the cost functional is then discussed, along with issues of sensitivity analysis in the presence of data being assimilated. Computational complexity issues are also addressed and discussed. Automatic differentiation issues related to second-order information are also discussed, along with the computational complexity of deriving the second-order adjoint. Finally
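In the canonical linear-Gaussian case the cost is quadratic, J(x) = ½(x − x_b)ᵀB⁻¹(x − x_b) + ½(Hx − y)ᵀR⁻¹(Hx − y), and its Hessian is the constant matrix B⁻¹ + HᵀR⁻¹H; checking that this matrix is positive definite is exactly the convexity test the abstract refers to. A toy numpy sketch (illustrative matrices, not an operational assimilation system):

```python
import numpy as np

# Quadratic (linear-observation) VDA cost:
#   J(x) = 0.5 (x-xb)^T B^-1 (x-xb) + 0.5 (Hx-y)^T R^-1 (Hx-y)
# Its Hessian, B^-1 + H^T R^-1 H, is constant; if it is positive
# definite the cost is strictly convex and the minimizer is unique.

B = np.diag([2.0, 2.0])      # background-error covariance (toy)
R = np.diag([1.0])           # observation-error covariance (toy)
H = np.array([[1.0, 0.0]])   # linear observation operator (toy)

hessian = np.linalg.inv(B) + H.T @ np.linalg.inv(R) @ H
eigvals = np.linalg.eigvalsh(hessian)
convex = bool(np.all(eigvals > 0))  # positive definite => unique analysis
```

The eigenvalue spread of this Hessian also governs the conditioning issues the paper discusses under preconditioning.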
Extending an algebraic modeling language to support constraint programming
 INFORMS Journal on Computing
, 2001
Abstract

Cited by 8 (3 self)
Although algebraic modeling languages are widely used in linear and nonlinear programming applications, their use for combinatorial or discrete optimization has largely been limited to developing integer linear programming models for solution by general-purpose branch-and-bound procedures. Yet much of a modeling language's underlying structure for expressing integer programs is equally useful for describing more general combinatorial optimization constructs. Constraint programming solvers offer an alternative approach to solving combinatorial optimization problems, in which natural combinatorial constructs are addressed directly within the solution procedure. Hence the growing popularity of constraint programming motivates a variety of extensions to algebraic modeling languages for the purpose of describing combinatorial problems and conveying them to solvers. We examine some of these language extensions along with the significant changes in solver interface design that they require. In particular, we describe how several useful combinatorial features have been added to the AMPL modeling language and how AMPL's general-purpose solver interface has been adapted accordingly. As an illustration of a solver connection, we provide examples from an AMPL driver for ILOG Solver. This work has been supported in part by Bell Laboratories and by grants DMI-9414487
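The contrast the abstract draws can be made concrete with the alldifferent construct: a constraint solver treats it as a single global constraint, while an integer-programming model must encode it with binary assignment variables. A toy Python sketch of the two views (illustrative only, not AMPL or ILOG Solver syntax):

```python
# Toy contrast between a native combinatorial construct and its
# integer-programming encoding (illustrative Python, not AMPL syntax).

def alldifferent(xs):
    """The CP view: one global constraint over the whole tuple."""
    return len(set(xs)) == len(xs)

def ip_alldifferent(xs, values):
    """The IP view: binary variables z[i][v] = 1 iff x_i takes value v,
    with each variable assigned once and each value used at most once."""
    z = [[1 if x == v else 0 for v in values] for x in xs]
    each_var_once = all(sum(row) == 1 for row in z)
    each_val_once = all(sum(z[i][j] for i in range(len(xs))) <= 1
                        for j in range(len(values)))
    return each_var_once and each_val_once

feasible = (2, 0, 1)
infeasible = (1, 1, 2)
```

Both formulations accept and reject the same assignments; the difference is that the CP solver can propagate the global constraint directly, while the IP encoding multiplies the variable count.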
Second-Order Information in Data Assimilation
, 2002
Abstract

Cited by 8 (2 self)
In variational data assimilation (VDA) for meteorological and/or oceanic models, the assimilated fields are deduced by combining the model and the gradient of a cost functional measuring the discrepancy between model solution and observations, via a first-order optimality system. However, existence and uniqueness of the VDA problem, along with convergence of the algorithms for its implementation, depend on the convexity of the cost function. Properties of local convexity can be deduced by studying the Hessian of the cost function in the vicinity of the optimum. This shows the necessity of second-order information to ensure a unique solution to the VDA problem.
Semiautomatic Differentiation for Efficient Gradient Computations
Abstract

Cited by 6 (1 self)
This paper concerns ongoing work; it compares several implementations of backward AD, describes a simple operator-overloading implementation specialized for gradient computations, and compares the implementations on some mesh-optimization examples. Ideas from the specialized implementation could be used in fully general source-to-source translators for C and C++.
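A backward-AD implementation by operator overloading can be sketched in a few lines; the toy below (Python, not the paper's implementation) records local partial derivatives as expressions are evaluated, then sweeps the recorded graph in reverse to accumulate a gradient.

```python
# Minimal operator-overloading reverse-mode AD sketch for gradients.
# Illustrative only: supports + and *, and assumes each intermediate
# result is used once (a full implementation sweeps the tape in
# reverse topological order to handle arbitrary fan-out).

class Var:
    def __init__(self, value, parents=()):
        self.value, self.parents, self.grad = value, parents, 0.0

    def __add__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

def backward(out):
    """Reverse sweep: propagate adjoints from the output to the leaves."""
    out.grad = 1.0
    stack = [out]
    while stack:
        node = stack.pop()
        for parent, partial in node.parents:
            parent.grad += node.grad * partial
            stack.append(parent)

x, y = Var(3.0), Var(4.0)
f = x * y + x      # f = x*y + x;  df/dx = y + 1,  df/dy = x
backward(f)
```

One reverse sweep yields all partial derivatives at once, which is why backward AD computes a gradient at a small constant multiple of the cost of evaluating the function itself.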
Large Scale Unconstrained Optimization
 The State of the Art in Numerical Analysis
, 1996
Abstract

Cited by 6 (0 self)
This paper reviews advances in Newton, quasi-Newton, and conjugate gradient methods for large-scale optimization. It also describes several packages developed during the last ten years, and illustrates their performance on some practical problems. Much attention is given to the concept of partial separability, which is gaining importance with the arrival of automatic differentiation tools and of optimization software that fully exploits its properties.
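Partial separability means the objective is a sum of element functions, each depending on only a few variables, so the full gradient can be assembled from small element gradients. A finite-difference toy (illustrative only; the packages the paper reviews use analytic or AD derivatives):

```python
# Partially separable f(x) = sum of element functions, each touching
# only a few variables.  The dense gradient is assembled from cheap
# per-element gradients over just those variables.

def grad_partially_separable(elements, x, h=1e-6):
    """elements: list of (func, index_list); returns gradient of the sum
    by forward differences on each small element function."""
    g = [0.0] * len(x)
    for func, idx in elements:
        sub = [x[i] for i in idx]
        base = func(sub)
        for k, i in enumerate(idx):
            bumped = list(sub)
            bumped[k] += h
            g[i] += (func(bumped) - base) / h
    return g

# f(x) = (x0 - x1)^2 + (x1 + x2)^2: two elements, two variables each.
elements = [
    (lambda v: (v[0] - v[1]) ** 2, [0, 1]),
    (lambda v: (v[0] + v[1]) ** 2, [1, 2]),
]
g = grad_partially_separable(elements, [1.0, 2.0, 3.0])
# exact gradient: [2(x0-x1), -2(x0-x1) + 2(x1+x2), 2(x1+x2)]
```

The same element structure makes Hessians block-sparse, which is what the quasi-Newton updates exploited by partially separable codes rely on.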
Symbolic-Algebraic Computations in a Modeling Language for Mathematical Programming
, 2000
Abstract

Cited by 2 (0 self)
This paper was written for the proceedings of a seminar on "Symbolic-algebraic
Automatically finding and exploiting partially separable structure in nonlinear programming problems
 Bell Laboratories
, 1996
Abstract

Cited by 2 (1 self)
Nonlinear programming problems often involve an objective and constraints that are partially separable — the sum of terms involving only a few variables (perhaps after a linear change of variables). This paper discusses finding and exploiting such structure in nonlinear programming problems expressed symbolically in the AMPL modeling language. For some computations, such as computing Hessians by backwards automatic differentiation, exploiting partial separability can give significant speedups.

Overview. To set the context for this paper, it is necessary to talk about various aspects of nonlinear programming problems and automatic differentiation. Accordingly, it is convenient to begin with brief overviews of Newton's method, nonlinear programming, and automatic differentiation. Since I report computational experience with problems expressed symbolically in the AMPL modeling language, a brief account of AMPL is also appropriate. In the initial overviews, I will omit most references.

Newton's Method for Nonlinear Equations. Newton's method is in some ways an ideal algorithm for solving systems of nonlinear equations. It is easily derived by a linearization argument, and it converges quickly when started close to a "strong" solution. As a simple example, Table 1 shows the sequence of residual errors for a square-root iteration; note how the residuals are approximately squared in successive iterations ("quadratic convergence"). In more detail, if f: R^n → R^n is a differentiable mapping of real n-space to itself, then f(y) ≈ f(x) + f′(x)(y − x), where f′(x) is the Jacobian matrix of f at x. So if f′(x) is nonsingular and y = x − f′(x)⁻¹ f(x), then f(y) ≈ 0, which gives Newton's method: (1) x_{k+1} = x_k − f′(x_k)⁻¹ f(x_k). To carry out a step of Newton's method, it is of course not necessary to explicitly form f′(x_k)⁻¹; rather it suffices to solve
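The square-root iteration referred to above is easy to reproduce; the sketch below (plain Python, standing in for the paper's Table 1) applies the Newton step x_{k+1} = x_k − f′(x_k)⁻¹ f(x_k) to f(x) = x² − a and records the residuals, which are roughly squared at each step.

```python
# Newton's method for f(x) = x^2 - a: the square-root iteration
# whose residual sequence illustrates quadratic convergence.

def newton_sqrt(a, x0, steps=6):
    """Return the final iterate and the residuals |x_k^2 - a|."""
    residuals = []
    x = x0
    for _ in range(steps):
        residuals.append(abs(x * x - a))
        x = x - (x * x - a) / (2.0 * x)  # x_{k+1} = x_k - f(x_k)/f'(x_k)
    return x, residuals

root, res = newton_sqrt(2.0, x0=1.0)
```

Starting from x₀ = 1, the residuals fall roughly as 1, 0.25, 0.007, 6e-6, ... — each one approximately the square of its predecessor, as the linearization argument predicts.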
Conveying Problem Structure from an Algebraic Modeling Language to Optimization Algorithms
Abstract

Cited by 1 (0 self)
Optimization algorithms can exploit problem structures of various kinds, such as sparsity of derivatives, complementarity conditions, block structure, stochasticity, priorities for discrete variables, and information about piecewise-linear terms. Moreover, some algorithms deduce additional structural information that may help the modeler. We review and discuss some ways of conveying structure, with examples from our designs for the AMPL modeling language. We show in particular how "declared suffixes" provide a new and useful way to express structure and acquire solution information.

1. INTRODUCTION. A modeling language can provide a useful way to express the elaborate optimization problems that often arise in practice. Many of these problems have structure that an optimization algorithm can exploit, such as sparsity of first and second derivatives, complementarity conditions, block structure, time-dependent stochasticity, priorities for discrete variables, and information about piecewise-linear terms.
DESIGN PRINCIPLES AND NEW DEVELOPMENTS IN THE AMPL MODELING LANGUAGE
"... The design of the AMPL modeling language stresses naturalness of expressions, generality of iterating over sets, separation of model and data, ease of data manipulation, and automatic updating of derived values when fundamental values change. We show how such principles have guided the addition of d ..."
Abstract
The design of the AMPL modeling language stresses naturalness of expressions, generality of iterating over sets, separation of model and data, ease of data manipulation, and automatic updating of derived values when fundamental values change. We show how such principles have guided the addition of database access, complementarity modeling, and other language features.