Results 1–10 of 25
SNOPT: An SQP Algorithm for Large-Scale Constrained Optimization
, 1997
Cited by 332 (18 self)
Sequential quadratic programming (SQP) methods have proved highly effective for solving constrained optimization problems with smooth nonlinear functions in the objective and constraints. Here we consider problems with general inequality constraints (linear and nonlinear). We assume that first derivatives are available, and that the constraint gradients are sparse.
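The basic SQP iteration behind solvers of this kind can be sketched in a few lines. The toy code below (all names and the test problem are ours, not SNOPT's) solves a smooth problem with a single equality constraint by solving the KKT system of a local QP model at each step; SNOPT itself adds limited-memory quasi-Newton Hessians, sparse QP machinery, inequality handling, and a merit-function line search, none of which appear here:

```python
import numpy as np

def sqp_equality(grad_f, hess_L, c, jac_c, x0, lam0=0.0, tol=1e-10, max_iter=30):
    # Full-step SQP sketch for min f(x) s.t. c(x) = 0 (one constraint).
    # Each iteration solves the QP's KKT system [H A^T; A 0][p; lam] = [-g; -c].
    x, lam = np.asarray(x0, dtype=float), float(lam0)
    n = x.size
    for _ in range(max_iter):
        g, A = grad_f(x), jac_c(x)           # objective gradient, constraint Jacobian
        if np.linalg.norm(g + lam * A) < tol and abs(c(x)) < tol:
            break                            # KKT conditions satisfied
        H = hess_L(x, lam)                   # Hessian of the Lagrangian
        K = np.block([[H, A.reshape(n, 1)],
                      [A.reshape(1, n), np.zeros((1, 1))]])
        sol = np.linalg.solve(K, np.concatenate([-g, [-c(x)]]))
        x, lam = x + sol[:n], sol[n]         # step and new multiplier estimate
    return x, lam
```

For example, minimizing x1^2 + x2^2 subject to x1^2 + x2 = 1 from (1, 1) converges quadratically to x = (sqrt(1/2), 1/2) with multiplier -1.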
Fast Contact Force Computation for Nonpenetrating Rigid Bodies
, 1994
Cited by 213 (6 self)
A new algorithm for computing contact forces between solid objects with friction is presented. The algorithm allows a mix of contact points with static and dynamic friction. In contrast to previous approaches, the problem of computing contact forces is not transformed into an optimization problem. Because of this, the need for sophisticated optimization software packages is eliminated. For both systems with and without friction, the algorithm has proven to be considerably faster, simpler, and more reliable than previous approaches to the problem. In particular, implementation of the algorithm by nonspecialists in numerical programming is quite feasible.
ON PROJECTED NEWTON BARRIER METHODS FOR LINEAR PROGRAMMING AND AN EQUIVALENCE TO KARMARKAR'S PROJECTIVE METHOD
, 1986
Cited by 67 (8 self)
Interest in linear programming has been intensified recently by Karmarkar's publication in 1984 of an algorithm that is claimed to be much faster than the simplex method for practical problems. We review classical barrier-function methods for nonlinear programming based on applying a logarithmic transformation to inequality constraints. For the special case of linear programming, the transformed problem can be solved by a "projected Newton barrier" method. This method is shown to be equivalent to Karmarkar's projective method for a particular choice of the barrier parameter. We then present details of a specific barrier algorithm and its practical implementation. Numerical results are given for several nontrivial test problems, and the implications for future developments in linear programming are discussed.
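The logarithmic transformation the abstract describes can be illustrated compactly. The sketch below (numpy; all names and parameter choices are ours) applies Newton's method to t*c@x - sum(log(b - A@x)) for an inequality-form LP and increases t between centerings; it is not the paper's projected Newton barrier method, which works with equality-form problems and a projected Newton step:

```python
import numpy as np

def lp_barrier(c, A, b, x0, t=1.0, mu=10.0, tol=1e-8, max_newton=50):
    # Minimize c @ x subject to A @ x <= b via a logarithmic barrier.
    # x0 must be strictly feasible (all slacks b - A @ x0 positive).
    x = np.asarray(x0, dtype=float)
    m = len(b)
    while m / t > tol:                        # m/t bounds the duality gap
        for _ in range(max_newton):
            s = b - A @ x                     # slacks, must stay positive
            grad = t * c + A.T @ (1.0 / s)
            H = A.T @ np.diag(1.0 / s**2) @ A
            dx = np.linalg.solve(H, -grad)
            step = 1.0                        # backtrack to remain strictly feasible
            while np.any(b - A @ (x + step * dx) <= 0):
                step *= 0.5
            x = x + step * dx
            if -grad @ dx / 2 < 1e-12:        # Newton decrement is tiny: centered
                break
        t *= mu
    return x
```

On the toy LP "minimize x1 + 2*x2 with x1 >= 0, x2 >= 0, x1 + x2 >= 1" this recovers the optimal vertex (1, 0).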
Interfaces to PATH 3.0: Design, Implementation and Usage
 Computational Optimization and Applications
, 1998
Cited by 47 (17 self)
Several new interfaces have recently been developed requiring PATH to solve a mixed complementarity problem. To overcome the necessity of maintaining a different version of PATH for each interface, the code was reorganized using object-oriented design techniques. At the same time, robustness issues were considered and enhancements made to the algorithm. In this paper, we document the external interfaces to the PATH code and describe some of the new utilities using PATH. We then discuss the enhancements made and compare the results obtained from PATH 2.9 to the new version.
1 Introduction
The PATH solver [12] for mixed complementarity problems (MCPs) was introduced in 1995 and has since become the standard against which new MCP solvers are compared. However, the main user group for PATH continues to be economists using the MPSGE preprocessor [36]. While developing the new PATH implementation, we had two goals: to make the solver accessible to a broad audience and to improve the effecti...
On the solution of equality constrained quadratic programming problems arising . . .
, 1998
The Semismooth Algorithm for Large Scale Complementarity Problems
, 1999
Cited by 19 (7 self)
Complementarity solvers are continually being challenged by modelers demanding improved reliability and scalability. Building upon a strong theoretical background, the semismooth algorithm has the potential to meet both of these requirements. We briefly discuss relevant theory associated with the algorithm and describe a sophisticated implementation in detail. Particular emphasis is given to robust methods for dealing with singularities in the linear system and to large-scale issues. Results on the MCPLIB test suite indicate that the code is robust and has the potential to solve very large problems.
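The core of such semismooth methods is a nonsmooth-Newton iteration on a complementarity reformulation. The sketch below (numpy; all names and the test problem are ours) uses the Fischer-Burmeister function, whose zeros are exactly the solutions of the complementarity conditions; it omits the line search, singularity safeguards, and sparse linear algebra the abstract emphasizes, so it is illustrative only:

```python
import numpy as np

def fischer_burmeister(a, b):
    # phi(a, b) = sqrt(a^2 + b^2) - a - b; zero iff a >= 0, b >= 0, a*b = 0.
    return np.sqrt(a**2 + b**2) - a - b

def semismooth_newton_ncp(F, J, x0, tol=1e-10, max_iter=50):
    # Solve the complementarity problem 0 <= x, F(x) >= 0, x.F(x) = 0
    # by a semismooth Newton method on Phi(x) = phi(x, F(x)) = 0.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        Phi = fischer_burmeister(x, Fx)
        if np.linalg.norm(Phi) < tol:
            break
        r = np.sqrt(x**2 + Fx**2)
        r = np.where(r == 0.0, 1.0, r)       # valid Clarke-Jacobian choice at the kink
        Da = x / r - 1.0                     # d phi / d a
        Db = Fx / r - 1.0                    # d phi / d b
        H = np.diag(Da) + np.diag(Db) @ J(x) # generalized Jacobian of Phi
        x = x + np.linalg.solve(H, -Phi)
    return x
```

On the small linear complementarity problem F(x) = M x + q with M = [[2,1],[1,2]] and q = [-3,1], the iteration converges to the solution x = (1.5, 0).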
Numerical Optimal Control Of Parabolic PDEs Using DASOPT
, 1997
Cited by 11 (6 self)
This paper gives a preliminary description of DASOPT, a software system for the optimal control of processes described by time-dependent partial differential equations (PDEs). DASOPT combines the use of efficient numerical methods for solving differential-algebraic equations (DAEs) with a package for large-scale optimization based on sequential quadratic programming (SQP). DASOPT is intended for the computation of the optimal control of time-dependent nonlinear systems of PDEs in two (and eventually three) spatial dimensions, including possible inequality constraints on the state variables. By the use of either finite-difference or finite-element approximations to the spatial derivatives, the PDEs are converted into a large system of ODEs or DAEs. Special techniques are needed in order to solve this very large optimal control problem. The use of DASOPT is illustrated by its application to a nonlinear parabolic PDE boundary control problem in two spatial dimensions. Computational resu...
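The semi-discretization step the abstract describes (converting a PDE into a system of ODEs) is the classical method of lines. A minimal sketch for the 1-D heat equation u_t = alpha * u_xx with zero Dirichlet boundaries (numpy; all names ours, and integrating with forward Euler rather than the DAE integrator DASOPT couples to an SQP optimizer):

```python
import numpy as np

def heat_method_of_lines(u0, dx, t_end, alpha=1.0):
    # Second-order central differences in space turn the PDE into the ODE
    # system du/dt = alpha * L(u); forward Euler then marches it in time.
    u = np.array(u0, dtype=float)
    dt = 0.4 * dx**2 / alpha              # respect the explicit stability limit dt <= dx^2/2
    steps = int(np.ceil(t_end / dt))
    dt = t_end / steps
    for _ in range(steps):
        lap = np.zeros_like(u)
        lap[1:-1] = (u[:-2] - 2 * u[1:-1] + u[2:]) / dx**2
        u = u + dt * alpha * lap          # boundary entries stay fixed
    return u
```

With the initial profile sin(pi*x) on [0, 1], the computed solution tracks the exact decay exp(-pi^2 t) * sin(pi*x) to a few parts in a thousand on a 51-point grid.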
Global Optimization of Nonconvex Nonlinear Programs Using Parallel Branch and Bound
, 1995
Cited by 9 (0 self)
A branch and bound algorithm for computing globally optimal solutions to nonconvex nonlinear programs in continuous variables is presented. The algorithm is directly suitable for a wide class of problems arising in chemical engineering design. It can solve problems defined using algebraic functions and twice differentiable transcendental functions, in which finite upper and lower bounds can be placed on each variable. The algorithm uses rectangular partitions of the variable domain and a new bounding program based on convex/concave envelopes and positive definite combinations of quadratic terms. The algorithm is deterministic and obtains convergence with final regions of finite size. The partitioning strategy uses a sensitivity analysis of the bounding program to predict the best variable to split and the split location. Two versions of the algorithm are considered, the first using a local NLP algorithm (MINOS) and the second using a sequence of lower bounding programs in the search fo...
A Build-Up Interior-Point Method for Linear Programming: Affine Scaling Form
, 1991
Cited by 8 (0 self)
We propose a build-up interior-point method for solving an m equation n variable linear program which has the same convergence properties as its well known analogue, Dikin's algorithm in dual affine form, but may require less computational effort. It differs from Dikin's algorithm in that the ellipsoid chosen to generate the improving direction in dual space is constructed from only a subset of the dual constraints. We also present some preliminary computational results for this method.
Key words: linear programming, interior method, affine scaling, active column.
Abbreviated title: A Build-Up Interior-Point Method for LP
† Department of Operations Research, Stanford University, Stanford, CA 94305
‡ Department of Management Sciences, The University of Iowa, Iowa City, IA 52242. Research supported in part by NSF Grant DDM-8922636 and an Interdisciplinary Research Grant from The University of Iowa Center for Advanced Studies.
1. Steps of Dikin's Algorithm
We are concerned with ...
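A basic iteration from the affine-scaling family that Dikin's method belongs to can be sketched as follows. This is the primal form on a standard-form LP (numpy; all names and the step rule are ours), not the paper's dual build-up variant, which constructs its ellipsoid from only a subset of the dual constraints:

```python
import numpy as np

def affine_scaling(A, b, c, x0, alpha=0.5, tol=1e-9, max_iter=500):
    # Primal affine scaling for min c @ x s.t. A @ x == b, x >= 0.
    # x0 must satisfy A @ x0 == b with x0 > 0 componentwise.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        X2 = np.diag(x**2)
        y = np.linalg.solve(A @ X2 @ A.T, A @ X2 @ c)   # dual estimate
        z = c - A.T @ y                                  # reduced costs
        dx = -X2 @ z                    # steepest descent in the scaled space
        if np.linalg.norm(dx) < tol * (1 + np.linalg.norm(x)):
            break
        neg = dx < 0
        if not np.any(neg):
            raise ValueError("problem appears unbounded")
        # move a fraction alpha of the way to the nearest boundary x_i = 0
        step = alpha * np.min(-x[neg] / dx[neg])
        x = x + step * dx
    return x
```

For instance, on "maximize x1 + 2*x2 subject to x1 + x2 <= 1, x2 <= 0.6, x >= 0" written in standard form with slack variables, the iterates approach the optimal vertex (x1, x2) = (0.4, 0.6).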
Optimal Sequential Exploration: Bandits, Clairvoyants, and Wildcats. Submitted; accessible at http://faculty.fuqua.duke.edu/ jes9/bio/OptimalSequentialExplorationBCW.pdf
, 2012
Cited by 4 (0 self)
This paper was motivated by the problem of developing an optimal strategy for exploring a large oil and gas field in the North Sea. Where should we drill first? Where do we drill next? The problem resembles a classical multi-armed bandit problem, but probabilistic dependence plays a key role: outcomes at drilled sites reveal information about neighboring targets. Good exploration strategies will take advantage of this information as it is revealed. We develop heuristic policies for sequential exploration problems and complement these heuristics with upper bounds on the performance of an optimal policy. We begin by grouping the targets into clusters of manageable size. The heuristics are derived from a model that treats these clusters as independent. The upper bounds are given by assuming each cluster has perfect information about the results from all other clusters. The analysis relies heavily on results for bandit superprocesses, a generalization of the classical multi-armed bandit problem. We evaluate the heuristics and bounds using Monte Carlo simulation and, in our problem, we find that the heuristic policies are nearly optimal.