Results 1–10 of 39
On Augmented Lagrangian methods with general lower-level constraints
, 2005
Cited by 55 (6 self)
Augmented Lagrangian methods with general lower-level constraints are considered in the present research. These methods are useful when efficient algorithms exist for solving subproblems in which the constraints are only of the lower-level type. Two methods of this class are introduced and analyzed. Inexact resolution of the lower-level constrained subproblems is considered. Global convergence is proved using the Constant Positive Linear Dependence constraint qualification. Conditions for boundedness of the penalty parameters are discussed. The reliability of the approach is tested by means of an exhaustive comparison against Lancelot. All the problems of the Cute collection are used in this comparison. Moreover, the resolution of location problems in which many constraints of the lower-level set are nonlinear is addressed, employing the Spectral Projected Gradient method for solving the subproblems. Problems of this type with more than 3 × 10^6 variables and 14 × 10^6 constraints are solved in this way, using moderate computer time.
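The outer loop of such methods — minimize the augmented Lagrangian for fixed multipliers, then update the multipliers — can be sketched as follows. This is a minimal illustration of the classical first-order multiplier update on a hypothetical one-dimensional problem, not the paper's algorithm (which solves lower-level constrained subproblems and adapts the penalty parameter):

```python
# Minimal sketch of a classical (PHR) augmented Lagrangian outer loop
# for one scalar equality constraint c(x) = 0; the problem, step size,
# and iteration counts are illustrative.

def augmented_lagrangian(f_grad, c, c_grad, x, lam=0.0, rho=10.0,
                         outer_iters=50, inner_iters=200, step=0.01):
    for _ in range(outer_iters):
        # Inner phase: approximately minimize the augmented Lagrangian
        #   L(x) = f(x) + lam * c(x) + (rho / 2) * c(x)**2
        # by plain gradient descent.  In the paper's setting this inner
        # solve would instead be a method that handles the lower-level
        # constraints directly (e.g. a projected-gradient solver).
        for _ in range(inner_iters):
            g = f_grad(x) + (lam + rho * c(x)) * c_grad(x)
            x = x - step * g
        # First-order multiplier update; practical codes also increase
        # rho when feasibility stagnates (kept fixed here for brevity).
        lam = lam + rho * c(x)
    return x, lam

# Toy problem: minimize (x - 2)^2 subject to x - 1 = 0; the solution
# is x* = 1 with multiplier lam* = 2.
x_star, lam_star = augmented_lagrangian(
    f_grad=lambda x: 2.0 * (x - 2.0),
    c=lambda x: x - 1.0,
    c_grad=lambda x: 1.0,
    x=0.0,
)
```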
KNITRO: An integrated package for nonlinear optimization
 Large Scale Nonlinear Optimization, 35–59, 2006
Cited by 52 (3 self)
This paper describes Knitro 5.0, a C package for nonlinear optimization that combines complementary approaches to nonlinear optimization to achieve robust performance over a wide range of application requirements. The package is designed for solving large-scale, smooth nonlinear programming problems, and it is also effective for the following special cases: unconstrained optimization, nonlinear systems of equations, least squares, and linear and quadratic programming. Various algorithmic options are available, including two interior methods and an active-set method. The package provides crossover techniques between algorithmic options as well as automatic selection of options and settings.
A fast algorithm for sparse reconstruction based on shrinkage, subspace optimization and continuation
 SIAM Journal on Scientific Computing
, 2010
Cited by 29 (7 self)
Abstract. We propose a fast algorithm for solving the ℓ1-regularized minimization problem min_{x∈R^n} µ‖x‖_1 + ‖Ax − b‖_2^2 for recovering sparse solutions to an underdetermined system of linear equations Ax = b. The algorithm is divided into two stages that are performed repeatedly. In the first stage a first-order iterative method called “shrinkage” yields an estimate of the subset of components of x likely to be nonzero in an optimal solution. Restricting the decision variables x to this subset and fixing their signs at their current values reduces the ℓ1-norm ‖x‖_1 to a linear function of x. The resulting subspace problem, which involves the minimization of a smaller and smooth quadratic function, is solved in the second phase. Our code FPC_AS embeds this basic two-stage algorithm in a continuation (homotopy) approach by assigning a decreasing sequence of values to µ. This code exhibits state-of-the-art performance both in terms of its speed and its ability to recover sparse signals. It can even recover signals that are not as sparse as required by current compressive sensing theory.
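The first-stage "shrinkage" iteration is plain iterative soft-thresholding; a minimal sketch on a hypothetical 1×2 underdetermined system follows (all values are illustrative, not from the FPC_AS code):

```python
import math

def shrink(v, tau):
    # Componentwise soft-thresholding ("shrinkage"): the proximal
    # operator of tau * ||x||_1; entries below the threshold snap to 0.
    return [math.copysign(max(abs(x) - tau, 0.0), x) for x in v]

# Hypothetical underdetermined system A x = b with A = [1, 2], b = 2:
# the l1 term selects a sparse point on the line of solutions.
A = [1.0, 2.0]
b = 2.0
mu = 0.4     # l1 weight (illustrative value)
t = 0.05     # step size; must stay below 1 / ||2 A^T A|| = 0.1 here

x = [0.0, 0.0]
for _ in range(500):
    r = A[0] * x[0] + A[1] * x[1] - b           # residual A x - b
    grad = [2.0 * A[0] * r, 2.0 * A[1] * r]     # gradient of ||A x - b||_2^2
    x = shrink([x[i] - t * grad[i] for i in range(2)], t * mu)

# Stage one of the two-stage method reads the working support off the
# iterate; stage two would then solve a smooth quadratic restricted to it.
support = [i for i, xi in enumerate(x) if xi != 0.0]
# x converges to [0, 1 - mu/8] = [0, 0.95], so support == [1]
```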
Steering Exact Penalty Methods for Nonlinear Programming
, 2007
Cited by 10 (0 self)
This paper reviews, extends and analyzes a new class of penalty methods for nonlinear optimization. These methods adjust the penalty parameter dynamically; by controlling the degree of linear feasibility achieved at every iteration, they promote balanced progress toward optimality and feasibility. In contrast with classical approaches, the choice of the penalty parameter ceases to be a heuristic and is determined, instead, by a subproblem with clearly defined objectives. The new penalty update strategy is presented in the context of sequential quadratic programming (SQP) and sequential linear-quadratic programming (SLQP) methods that use trust regions to promote convergence. The paper concludes with a discussion of penalty parameters for merit functions used in line search methods.
A Sequential Quadratic Programming Algorithm with an Additional Equality Constrained Phase
, 2008
Cited by 7 (1 self)
A sequential quadratic programming (SQP) method is presented that aims to overcome some of the drawbacks of contemporary SQP methods. It avoids the difficulties associated with indefinite quadratic programming subproblems by defining this subproblem to be always convex. The novel feature of the approach is the addition of an equality constrained phase that promotes fast convergence and improves performance in the presence of ill-conditioning. This equality constrained phase uses exact second-order information and can be implemented using either a direct solve or an iterative method. The paper studies the global and local convergence properties of the new algorithm and presents a set of numerical experiments to illustrate its practical performance.
Nonlinear programming techniques for operative planning in large drinking water networks
, 2005
Cited by 5 (2 self)
Mathematical decision support for operative planning in water supply systems is highly desirable but leads to very difficult optimization problems. We propose a nonlinear programming approach that yields practically satisfactory operating schedules in acceptable computing time even for large networks. Based on a carefully designed model supporting gradient-based optimization algorithms, this approach employs a special initialization strategy for convergence acceleration, special minimum up- and down-time constraints together with pump aggregation to handle switching decisions, and several network reduction techniques for further speedup. Results for selected application scenarios at Berliner Wasserbetriebe demonstrate the success of the approach.
Improving ultimate convergence of an Augmented Lagrangian method
, 2007
Cited by 5 (0 self)
Optimization methods that employ the classical Powell-Hestenes-Rockafellar Augmented Lagrangian are useful tools for solving Nonlinear Programming problems. Their reputation decreased in the last ten years due to the comparative success of Interior-Point Newtonian algorithms, which are asymptotically faster. In the present research a combination of both approaches is evaluated. The idea is to produce a competitive method, being more robust and efficient than its “pure” counterparts for critical problems. Moreover, an additional hybrid algorithm is defined, in which the Interior Point method is replaced by the Newtonian resolution of a KKT system identified by the Augmented Lagrangian algorithm. The software used in this work is freely available through the Tango Project web page:
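The "Newtonian resolution of a KKT system" mentioned above can be illustrated on a toy equality-constrained quadratic, where a single Newton step on the KKT conditions is exact; the problem, starting point, and helper names below are hypothetical, not the Tango implementation:

```python
def solve3(M, r):
    # Naive Gaussian elimination with partial pivoting for 3x3 systems.
    n = 3
    a = [row[:] + [r[i]] for i, row in enumerate(M)]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(a[i][k]))
        a[k], a[p] = a[p], a[k]
        for i in range(k + 1, n):
            f = a[i][k] / a[k][k]
            for j in range(k, n + 1):
                a[i][j] -= f * a[k][j]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (a[i][n] - sum(a[i][j] * x[j] for j in range(i + 1, n))) / a[i][i]
    return x

# Toy problem: min (x1 - 1)^2 + (x2 - 2)^2  s.t.  x1 + x2 - 1 = 0.
def newton_kkt_step(x1, x2, lam):
    # KKT residual: gradient of the Lagrangian, then the constraint.
    F = [2.0 * (x1 - 1.0) + lam,
         2.0 * (x2 - 2.0) + lam,
         x1 + x2 - 1.0]
    # KKT (Jacobian) matrix [hess_L, grad_c; grad_c^T, 0] -- constant
    # for a quadratic objective and linear constraint.
    K = [[2.0, 0.0, 1.0],
         [0.0, 2.0, 1.0],
         [1.0, 1.0, 0.0]]
    d = solve3(K, [-v for v in F])
    return x1 + d[0], x2 + d[1], lam + d[2]

# Because the toy problem is a QP, one Newton step from the iterate the
# Augmented Lagrangian phase hands over is exact: x* = (0, 1), lam* = 2.
x1, x2, lam = newton_kkt_step(0.5, 0.0, 0.0)
```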
Infeasibility Detection and SQP Methods for Nonlinear Optimization
, 2008
Cited by 4 (2 self)
This paper addresses the need for nonlinear programming algorithms that provide fast local convergence guarantees whether a problem is feasible or infeasible. We present an active-set sequential quadratic programming method derived from an exact penalty approach that adjusts the penalty parameter appropriately to emphasize optimality over feasibility, or vice versa. Conditions are presented under which superlinear convergence is achieved in the infeasible case. Numerical experiments illustrate the practical behavior of the method.
An Algorithmic Framework for Genome-Wide Modeling and Analysis of Translation Networks
, 2006
Cited by 4 (0 self)
The sequencing of genomes of several organisms and advances in high-throughput technologies for transcriptome and proteome analysis have allowed detailed mechanistic studies of transcription and translation using mathematical frameworks that allow integration of both sequence-specific and kinetic properties of these fundamental cellular processes. To understand how perturbations in mRNA levels affect the synthesis of individual proteins within a large protein synthesis network, we consider here a genome-scale codon-wide model of the translation machinery with explicit description of the processes of initiation, elongation, and termination. The mechanistic codon-wide description of the translation process and the large number of mRNAs competing for resources, such as ribosomes, requires the use of novel efficient algorithmic approaches. We have developed such an efficient algorithmic framework for genome-scale models of protein synthesis. The mathematical and computational framework was applied to the analysis of the sensitivity of a translation network to perturbation in the rate constants and in the mRNA levels in the system. Our studies suggest that the highest specific protein synthesis rate (protein synthesis rate per mRNA molecule) is achieved when translation is elongation-limited. We find that the mRNA species with the highest number of actively translating ribosomes exerts maximum control on the synthesis of every protein, and the response of protein synthesis rates to mRNA expression variation is a function of the strength of initiation of translation at different mRNA species. Such quantitative understanding of the sensitivity of protein synthesis to the variation of mRNA expression can provide insights into cellular robustness mechanisms and guide the design of protein production systems.