Results 1–10 of 17
Constraint partitioning in penalty formulations for solving temporal planning problems
 Artificial Intelligence
, 2006
Cited by 17 (12 self)

Abstract: In this paper, we study the partitioning of constraints in temporal planning problems formulated as mixed-integer nonlinear programming (MINLP) problems. Constraint partitioning is ...
Global Optimization of MINLP Problems in Process Synthesis and Design
 Computers & Chemical Engineering
, 1997
Cited by 16 (7 self)

Abstract: Two new methodologies for the global optimization of MINLP models, the Special structure Mixed Integer Nonlinear αBB (SMIN-αBB) and the General structure Mixed Integer Nonlinear αBB (GMIN-αBB), are presented. Their theoretical foundations provide guarantees that the global optimum solution of MINLPs involving twice-differentiable nonconvex functions in the continuous variables can be identified. The conditions imposed on the functionality of the binary variables differ for each method: linear and mixed bilinear terms can be treated with the SMIN-αBB; mixed nonlinear terms whose continuous relaxation is twice-differentiable are handled by the GMIN-αBB. While both algorithms use the concept of a branch & bound tree, they rely on fundamentally different bounding and branching strategies. In the GMIN-αBB algorithm, lower (upper) bounds at each node result from the solution of convex (nonconvex) MINLPs derived from the original problem. The construction of convex lower bound ...
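The bounding idea in the abstract can be illustrated with a toy one-dimensional branch & bound (an illustrative sketch only; the SMIN-/GMIN-αBB algorithms handle mixed-integer problems and construct their bounds very differently). An αBB-style quadratic perturbation yields a convex underestimator on each node: its minimum is a lower bound, and evaluating the original function at that minimizer gives an upper bound. The test function and parameters below are assumptions made for the example.

```python
import math, heapq

def alpha_bb_1d(f, fpp_min, l, u, tol=1e-4, max_nodes=10000):
    """Toy 1-D branch & bound with an alphaBB-style convex underestimator.

    fpp_min is a lower bound on f'' over [l, u]; alpha = max(0, -fpp_min/2)
    makes phi(x) = f(x) - alpha*(x - a)*(b - x) convex on any subinterval
    [a, b], so phi's minimum is a valid lower bound on f there."""
    alpha = max(0.0, -fpp_min / 2.0)

    def lower_bound(a, b):
        phi = lambda x: f(x) - alpha * (x - a) * (b - x)
        lo, hi = a, b
        for _ in range(100):            # ternary search: phi is convex
            m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
            if phi(m1) < phi(m2):
                hi = m2
            else:
                lo = m1
        xmin = (lo + hi) / 2
        return phi(xmin), xmin

    best_ub = f(l)
    lb, x0 = lower_bound(l, u)
    heap = [(lb, l, u, x0)]             # best-first: smallest lower bound
    while heap and max_nodes > 0:
        max_nodes -= 1
        lb, a, b, x = heapq.heappop(heap)
        best_ub = min(best_ub, f(x))    # x is feasible: update upper bound
        if best_ub - lb < tol:
            return best_ub              # gap closed at the best node
        mid = (a + b) / 2
        for c, d in ((a, mid), (mid, b)):   # branch the interval
            sub_lb, sub_x = lower_bound(c, d)
            if sub_lb < best_ub:            # otherwise fathom this node
                heapq.heappush(heap, (sub_lb, c, d, sub_x))
    return best_ub
```

For example, minimizing sin(3x) + 0.1x² on [-3, 3] (where f'' ≥ -8.8) locates the global minimum near x ≈ -0.51 despite the local minima.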
Global Optimization For Constrained Nonlinear Programming
, 2001
Cited by 13 (2 self)

Abstract: In this thesis, we develop constrained simulated annealing (CSA), a global optimization algorithm that asymptotically converges to constrained global minima (CGM_dn) with probability one, for solving discrete constrained nonlinear programming problems (NLPs). The algorithm is based on the necessary and sufficient condition for constrained local minima (CLM_dn) in the theory of discrete constrained optimization using Lagrange multipliers developed in our group. The theory proves the equivalence between the set of discrete saddle points and the set of CLM_dn, leading to the first-order necessary and sufficient condition for CLM_dn. To find ...
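The saddle-point search the abstract refers to can be sketched in heavily simplified form (a toy, not the thesis' CSA algorithm or its convergence schedule): Metropolis-accepted probabilistic descents in the variable space on the Lagrangian L(x, λ) = f(x) + λ·|g(x)|, interleaved with ascents in the multiplier space that raise penalties on violated constraints. The cooling schedule, multiplier step, and example problem are all assumptions.

```python
import math, random

def csa(f, g, neighbors, x0, T0=1.0, alpha=0.95, steps=2000, seed=0):
    """Toy constrained simulated annealing on a discrete space."""
    random.seed(seed)
    x, lam, T = x0, 0.0, T0
    L = lambda y, m: f(y) + m * abs(g(y))
    for _ in range(steps):
        # probabilistic descent in the variable space (Metropolis rule)
        cand = random.choice(neighbors(x))
        d = L(cand, lam) - L(x, lam)
        if d <= 0 or random.random() < math.exp(-d / T):
            x = cand
        # probabilistic ascent in the multiplier space: only raise the
        # penalty while the constraint is violated
        if abs(g(x)) > 0:
            lam += 0.1
        T *= alpha   # geometric cooling (illustrative choice)
    return x
```

A usage sketch: minimizing f(x) = x² over the integers in [-5, 5] subject to g(x) = x - 3 = 0 drives the multiplier up until the feasible point x = 3 becomes the attractor.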
Solving large-scale nonlinear programming problems by constraint partitioning
 In Proc. Principles and Practice of Constraint Programming, LNCS 3709
, 2005
Cited by 9 (6 self)

Abstract: In this paper, we present a constraint-partitioning approach for finding local optimal solutions of large-scale mixed-integer nonlinear programming problems (MINLPs). Based on our observation that MINLPs in many engineering applications have highly structured constraints, we propose to partition these MINLPs by their constraints into subproblems, solve each subproblem by an existing solver, and resolve those violated global constraints across the subproblems using our theory of extended saddle points. Constraint partitioning allows many MINLPs that cannot be solved by existing solvers to be solvable because it leads to easier subproblems that are significant relaxations of the original problem. The success of our approach relies on our ability to resolve violated global constraints efficiently, without requiring exhaustive enumerations of variable values in these constraints. We have developed an algorithm for automatically partitioning a large MINLP in order to minimize the number of global constraints, an iterative method for determining the optimal number of partitions in order to minimize the search time, and an efficient strategy for resolving violated global constraints. Our experimental results demonstrate significant improvements over the best existing solvers in terms of solution time and quality in solving a collection of mixed-integer and continuous nonlinear constrained optimization benchmarks.
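The partition-then-resolve loop described in the abstract can be illustrated with a minimal sketch (a hypothetical two-variable problem with a multiplier update, not the authors' solver or their extended saddle-point theory): each subproblem is solved in isolation with a penalty term on the shared global constraint, and the penalty multiplier is raised whenever the global constraint remains violated.

```python
def solve_sub(objective, grid, penalty):
    # solve one subproblem: pick the grid point minimizing the local
    # objective plus the penalty on the (shared) global constraint
    return min(grid, key=lambda v: objective(v) + penalty(v))

def partitioned_solve(step=1.0, iters=50):
    """Minimize (x-1)^2 + (y-3)^2 subject to the global constraint x == y,
    partitioned into an x-subproblem and a y-subproblem."""
    grid = [i / 10 for i in range(0, 51)]   # discretized domain [0, 5]
    lam, x, y = 0.0, 0.0, 0.0
    for _ in range(iters):
        # subproblem 1 only sees f1 and the global constraint x == y
        x = solve_sub(lambda v: (v - 1) ** 2, grid, lambda v: lam * abs(v - y))
        # subproblem 2 only sees f2 and the same global constraint
        y = solve_sub(lambda v: (v - 3) ** 2, grid, lambda v: lam * abs(x - v))
        if abs(x - y) < 1e-9:
            break
        lam += step   # raise the multiplier on the violated global constraint
    return x, y, lam
```

With these (assumed) objectives, the unconstrained subproblem optima x = 1 and y = 3 are pulled together as the multiplier grows, meeting at the constrained optimum x = y = 2.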
THE DISCRETE LAGRANGIAN THEORY AND ITS APPLICATION TO SOLVE NONLINEAR DISCRETE CONSTRAINED OPTIMIZATION PROBLEMS
, 1998
Cited by 7 (0 self)

Abstract: In this research we present new results on discrete Lagrangian methods (DLM) and extend our previous (incomplete and highly simplified) theory on the method. Our proposed method forms a strong mathematical foundation for solving general nonlinear discrete optimization problems. Specifically, we show for continuous Lagrangian methods the relationship among local minimal solutions satisfying constraints, solutions found by the first-order necessary and second-order sufficient conditions, and saddle points. Since there is no corresponding definition of gradients in discrete space, we propose a new vector-based definition of gradient, develop first-order conditions similar to those in continuous space, propose a heuristic method to find saddle points, and show the relationship between saddle points and local minimal solutions satisfying constraints. We then show, when all the constraint functions are non-negative, that the set of saddle points is the same as the set of local minimal points satisfying constraints. Our formal results for solving discrete ...
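A first-order discrete Lagrangian search of the kind the abstract describes can be sketched as follows (a simplified illustration, not the paper's DLM procedure): descend over discrete neighbors on L(x, λ) = f(x) + λ·h(x), and when no descent direction exists but the constraint is still violated, perform an ascent in λ. Per the abstract's non-negativity assumption, h is a violation measure with h(x) = 0 at feasible points; the example problem is an assumption.

```python
def dlm(f, h, neighbors, x0, c=1.0, max_iters=100):
    """First-order discrete Lagrangian search (simplified sketch)."""
    x, lam = x0, 0.0
    L = lambda y: f(y) + lam * h(y)   # closure sees the current lam
    for _ in range(max_iters):
        best = min(neighbors(x) + [x], key=L)
        if L(best) < L(x):
            x = best            # a discrete descent direction exists
        elif h(x) > 0:
            lam += c * h(x)     # stuck but infeasible: raise the multiplier
        else:
            return x, lam       # discrete saddle point: feasible local min
    return x, lam
```

For instance, minimizing f(x) = |x| over the integers with violation measure h(x) = |x - 2| converges to the feasible point x = 2 once λ is large enough.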
MixedInteger Nonlinear Optimization in Process Synthesis
, 1998
Cited by 7 (0 self)

Abstract: The use of networks allows the representation of a variety of important engineering problems. The treatment of a particular class of network applications, the process synthesis problem, is exposed in this paper. Process synthesis seeks to develop systematically process flowsheets that convert raw materials into desired products. In recent years, the optimization approach to process synthesis has shown promise in tackling this challenge. It requires the development of a network of interconnected units, the process superstructure, that represents the alternative process flowsheets. The mathematical modeling of the superstructure has a mixed set of binary and continuous variables and results in a mixed-integer optimization model. Due to the nonlinearity of chemical models, these problems are generally classified as Mixed-Integer Nonlinear Programming (MINLP) problems. A number of local optimization algorithms, developed for the solution of this class of problems, are presented in this paper ...
Optimal Anytime Search For Constrained Nonlinear Programming
, 2001
Cited by 6 (2 self)

Abstract: In this thesis, we study optimal anytime stochastic search algorithms (SSAs) for solving general constrained nonlinear programming problems (NLPs) in discrete, continuous and mixed-integer space. The algorithms are general in the sense that they do not assume differentiability or convexity of functions. Based on the search algorithms, we develop the theory of SSAs and propose optimal SSAs with iterative deepening in order to minimize their expected search time. Based on the optimal SSAs, we then develop optimal anytime SSAs that generate improved solutions as more search time is allowed. Our SSAs ...
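The anytime, iterative-deepening behavior the abstract mentions can be sketched with a simple wrapper (an illustration of the general pattern, not the thesis' optimal schedule): rerun a stochastic search under geometrically increasing budgets and report the best solution found so far after each round, so solution quality improves monotonically with allowed time. The `step` and `score` functions are assumed interfaces.

```python
import random

def iterative_deepening_ssa(step, score, x0, base=64, rounds=5, seed=0):
    """Anytime wrapper: doubled search budget each round, best-so-far kept."""
    random.seed(seed)
    best, history = x0, []
    for r in range(rounds):
        x = x0
        for _ in range(base * 2 ** r):   # geometrically increasing budget
            x = step(x)                  # one move of the stochastic search
            if score(x) < score(best):
                best = x
        history.append(score(best))      # anytime: never gets worse
    return best, history
```

With a random-walk `step` and score (x - 7)², for example, the recorded best scores form a non-increasing sequence across rounds.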
Solving Nonlinear Constrained Optimization Problems Through Constraint Partitioning
, 2005
Cited by 5 (5 self)

Abstract: In this dissertation, we propose a general approach that can significantly reduce the complexity in solving discrete, continuous, and mixed constrained nonlinear optimization (NLP) problems. A key observation we have made is that most application-based NLPs have structured arrangements of constraints. For example, constraints in AI planning are often localized into coherent groups based on their corresponding subgoals. In engineering design problems, such as the design of a power plant, most constraints exhibit a spatial structure based on the layout of the physical components. In optimal control applications, constraints are localized by stages or time. We have developed techniques to exploit these constraint structures by partitioning the constraints into subproblems related by global constraints. Constraint partitioning leads to much relaxed subproblems that are significantly easier to solve. However, there exist global constraints relating multiple subproblems that must be resolved. Previous methods cannot exploit such structures using constraint partitioning because they cannot resolve inconsistent global constraints efficiently.
The Theory And Applications Of Discrete Constrained Optimization Using Lagrange Multipliers
, 2000
Cited by 4 (0 self)

Abstract: In this thesis, we present a new theory of discrete constrained optimization using Lagrange multipliers and an associated first-order search procedure (DLM) to solve general constrained optimization problems in discrete, continuous and mixed-integer space. The constrained problems are general in the sense that they do not assume the differentiability or convexity of functions. Our proposed theory and methods are targeted at discrete problems and can be extended to continuous and mixed-integer problems by coding continuous variables using a floating-point representation (discretization). We have characterized the errors incurred due to such discretization and have proved that there exist upper bounds on the errors. Hence, continuous and mixed-integer constrained problems, as well as discrete ones, can be handled by DLM in a unified way with bounded errors.
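The bounded discretization error this abstract relies on is easy to see in a minimal sketch (illustrative only, not the thesis' floating-point encoding): snapping a continuous value to a uniform grid of spacing `step` incurs an error of at most `step / 2`.

```python
def discretize(x, step=1e-3):
    """Snap a continuous value to the nearest point on a uniform grid.

    The representation error satisfies |discretize(x) - x| <= step / 2
    (up to floating-point rounding), the kind of upper bound on
    discretization error that the abstract refers to."""
    return round(x / step) * step
```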