Results 11 – 18 of 18
An Hybrid Optimization Technique Coupling Evolutionary And Local . . .
, 2005
"... Evolutionary Algorithms are robust and powerful global optimization techniques for solving large scale problems that have many local optima. However, they require high CPU times, and they are very poor in terms of convergence performance. On the other hand, local search algorithms can converge in ..."
Abstract

Cited by 3 (0 self)
Evolutionary Algorithms are robust and powerful global optimization techniques for solving large-scale problems that have many local optima. However, they require high CPU times, and they are very poor in terms of convergence performance. On the other hand, local search algorithms can converge in a few iterations but lack a global perspective. The combination of global and local search procedures should offer the advantages of both optimization methods while offsetting their disadvantages. This paper proposes a new hybrid optimization technique that merges a Genetic Algorithm with a local search strategy based on the Interior Point method. The efficiency of this hybrid approach is demonstrated by solving a constrained multiobjective mathematical test case.
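The coupling this abstract describes — global evolutionary search followed by local refinement — can be sketched as follows. This is a minimal illustrative sketch, not the paper's method: a 1-D Rastrigin-style objective is assumed, plain gradient descent stands in for the Interior Point local phase, and all function names are hypothetical.

```python
import math
import random

def f(x):
    """1-D Rastrigin function: global minimum f(0) = 0, many local minima."""
    return x * x - 10.0 * math.cos(2.0 * math.pi * x) + 10.0

def df(x):
    return 2.0 * x + 20.0 * math.pi * math.sin(2.0 * math.pi * x)

def genetic_phase(pop_size=30, gens=40, seed=0):
    """Global exploration with a toy GA (truncation selection,
    arithmetic crossover, Gaussian mutation)."""
    rng = random.Random(seed)
    pop = [rng.uniform(-5.12, 5.12) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=f)
        parents = pop[: pop_size // 2]
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            children.append(0.5 * (a + b) + rng.gauss(0.0, 0.3))
        pop = parents + children
    return min(pop, key=f)

def local_phase(x, lr=1e-3, steps=2000):
    """Local refinement: plain gradient descent. The step size is small
    enough that f decreases monotonically, since f'' <= 2 + 40*pi^2."""
    for _ in range(steps):
        x -= lr * df(x)
    return x

x0 = genetic_phase()      # rough global solution from the GA
x_star = local_phase(x0)  # fast local convergence from x0
```

The division of labor mirrors the abstract: the population handles the many local optima, while the local phase supplies the convergence speed the GA lacks.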
A Knowledge Representation for Constraint-Satisfaction Problems
, 1990
"... AbstractIn this paper we present a general representation for constraint satisfaction problems (CSP) and a framework for reasoning about their solution that unlike most constraintbased relaxation algorithms. stresses the need for a "natural " encoding of constraint knowledge and can faci ..."
Abstract

Cited by 2 (0 self)
In this paper we present a general representation for constraint satisfaction problems (CSP) and a framework for reasoning about their solution that, unlike most constraint-based relaxation algorithms, stresses the need for a "natural" encoding of constraint knowledge and can facilitate making inferences for propagation, backtracking, and explanation. The representation consists of two components: a generate-and-test problem solver, which contains information about the problem variables, and a constraint-driven reasoner that manages a set of constraints, specified as arbitrarily complex Boolean expressions and represented in the form of a constraint network. This constraint network incorporates control information (reflected in the syntax of the constraints) that is used for constraint propagation; contains dependency information that can be used for explanation and for dependency-directed backtracking; and is incremental in the sense that if the problem specification is modified, a new solution can be derived by modifying the existing solution.
A Potential Reduction Method for a Class of Smooth Convex Programming Problems
, 1990
"... In this paper we propose a potential reduction method for smooth convex programming. It is assumed that the objective and constraint functions fulfil the socalled Relative Lipschitz Condition, with Lipschitz constant M ? 0. The great advantage of this method, above the existing pathfollowing metho ..."
Abstract

Cited by 2 (1 self)
In this paper we propose a potential reduction method for smooth convex programming. It is assumed that the objective and constraint functions fulfil the so-called Relative Lipschitz Condition, with Lipschitz constant M > 0. The great advantage of this method over the existing path-following methods is that it allows line searches. In our method we do line searches along the Newton direction with respect to a strictly convex potential function if we are far away from the central path. If we are sufficiently close to this path we update a lower bound for the optimal value. We prove that the number of iterations required by the algorithm to converge to an ε-optimal solution is O((1 + M²)√n |ln ε|) or O((1 + M²)n |ln ε|), depending on the updating scheme for the lower bound.
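The core mechanism the abstract highlights — a line search along the Newton direction on a strictly convex potential function — can be illustrated on a toy barrier problem. This is a generic damped-Newton/log-barrier sketch under assumed data (minimize x² subject to x ≥ 1), much simpler than the paper's algorithm and not an implementation of it.

```python
import math

# Toy problem: minimize x^2 subject to x >= 1 (optimum at x = 1).
# phi is a strictly convex log-barrier potential for parameter t.
def phi(x, t):
    return t * x * x - math.log(x - 1.0)

def dphi(x, t):
    return 2.0 * t * x - 1.0 / (x - 1.0)

def d2phi(x, t):
    return 2.0 * t + 1.0 / (x - 1.0) ** 2

def newton_barrier(t, x, tol=1e-10):
    """Damped Newton: backtracking line search along the Newton direction."""
    while True:
        g, h = dphi(x, t), d2phi(x, t)
        lam2 = g * g / h                  # squared Newton decrement
        if lam2 < tol:
            return x
        step = -g / h                     # Newton direction
        alpha = 1.0
        # Backtrack until the iterate is strictly feasible (x > 1) and
        # phi satisfies a sufficient-decrease (Armijo) condition.
        while (x + alpha * step <= 1.0
               or phi(x + alpha * step, t) > phi(x, t) - 1e-4 * alpha * lam2):
            alpha *= 0.5
        x += alpha * step

x = 2.0
for t in (1.0, 10.0, 100.0, 1000.0):      # progressively tighten the barrier
    x = newton_barrier(t, x)
# x is now close to the constrained optimum x = 1.
```

The line search is what lets the iterate make progress even when it starts far from the central path, which is the flexibility the abstract claims over pure path-following schemes.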
NP and Mathematics – a computational complexity perspective
 Proc. of the ICM 06
"... “P versus N P – a gift to mathematics from Computer Science” ..."
Abstract

Cited by 1 (0 self)
“P versus NP – a gift to mathematics from Computer Science”
Linear Programming
, 1998
"... Linear programming has been a fundamental topic in the development of the computational sciences. The subject has its origins in the early work of L.B.J. Fourier on solving systems of linear inequalities, dating back to the 1820's. More recently, a healthy competition between the simplex and inte ..."
Abstract
Linear programming has been a fundamental topic in the development of the computational sciences. The subject has its origins in the early work of J.B.J. Fourier on solving systems of linear inequalities, dating back to the 1820s. More recently, a healthy competition between the simplex and interior point methods has led to rapid improvements in the technologies of linear programming. This, combined with remarkable advances in computing hardware and software, has brought linear programming tools to the desktop, in a variety of application software for decision support. Linear programming has provided a fertile ground for the development of various algorithmic paradigms. Diverse topics such as symbolic computation, numerical analysis, computational complexity, computational geometry, combinatorial optimization, and randomized algorithms all have some linear programming connection. This chapter reviews this universal role played by linear programming in the science of algorithms.
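The geometric fact underlying both simplex and vertex-based reasoning — that an LP's optimum, when it exists, is attained at a vertex of the feasible polyhedron — can be demonstrated on a hypothetical two-variable instance by brute-force vertex enumeration. The instance below is an assumed example, and enumeration is only viable for tiny problems; real solvers use the simplex or interior point methods the abstract mentions.

```python
from itertools import combinations

# Hypothetical instance: maximize 3x + 5y subject to
#   x <= 4,  2y <= 12,  3x + 2y <= 18,  x >= 0,  y >= 0.
# Each constraint is stored as (a, b, c), meaning a*x + b*y <= c.
A = [(1.0, 0.0, 4.0), (0.0, 2.0, 12.0), (3.0, 2.0, 18.0),
     (-1.0, 0.0, 0.0), (0.0, -1.0, 0.0)]

def intersect(r1, r2):
    """Intersection of the two constraint boundary lines, or None if parallel."""
    a1, b1, c1 = r1
    a2, b2, c2 = r2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

def feasible(p):
    return all(a * p[0] + b * p[1] <= c + 1e-9 for a, b, c in A)

# Every vertex of the feasible polygon lies on two constraint boundaries.
vertices = [p for r1, r2 in combinations(A, 2)
            if (p := intersect(r1, r2)) is not None and feasible(p)]
best = max(vertices, key=lambda p: 3.0 * p[0] + 5.0 * p[1])
```

The simplex method walks from vertex to adjacent vertex improving the objective, rather than enumerating all vertices; interior point methods instead approach the optimum through the interior of the polygon.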
Technical Report Linear Programming 1
, 1997
"... Dedicated to George Dantzig on this the 50 th ..."