Results 1-10 of 13
Global Optimization For Constrained Nonlinear Programming
, 2001
Abstract

Cited by 12 (2 self)
In this thesis, we develop constrained simulated annealing (CSA), a global optimization algorithm that asymptotically converges to constrained global minima (CGM_dn) with probability one, for solving discrete constrained nonlinear programming problems (NLPs). The algorithm is based on the necessary and sufficient condition for constrained local minima (CLM_dn) in the theory of discrete constrained optimization using Lagrange multipliers developed in our group. The theory proves the equivalence between the set of discrete saddle points and the set of CLM_dn, leading to the first-order necessary and sufficient condition for CLM_dn. To find ...
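The descent-and-ascent mechanics behind CSA can be illustrated with a minimal sketch: probabilistic descent in the variable x and probabilistic ascent in the Lagrange multiplier on the penalty Lagrangian L(x, lam) = f(x) + lam*|h(x)|. The toy problem, cooling schedule, step sizes, and bookkeeping below are illustrative assumptions, not the thesis's actual algorithm or parameters.

```python
import math
import random

def csa(f, h, neighbors, x0, iters=20000, T0=1.0, seed=0):
    """Sketch of constrained simulated annealing: probabilistic descent in x
    and probabilistic ascent in the multiplier lam on L(x, lam)."""
    rng = random.Random(seed)
    x, lam = x0, 0.0
    best = None                               # best feasible point seen so far
    L = lambda x, lam: f(x) + lam * abs(h(x))
    for k in range(1, iters + 1):
        T = T0 / math.log(k + 1)              # logarithmic cooling schedule
        if h(x) == 0 and (best is None or f(x) < f(best)):
            best = x
        if rng.random() < 0.5:                # trial move in x: prefer lower L
            y = rng.choice(neighbors(x))
            d = L(y, lam) - L(x, lam)
            if d <= 0 or rng.random() < math.exp(-d / T):
                x = y
        else:                                 # trial move in lam: prefer higher L
            mu = max(0.0, lam + rng.choice([-0.1, 0.1]))
            d = L(x, mu) - L(x, lam)
            if d >= 0 or rng.random() < math.exp(d / T):
                lam = mu
    return best

# toy problem: minimize x^2 subject to x = 3 on the integers 0..6
best = csa(f=lambda x: x * x,
           h=lambda x: x - 3,
           neighbors=lambda x: [max(0, x - 1), min(6, x + 1)],
           x0=0)
```

Early on, the small multiplier lets the search follow the objective alone; as lam ratchets up at infeasible points, moves toward the feasible region become downhill in L, mirroring the saddle-point condition the theory is built on.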
Optimal Anytime Search For Constrained Nonlinear Programming
, 2001
Abstract

Cited by 6 (2 self)
In this thesis, we study optimal anytime stochastic search algorithms (SSAs) for solving general constrained nonlinear programming problems (NLPs) in discrete, continuous and mixed-integer space. The algorithms are general in the sense that they do not assume differentiability or convexity of functions. Based on the search algorithms, we develop the theory of SSAs and propose optimal SSAs with iterative deepening in order to minimize their expected search time. Based on the optimal SSAs, we then develop optimal anytime SSAs that generate improved solutions as more search time is allowed. Our SSAs ...
Varying Fitness Functions in Genetic Algorithms: Studying the Rate of Increase of the Dynamic Penalty Terms
 Parallel Problem Solving from Nature V (PPSN V)
, 1998
Abstract

Cited by 5 (0 self)
In this paper we present a promising technique that enhances the efficiency of GAs when they are applied to constrained optimisation problems.
Use of a Cost-based Unit Commitment Algorithm to Assist Bidding Strategy Decisions
 Proceedings IEEE 2003 PowerTech Bologna Conference, Paper n. 547
, 2003
Abstract

Cited by 5 (5 self)
Abstract—The paper describes a procedure developed to assist a generating company in choosing the most convenient bidding strategies for a day-ahead electricity energy market. According to the proposed method, the profit maximization problem is transformed into a minimization problem that can be solved by a traditional hydrothermal unit commitment program after implementing a few modifications. The paper describes the modifications introduced in a unit commitment program based on the Lagrangian relaxation approach and on a disaggregated Bundle method for the solution of the dual problem. It also presents some results obtained for a realistic data set of hydrothermal power plants. The results are discussed in order to emphasize how the method can be applied to assess the bidding strategy choice of a given company. Index Terms—Unit commitment, electricity market, bidding strategies.
I. NOMENCLATURE
I, I': sets of indexes of available thermal units in the system and of those belonging to the company, respectively (I: number of thermal units; i: thermal unit index).
H, H': sets of indexes of available hydro units in the system and of those belonging to the company, respectively (H: number of hydro units; h: hydro unit index).
T: set of time periods in the optimization horizon (T: number of time periods; t: time period index).
D: T-dimensional vector of load demands D_t in each period t.
u: I-by-T matrix whose rows are the T-dimensional arrays u_i of the 0-1 variables u_{i,t} indicating the commitment state of thermal unit i during period t.
p^I: I-by-T matrix whose rows are the T-dimensional arrays p_i of production levels p_{i,t} of thermal unit i during each period t.
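The Lagrangian relaxation idea behind such unit commitment programs can be sketched on a toy thermal-only instance: relaxing the demand constraints with prices lam[t] makes the problem decouple per unit and period. The paper itself solves the dual with a disaggregated Bundle method; the sketch below substitutes a plain subgradient update, and all data and names are illustrative.

```python
# toy data: two thermal units, three periods (all values illustrative)
cost   = [10.0, 14.0]          # marginal cost c_i of unit i
fixed  = [5.0, 5.0]            # no-load cost f_i per committed period
pmax   = [60.0, 80.0]          # capacity of unit i
demand = [50.0, 100.0, 120.0]  # load D_t in each period t

def solve_subproblem(lam):
    """With multipliers lam[t] on the demand constraints, the relaxed problem
    decouples: commit unit i in period t iff running it at full output is
    profitable at price lam[t]."""
    u = [[0] * len(demand) for _ in cost]
    p = [[0.0] * len(demand) for _ in cost]
    for i in range(len(cost)):
        for t in range(len(demand)):
            if (lam[t] - cost[i]) * pmax[i] - fixed[i] > 0:
                u[i][t], p[i][t] = 1, pmax[i]
    return u, p

lam = [0.0] * len(demand)
for k in range(200):                              # subgradient ascent on the dual
    u, p = solve_subproblem(lam)
    for t in range(len(demand)):
        deficit = demand[t] - sum(p[i][t] for i in range(len(cost)))
        lam[t] = max(0.0, lam[t] + 0.05 * deficit)  # raise the price when short
```

The multipliers settle into oscillation around the marginal unit's break-even price in each period, which is exactly the price signal a bidding-strategy study would read off the dual.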
The Theory And Applications Of Discrete Constrained Optimization Using Lagrange Multipliers
, 2000
Abstract

Cited by 4 (0 self)
In this thesis, we present a new theory of discrete constrained optimization using Lagrange multipliers and an associated first-order search procedure (DLM) to solve general constrained optimization problems in discrete, continuous and mixed-integer space. The constrained problems are general in the sense that they do not assume the differentiability or convexity of functions. Our proposed theory and methods are targeted at discrete problems and can be extended to continuous and mixed-integer problems by coding continuous variables using a floating-point representation (discretization). We have characterized the errors incurred due to such discretization and have proved that there exist upper bounds on the errors. Hence, continuous and mixed-integer constrained problems, as well as discrete ones, can be handled by DLM in a unified way with bounded errors.
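A first-order discrete Lagrangian search of this flavor can be sketched as greedy descent in x alternated with multiplier ascent, stopping at a discrete saddle point. The toy problem, unit ascent step, and neighborhood below are illustrative, not the thesis's exact procedure.

```python
def dlm(f, h, neighbors, x0, max_iters=1000):
    """Sketch of a discrete Lagrangian method (DLM): greedy descent on
    L(x, lam) = f(x) + lam*|h(x)| over a discrete neighborhood, increasing
    lam whenever x is stuck but infeasible. Terminates at a discrete
    saddle point."""
    x, lam = x0, 0.0
    L = lambda x, lam: f(x) + lam * abs(h(x))
    for _ in range(max_iters):
        # descent step in x: move to the best neighbor if it lowers L
        y = min(neighbors(x), key=lambda y: L(y, lam))
        if L(y, lam) < L(x, lam):
            x = y
        elif h(x) != 0:
            lam += 1.0          # ascent step in lam at an infeasible point
        else:
            return x, lam       # feasible and locally minimal: saddle point
    return x, lam

# toy problem: minimize (x - 1)^2 subject to x = 4 on the integers
x_star, lam_star = dlm(f=lambda x: (x - 1) ** 2,
                       h=lambda x: x - 4,
                       neighbors=lambda x: [x - 1, x + 1],
                       x0=0)
print(x_star)   # → 4
```

Unlike CSA's probabilistic moves, this search is deterministic, so it converges to a constrained local minimum rather than a global one.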
Implementation of a Decoupled Optimization Technique for Design of Switching Regulators Using Genetic Algorithms
Abstract

Cited by 2 (2 self)
Abstract—This paper presents an implementation of a decoupled optimization technique for design of switching regulators using genetic algorithms (GAs). The optimization process entails the selection of component values in a switching regulator, in order to meet the static and dynamic requirements. Although the proposed method inherits characteristics of evolutionary computations that involve randomness, recombination, and survival of the fittest, it does not perform a whole-circuit optimization. Thus, intensive computations that are usually found in stochastic optimization techniques can be avoided. Similar to many design approaches for power electronics circuits, a regulator is decoupled into two components, namely the power conversion stage (PCS) and the feedback network (FN). The PCS is optimized with the required static characteristics, whilst the FN is optimized with the required static and dynamic behaviors of the whole system. Systematic optimization procedures will be described and the technique is illustrated with the design of a buck regulator with overcurrent protection. The predicted results are compared with the published results available in the literature and are verified with experimental measurements. Index Terms—Circuit optimization, circuit simulation, computer-aided design, genetic algorithms, power electronics.
Solving University Timetabling Problems Using Advanced Genetic Algorithms
 5th International Conference on Technology and Automation (ICTA'05), October 15-16, 2005, Thessaloniki, Greece
Abstract

Cited by 2 (0 self)
Timetabling problems, together with scheduling ones, constitute a class of difficult-to-solve combinatorial optimization problems that lack analytical solution methods. As such, these problems have attracted researchers from a number of disciplines, such as Operations Research and Artificial Intelligence, who have proposed a number of methods for solving them. In this paper we present a method based on Genetic Algorithms (GAs) to solve university course timetabling problems. This method incorporates GAs using an indirect representation based on event priorities, Micro-GAs and heuristic local search operators in order to tackle a real-world timetabling problem. The problem on which the method is applied and tested is a real case and comes from a Technological Educational Institute of Greece. The GA solution is compared to the man-made one produced by the institute's staff and the comparative results are discussed.
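The indirect, priority-based representation described here can be sketched as follows: a chromosome assigns a priority to each event, and a greedy decoder places events in priority order into the first conflict-free (timeslot, room) pair. The toy instance, names, and decoder details are illustrative assumptions, not the paper's encoding.

```python
import random

EVENTS = ["alg", "db", "net", "os"]
SLOTS  = [0, 1]                             # timeslots
ROOMS  = ["r1", "r2"]
CLASH  = {("alg", "db"), ("net", "os")}     # event pairs sharing students

def decode(priorities):
    """Turn a priority vector into a timetable: event -> (slot, room)."""
    order = sorted(EVENTS, key=lambda e: -priorities[e])
    timetable, used = {}, set()
    for e in order:
        for s in SLOTS:
            # a slot is usable if no clashing event already sits in it
            clash = any((min(e, o), max(e, o)) in CLASH
                        for o, (so, _) in timetable.items() if so == s)
            room = next((r for r in ROOMS if (s, r) not in used), None)
            if not clash and room is not None:
                timetable[e] = (s, room)
                used.add((s, room))
                break
    return timetable

rng = random.Random(1)
priorities = {e: rng.random() for e in EVENTS}   # one GA chromosome
tt = decode(priorities)
```

The GA then evolves the priority vectors, not the timetables themselves, so crossover and mutation always yield decodable individuals; hard constraints are enforced by the decoder rather than by penalties.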
Improving Constrained Nonlinear Search Algorithms Through Constraint Relaxation
, 2001
Abstract

Cited by 1 (0 self)
In this thesis we study constraint relaxations of various nonlinear programming (NLP) algorithms in order to improve their performance. For both stochastic and deterministic algorithms, we study the relationship between the expected time to find a feasible solution and the constraint relaxation level, and build an exponential model based on this relationship. We then develop a constraint relaxation schedule such that the total time spent finding a feasible solution over all relaxation levels is of the same order of magnitude as the time spent finding a solution of similar quality using the last relaxation level alone.
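The scheduling argument resembles iterative deepening: if, as the exponential model assumes, the expected search time grows geometrically as the relaxation tightens, then the work over the whole schedule is a constant factor of the work at the final level alone. A tiny numeric check (the base, scale, and number of levels are illustrative):

```python
# Assume expected time T(level) = c * b**level for some b > 1, i.e. search
# gets geometrically harder as the constraint relaxation tightens.
b, c = 2.0, 1.0
levels = range(0, 11)            # fully relaxed at level 0, tightest at 10

total = sum(c * b ** k for k in levels)   # time over the whole schedule
last = c * b ** 10                        # time at the final level alone

# geometric series bound: total <= last * b / (b - 1), a constant factor
assert total <= last * b / (b - 1)
```

With b = 2 the whole schedule costs just under twice the last level, so running every intermediate relaxation level "for free" (in order-of-magnitude terms) is what makes the schedule worthwhile.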
Quantum Genetic Optimization
, 2007
Abstract

Cited by 1 (0 self)
The complexity of the selection procedure of a genetic algorithm that requires reordering, if we restrict the class of the possible fitness functions to varying fitness functions, is O(N log N), where N is the size of the population. The Quantum Genetic Optimization Algorithm (QGOA) exploits the power of quantum computation in order to speed up genetic procedures. While the quantum and classical genetic algorithms use the same number of generations, the QGOA outperforms the classical one in identifying the high-fitness subpopulation at each generation. In QGOA the classical fitness evaluation and selection procedures are replaced by a single quantum procedure. We show that the complexity of our QGOA is O(1) in terms of number of oracle calls in the selection procedure. Such theoretical results are confirmed by simulations of the algorithm. Index Terms—Evolutionary computing and genetic algorithms, quantum computation.
Advanced Methods for Evolutionary Optimisation
, 1998
Abstract
In this paper we present two advanced methods for evolutionary optimisation. One method is based on Parallel Genetic Algorithms. It is called Cooperating Populations with Different Evolution Behaviours (CoPDEB), and allows each population to exhibit a different evolution behaviour. Results from two problems show the advantage of using a different evolution behaviour on each population. The other method concerns the application of GAs to constrained optimisation problems. It is called the Varying Fitness Function (VFF) method and implements a fitness function with varying penalty terms, added to the objective function to penalise infeasible solutions, in order to assist the GA in easily locating the area of the global optimum. Simulation results on two real-world problems show that the VFF method outperforms classic static fitness function implementations.
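The VFF idea can be sketched in a few lines: the penalty weight on constraint violations grows with the generation index, so early generations explore infeasible regions while later ones are forced toward feasibility. The toy problem, penalty schedule, and GA operators below are illustrative, not the paper's actual setup.

```python
import random

rng = random.Random(0)

def objective(x):                 # minimize (x - 6)^2 on the integers 0..31
    return (x - 6) ** 2

def violation(x):                 # constraint: x >= 10
    return max(0, 10 - x)

def fitness(x, gen, max_gen):
    """Varying fitness function: penalty weight rises with the generation."""
    lam = 100.0 * (gen / max_gen) ** 2
    return objective(x) + lam * violation(x)

pop, max_gen = [rng.randrange(32) for _ in range(20)], 60
for gen in range(1, max_gen + 1):
    scored = sorted(pop, key=lambda x: fitness(x, gen, max_gen))
    parents = scored[:10]                   # truncation selection
    pop = parents + [min(31, max(0, p + rng.choice([-2, -1, 1, 2])))
                     for p in (rng.choice(parents) for _ in range(10))]
best = min(pop, key=lambda x: fitness(x, max_gen, max_gen))
```

Under the final (heavy) penalty the constrained optimum x = 10 dominates the unconstrained one x = 6, so the population that drifted toward 6 early on is pulled back into the feasible region as the weight grows.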