Results 1–10 of 22
Exploring Very Large State Spaces Using Genetic Algorithms
 SOFTWARE TOOLS FOR TECHNOLOGY TRANSFER
"... We present a novel framework for exploring very large state spaces of concurrent reactive systems. Our framework exploits applicationindependent heuristics using genetic algorithms to guide a statespace search towards error states. We have implemented this framework in conjunction with VeriSoft, ..."
Abstract

Cited by 49 (2 self)
We present a novel framework for exploring very large state spaces of concurrent reactive systems. Our framework exploits application-independent heuristics using genetic algorithms to guide a state-space search towards error states. We have implemented this framework in conjunction with VeriSoft, a tool for exploring the state spaces of software applications composed of several concurrent processes executing arbitrary code. We present experimental results obtained with several examples of programs, including a C implementation of a public-key authentication protocol. We discuss heuristics and properties of state spaces that help a genetic search detect deadlocks and assertion violations. For finding errors in very large state spaces, our experiments show that a genetic search using simple heuristics can significantly outperform random and systematic searches.
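The idea of guiding a search toward error states can be sketched on a toy model. Everything below is illustrative, not the paper's actual encoding: a chromosome is a list of scheduling choices for a tiny two-process lock-ordering system, and the fitness rewards executions that end stuck before both processes finish (a deadlock), so the genetic search is pulled toward the error.

```python
import random

# Toy concurrent system: process 0 takes lock 0 then lock 1; process 1
# takes them in the opposite order (the classic deadlock-prone pattern).
LOCK_ORDER = ((0, 1), (1, 0))

def step(state, choice):
    pc, locks = state
    enabled = [p for p in (0, 1)
               if pc[p] < 2 and LOCK_ORDER[p][pc[p]] not in locks]
    if not enabled:
        return state, True                    # no enabled transition: stuck
    p = enabled[choice % len(enabled)]        # chromosome resolves the choice
    locks = set(locks)
    locks.add(LOCK_ORDER[p][pc[p]])
    pc = list(pc)
    pc[p] += 1
    if pc[p] == 2:                            # process finished: release its locks
        locks -= set(LOCK_ORDER[p])
    return (tuple(pc), frozenset(locks)), False

def fitness(chromo):
    # Heuristic: a stuck, unfinished execution (a deadlock) scores highest;
    # otherwise reward interleaving progress.
    state, stuck = ((0, 0), frozenset()), False
    for c in chromo:
        state, stuck = step(state, c)
        if stuck:
            break
    pc, _ = state
    return 3 if stuck and pc != (2, 2) else min(pc)

def genetic_search(pop_size=20, length=4, gens=30, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randrange(2) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == 3:              # deadlock schedule found
            return pop[0]
        parents = pop[:pop_size // 2]
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, length)    # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.3:            # bit-flip mutation
                child[rng.randrange(length)] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)
```

On this toy system the search quickly evolves a schedule that interleaves the two lock acquisitions and triggers the deadlock; the real framework applies the same selection pressure over VeriSoft's far larger state spaces.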
A Comparison of Genetic Programming Variants for Data Classification
, 1999
"... In this paper we report the results of a comparative study on different variations of genetic programming applied on binary data classification problems. The first genetic programming variant is weighting data records for calculating the classification error and modifying the weights during the run. ..."
Abstract

Cited by 20 (1 self)
In this paper we report the results of a comparative study on different variations of genetic programming applied to binary data classification problems. The first genetic programming variant weights data records for calculating the classification error and modifies the weights during the run. The algorithm thereby defines its own fitness function in an online fashion, giving higher weights to 'hard' records. Another novel feature we study is the atomic representation, where 'Booleanization' of data is performed not at the root but at the leaves of the trees, and only Boolean functions are used in the bodies of the trees. As a third aspect we look at generational and steady-state models in combination with both features.
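The record-weighting scheme can be sketched in a few lines; the names and toy data here are mine, not the paper's. Fitness is the weighted classification error, and after each generation the weights of records the current best tree still misclassifies are raised, so 'hard' records dominate the fitness over time.

```python
def weighted_error(predict, records, weights):
    # Fitness: fraction of total weight sitting on misclassified records.
    wrong = sum(w for (x, label), w in zip(records, weights)
                if predict(x) != label)
    return wrong / sum(weights)

def update_weights(predict, records, weights, delta=1.0):
    # 'Hard' (misclassified) records get heavier; easy ones keep their weight.
    return [w + delta if predict(x) != label else w
            for (x, label), w in zip(records, weights)]

records = [(0, 0), (1, 1), (2, 1), (3, 0)]    # toy (feature, label) pairs
weights = [1.0] * len(records)
predict = lambda x: 1 if x >= 1 else 0        # stands in for an evolved tree
err0 = weighted_error(predict, records, weights)     # only (3, 0) is wrong: 1/4
weights = update_weights(predict, records, weights)  # its weight grows to 2.0
err1 = weighted_error(predict, records, weights)     # now 2/5: it counts more
```

After the update, an evolved tree that fixes the hard record gains more fitness than one that fixes an easy one, which is exactly the online pressure the paper studies.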
Solving Constraint Satisfaction Problems with Heuristic-based Evolutionary Algorithms
 In Proceedings of the 2000 Congress on Evolutionary Computation
, 1999
"... Evolutionary algorithms (EAs) for solving constraint satisfaction problems (CSPs) can be roughly divided into two classes: EAs using adaptive fitness functions and EAs using heuristics. In [5] the most effective EAs of the first class have been compared experimentally using a large set of benchma ..."
Abstract

Cited by 18 (3 self)
Evolutionary algorithms (EAs) for solving constraint satisfaction problems (CSPs) can be roughly divided into two classes: EAs using adaptive fitness functions and EAs using heuristics. In [5] the most effective EAs of the first class have been compared experimentally using a large set of benchmark instances consisting of randomly generated binary CSPs. In this paper we complete this comparison by studying the most effective EAs that use heuristics.
SAWing EAs: adapting the fitness function for solving constrained problems
, 1999
"... In this chapter we describe a problem independent method for treating constraints in an evolutionary algorithm. Technically, this method amounts to changing the definition of the fitness function during a run of an EA, based on feedback from the search process. Obviously, redefining the fitness func ..."
Abstract

Cited by 15 (3 self)
In this chapter we describe a problem-independent method for treating constraints in an evolutionary algorithm. Technically, this method amounts to changing the definition of the fitness function during a run of an EA, based on feedback from the search process. Obviously, redefining the fitness function means redefining the problem to be solved. In the short term this deceives the algorithm, making the fitness values deteriorate, but, as experiments clearly indicate, in the long run it is beneficial. We illustrate the power of the method on different constraint satisfaction problems and point out other application areas of this technique.
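The weight-adaptation mechanism can be sketched on a toy binary CSP; the problem instance and all names below are illustrative. Fitness is the weighted sum of violated constraints, and every few steps the weights of constraints the current best solution still violates are raised, redefining the fitness function during the run.

```python
import random

N_VALUES = 3
CONSTRAINTS = [(0, 1), (1, 2), (0, 2)]        # these variable pairs must differ

def violated(x):
    return [k for k, (i, j) in enumerate(CONSTRAINTS) if x[i] == x[j]]

def saw_search(steps=500, update_every=10, seed=0):
    rng = random.Random(seed)
    weights = [1.0] * len(CONSTRAINTS)
    best = [rng.randrange(N_VALUES) for _ in range(3)]
    # Fitness to be minimized; note it reads the *current* weights.
    fit = lambda x: sum(weights[k] for k in violated(x))
    for t in range(1, steps + 1):
        cand = best[:]
        cand[rng.randrange(3)] = rng.randrange(N_VALUES)  # mutate one variable
        if fit(cand) <= fit(best):
            best = cand
        if t % update_every == 0:             # the weight-adaptation step:
            for k in violated(best):          # constraints still violated by
                weights[k] += 1.0             # the best get more expensive
    return best
```

The (1+1)-style mutation loop is a deliberate simplification of a full EA; the point is the periodic weight update, which makes persistently violated constraints dominate the fitness until the search resolves them.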
Evolving Problems to Learn About Particle Swarm Optimizers and . . .
 IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION
, 2007
"... We use evolutionary computation (EC) to automatically find problems which demonstrate the strength and weaknesses of modern search heuristics. In particular, we analyze particle swarm optimization (PSO), differential evolution (DE), and covariance matrix adaptationevolution strategy (CMAES). Each ..."
Abstract

Cited by 14 (5 self)
We use evolutionary computation (EC) to automatically find problems which demonstrate the strengths and weaknesses of modern search heuristics. In particular, we analyze particle swarm optimization (PSO), differential evolution (DE), and covariance matrix adaptation evolution strategy (CMA-ES). Each evolutionary algorithm is contrasted with the others and with a robust non-stochastic gradient follower (i.e., a hill climber) based on Newton–Raphson. The evolved benchmark problems yield insights into the operation of PSOs and illustrate the benefits and drawbacks of different population sizes, velocity limits, and constriction (friction) coefficients. The fitness landscapes made by genetic programming reveal new swarm phenomena, such as deception, thereby explaining how these optimizers work and allowing us to devise better extended particle swarm systems. The method could be applied to any type of optimizer.
Evolutionary Algorithms and Constraint Satisfaction: Definitions, Survey, Methodology, and Research Directions
 Theoretical Aspects of Evolutionary Computing
, 2001
"... In this tutorial we consider the issue of constraint handling by evolutionary algorithms (EA). We start this study with a categorization of constrained problems and observe that constraint handling is not straightforward in an EA. Namely, the search operators mutation and recombination are `blind' t ..."
Abstract

Cited by 10 (1 self)
In this tutorial we consider the issue of constraint handling by evolutionary algorithms (EAs). We start this study with a categorization of constrained problems and observe that constraint handling is not straightforward in an EA. Namely, the search operators mutation and recombination are 'blind' to constraints: there is no guarantee that if the parents satisfy some constraints the offspring will satisfy them as well. This suggests that the presence of constraints in a problem makes EAs intrinsically unsuited to solve it. This should especially hold if there are no objectives, only constraints, in the original problem specification, as in the category of constraint satisfaction problems. A survey of related literature, however, discloses that there are quite a few successful attempts at evolutionary constraint satisfaction. Based on this survey we identify a number of common features in these approaches and arrive at the conclusion that the presence of constraints is not harmful but rather helpful, in that it provides extra information that EAs can utilize. The tutorial is concluded by considering a number of key questions on research methodology and some promising future research directions.
A Genetic Local Search Algorithm for Random Binary Constraint Satisfaction Problems
 In Proceedings of the ACM Symposium on Applied Computing
, 2000
"... This paper introduces a genetic local search algorithm for binary constraint satisfaction problems. The core of the algorithm consists of an adhoc optimization procedure followed by the application of blind genetic operators. A standard set of benchmark instances is used in order to assess the perf ..."
Abstract

Cited by 8 (2 self)
This paper introduces a genetic local search algorithm for binary constraint satisfaction problems. The core of the algorithm consists of an ad hoc optimization procedure followed by the application of blind genetic operators. A standard set of benchmark instances is used in order to assess the performance of the algorithm. The results indicate that this apparently naive hybridization of a genetic algorithm with local search yields a rather powerful heuristic algorithm for random binary constraint satisfaction problems.
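A minimal sketch of such a hybrid, on a toy instance of my own choosing (not the paper's benchmarks): blind genetic operators produce an offspring, a greedy one-variable repair acts as the local search, and the repaired child replaces the worst population member if it is at least as good.

```python
import random

DOMAIN = (0, 1, 2)
CONSTRAINTS = [(0, 1), (1, 2), (0, 2)]        # variable pairs that must differ

def conflicts(x):
    return sum(x[i] == x[j] for i, j in CONSTRAINTS)

def local_search(x):
    # Greedy repair: move one variable to its best value until no move helps.
    improved = True
    while improved:
        improved = False
        for i in range(len(x)):
            best_v = min(DOMAIN,
                         key=lambda v: conflicts(x[:i] + [v] + x[i + 1:]))
            if conflicts(x[:i] + [best_v] + x[i + 1:]) < conflicts(x):
                x = x[:i] + [best_v] + x[i + 1:]
                improved = True
    return x

def genetic_local_search(pop_size=6, gens=20, seed=0):
    rng = random.Random(seed)
    pop = [local_search([rng.choice(DOMAIN) for _ in range(3)])
           for _ in range(pop_size)]
    for _ in range(gens):
        a, b = rng.sample(pop, 2)
        child = [rng.choice(g) for g in zip(a, b)]  # blind uniform crossover
        child = local_search(child)                 # repair before insertion
        worst = max(range(pop_size), key=lambda k: conflicts(pop[k]))
        if conflicts(child) <= conflicts(pop[worst]):
            pop[worst] = child
    return min(pop, key=conflicts)
```

The crossover is 'blind' in the paper's sense: it ignores the constraints entirely and relies on the local search to restore consistency.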
Comparing classical methods for solving binary constraint satisfaction problems with state-of-the-art evolutionary computation
 Applications of Evolutionary Computing. Number 2279 in Springer Lecture Notes on Computer Science
, 2002
"... Abstract. Constraint Satisfaction Problems form a class of problems that are generally computationally difficult and have been addressed with many complete and heuristic algorithms. We present two complete algorithms, as well as two evolutionary algorithms, and compare them on randomly generated ins ..."
Abstract

Cited by 6 (4 self)
Constraint Satisfaction Problems form a class of problems that are generally computationally difficult and have been addressed with many complete and heuristic algorithms. We present two complete algorithms, as well as two evolutionary algorithms, and compare them on randomly generated instances of binary constraint satisfaction problems. We find that the evolutionary algorithms are less effective than the classical techniques.
Adapting the Fitness Function in GP for Data Mining
 In Proceedings of the European Workshop on Genetic Programming, EuroGP'99, R. Poli, P. Nordin, and W. B. Langdon (Eds.), Lecture Notes in Computer Science
, 1999
"... Abstract. In this paper we describe how the Stepwise Adaptation of Weights (saw) technique can be applied in genetic programming. The sawing mechanism has been originally developed for and successfully used in eas for constraint satisfaction problems. Here we identify the very basic underlying idea ..."
Abstract

Cited by 6 (2 self)
In this paper we describe how the Stepwise Adaptation of Weights (SAW) technique can be applied in genetic programming. The SAWing mechanism was originally developed for, and successfully used in, EAs for constraint satisfaction problems. Here we identify the basic underlying ideas behind SAWing and point out how it can be used for different types of problems. In particular, SAWing is well-suited for data mining tasks where the fitness of a candidate solution is composed of 'local scores' on data records. We evaluate the power of the SAWing mechanism on a number of benchmark classification data sets. The results indicate that extending the GP with the SAWing feature increases its performance when different types of misclassifications are not weighted differently, but leads to worse results when they are.
Transition models as an incremental approach for problem solving in evolutionary algorithms
 In GECCO ’05: Proceedings of the 2005 conference on Genetic and evolutionary computation
, 2005
"... This paper proposes an incremental approach for building solutions using evolutionary computation. It presents a simple evolutionary model called a Transition model in which partial solutions are constructed that interact to provide larger solutions. An evolutionary process is used to merge these pa ..."
Abstract

Cited by 5 (2 self)
This paper proposes an incremental approach for building solutions using evolutionary computation. It presents a simple evolutionary model, called a Transition model, in which partial solutions are constructed that interact to provide larger solutions. An evolutionary process is used to merge these partial solutions into a full solution for the problem at hand. The paper provides a preliminary study on the evolutionary dynamics of this model as well as an empirical comparison with other evolutionary techniques on binary constraint satisfaction.