Results 1–10 of 29
An Analysis of the (µ+1) EA on Simple Pseudo-Boolean Functions (Extended Abstract)
Abstract

Cited by 34 (9 self)
Carsten Witt, FB Informatik, LS 2, Univ. Dortmund, 44221 Dortmund, Germany, carsten.witt@cs.uni-dortmund.de. Abstract. Evolutionary Algorithms (EAs) are successfully applied for optimization in discrete search spaces, but theory is still weak, in particular for population-based EAs. Here, a first rigorous analysis of the (µ+1) EA on pseudo-Boolean functions is presented. For three example functions well-known from the analysis of the (1+1) EA, bounds on the expected runtime and success probability are derived. For two of these functions, upper and lower bounds on the expected runtime are tight, and the (µ+1) EA is never more efficient than the (1+1) EA. Moreover, all lower bounds grow with µ. On a more complicated function, however, a small increase of µ provably decreases the expected runtime drastically.
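For readers unfamiliar with the algorithm family analysed here, the following is a minimal Python sketch of a (µ+1) EA maximizing the classic OneMax function. The parameter choices and function names are illustrative assumptions, not the paper's notation: a population of µ bit strings, standard bit-flip mutation with rate 1/n, and deletion of a worst individual.

```python
import random

random.seed(1)  # reproducibility of this sketch

def one_max(x):
    """OneMax: number of ones in the bit string; maximized by the all-ones string."""
    return sum(x)

def mu_plus_one_ea(mu, n, fitness, max_iters=200_000):
    """(mu+1) EA: each step picks a parent uniformly at random, flips each
    of its bits independently with probability 1/n, adds the child, and
    deletes a worst individual to restore the population size mu."""
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(mu)]
    for _ in range(max_iters):
        parent = random.choice(pop)
        child = [1 - b if random.random() < 1.0 / n else b for b in parent]
        pop.append(child)
        pop.remove(min(pop, key=fitness))  # (mu+1) selection: drop a worst one
        if fitness(max(pop, key=fitness)) == n:
            break
    return max(pop, key=fitness)

best = mu_plus_one_ea(mu=5, n=20, fitness=one_max)
print(one_max(best))
```

On OneMax with n = 20, the expected optimization time is far below the iteration budget, so the sketch reliably reaches the all-ones optimum.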
Lower Bounds for Local Search by Quantum Arguments
Abstract

Cited by 33 (2 self)
The problem of finding a local minimum of a black-box function is central for understanding local search as well as quantum adiabatic algorithms. For functions on the Boolean hypercube {0,1}^n, we show a lower bound of Ω(2^{n/4}/n) on the number of queries needed by a quantum computer to solve this problem. More surprisingly, our approach, based on Ambainis’s quantum adversary method, also yields a lower bound of Ω(2^{n/2}/n^2) on the problem’s classical randomized query complexity. This improves and simplifies a 1983 result of Aldous. Finally, in both the randomized and quantum cases, we give the first nontrivial lower bounds for finding local minima on grids of constant dimension d ≥ 3.
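To make the query model concrete, here is a small sketch of greedy local search on the hypercube that counts queries to a black-box function f. It is an assumption for illustration (the paper proves lower bounds for all algorithms, not this one): move to a strictly better Hamming neighbour until none exists.

```python
import random

def local_search_hypercube(f, n, seed=None):
    """Greedy local search on {0,1}^n: repeatedly move to a strictly
    better (lower-valued) Hamming neighbour; stop at a local minimum.
    Returns the local minimum, its value, and the number of queries to f."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    fx = f(x)
    queries = 1
    improved = True
    while improved:
        improved = False
        for i in range(n):  # examine all n Hamming neighbours
            y = x[:]
            y[i] = 1 - y[i]
            fy = f(y)
            queries += 1
            if fy < fx:
                x, fx = y, fy
                improved = True
                break
    return x, fx, queries

# Example: f counts ones, so the unique local (and global) minimum is all zeros.
x, fx, q = local_search_hypercube(lambda v: sum(v), n=16, seed=1)
print(fx, q)
```

Each improving step costs at most n queries, which is exactly the kind of query count the lower bounds in the abstract constrain.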
Theoretical runtime analyses of search algorithms on the test data generation for the triangle classification problem
, 2008
Abstract

Cited by 12 (9 self)
Software Testing plays an important role in the life cycle of software development. Because software testing is very costly and tedious, many techniques have been proposed to automate it. One technique that has achieved good results is the use of Search Algorithms. Because most previous work on search algorithms has been of an empirical nature, there is a need for theoretical results that confirm the feasibility of search algorithms applied to software testing. Such theoretical results might shed light on the limitations and benefits of search algorithms applied in this context. In this paper, we formally analyse the expected runtime of three different search algorithms on the problem of Test Data Generation for an instance of the Triangle Classification program. The search algorithms that we analyse are Random Search, Hill Climbing and the Alternating Variable Method. We believe that this is a necessary first step that will help the Software Engineering community to better understand the role of Search Based Techniques applied to software testing.
On the impact of objective function transformations on evolutionary and black-box algorithms
 In Proc. of the Genetic and Evolutionary Computation Conference (GECCO)
, 2005
Abstract

Cited by 7 (0 self)
Different fitness functions describe different problems. Hence, certain fitness transformations can lead to easier problems although they are still a model of the considered problem. In this paper, the class of neutral transformations for a simple rank-based evolutionary algorithm (EA) is described completely, i.e., the class of functions that maps easy problems for this EA to easy ones and difficult problems to difficult ones. Moreover, the class of neutral transformations for this population-based EA is equal to the class of black-box neutral transformations. Hence, it is a proper superset of the corresponding class for an EA based on fitness-proportional selection, but it is a proper subset of the class for random search. Furthermore, the minimal and maximal classes of neutral transformations are investigated in detail.
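The core notion of a neutral transformation can be illustrated for rank-based selection: such an EA only compares fitness values, so any strictly increasing transformation g leaves every comparison, and hence the algorithm's behavior, unchanged. A small sketch (all names illustrative):

```python
import math

def ranks(values):
    """Rank position of each value in ascending fitness order."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    return [order.index(i) for i in range(len(values))]

population_fitness = [3.0, 1.5, 7.2, 0.4, 5.9]

# A strictly increasing transformation g preserves all pairwise
# comparisons, hence the rankings a rank-based EA relies on.
g = lambda v: math.exp(v) + 2 * v

assert ranks(population_fitness) == ranks([g(v) for v in population_fitness])
print(ranks(population_fitness))
```

A fitness-proportional EA, by contrast, is sensitive to the actual values, which is why its class of neutral transformations is strictly smaller.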
Randomized Search Heuristics as an Alternative to Exact Optimization
 In Logic versus Approximation: Essays Dedicated to Michael M. Richter on the Occasion of His 65th Birthday, volume 3075 of Lecture Notes in Computer Science
, 2004
Abstract

Cited by 4 (0 self)
There are many alternatives to handle discrete optimization problems in applications: problem-specific algorithms vs. heuristics, exact optimization vs. approximation vs. heuristic solutions, guaranteed run time vs. expected run time vs. experimental run time analysis. Here, a framework for a theory of randomized search heuristics is presented. After a brief
Towards a theory of randomized search heuristics
 of Lecture Notes in Computer Science
, 2003
Abstract

Cited by 4 (0 self)
There is a well-developed theory about the algorithmic complexity of optimization problems. Complexity theory provides negative results which typically are based on assumptions like NP ≠ P or NP ≠ RP. Positive
Full theoretical runtime analysis of alternating variable method on the triangle classification problem
 In International Symposium on Search Based Software Engineering (SSBSE)
, 2009
Abstract

Cited by 4 (3 self)
Runtime Analysis is a type of theoretical investigation that aims to determine, via rigorous mathematical proofs, the time a search algorithm needs to find an optimal solution. This type of investigation is useful to understand why a search algorithm can be successful, and it gives insight into how search algorithms work. In previous work, we proved runtime bounds for different search algorithms on test data generation for the Triangle Classification (TC) problem. We theoretically proved that the Alternating Variable Method (AVM) has the best performance on the coverage of the most difficult branch in our empirical study. In this paper, we prove that the runtime of AVM on all the branches of TC is O((log n)^2). That is necessary and sufficient to prove that AVM has a better runtime on TC compared to the other search algorithms we previously analysed. The theorems in this paper are useful for future analyses. In fact, to state that a search algorithm has a worse runtime compared to AVM, it is sufficient to prove that its lower bound is higher than Ω((log n)^2) on the coverage of at least one branch of TC.
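For readers unfamiliar with AVM, the following is a minimal Python sketch of the method: exploratory ±1 moves on one integer variable at a time, followed by accelerating "pattern" moves that double the step size while improvement continues. The branch-distance cost function below (zero iff all three sides are equal and positive, a stand-in for the "equilateral" branch of triangle classification) is an illustrative assumption, not the paper's instrumentation.

```python
def alternating_variable_method(f, x, max_evals=10_000):
    """AVM (pattern search) sketch for minimizing an integer cost f,
    where f(x) == 0 means the target branch is covered."""
    evals = 0
    improved = True
    while improved and f(x) > 0 and evals < max_evals:
        improved = False
        for i in range(len(x)):
            for direction in (+1, -1):
                step = direction
                y = x[:]
                y[i] += step
                evals += 1
                if f(y) < f(x):          # exploratory move succeeded
                    while f(y) < f(x):   # pattern moves: double the step
                        x = y
                        step *= 2
                        y = x[:]
                        y[i] += step
                        evals += 1
                    improved = True
                    break
    return x

# Hypothetical branch distance for the "all sides equal" branch:
# zero iff the three sides are equal and the first is positive.
cost = lambda t: abs(t[0] - t[1]) + abs(t[1] - t[2]) + max(0, 1 - t[0])
result = alternating_variable_method(cost, [100, 3, 57])
print(cost(result))
```

The doubling pattern moves are what give AVM its logarithmic-order behavior on such distances: each variable closes most of its gap in O(log gap) evaluations rather than one unit per step.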
How Randomized Search Heuristics Find Maximum Cliques in Planar Graphs
, 2006
Abstract

Cited by 4 (0 self)
Surprisingly, general search heuristics often solve combinatorial problems quite satisfactorily, although they do not outperform specialized algorithms. Here, the behavior of simple randomized optimizers on the maximum clique problem on planar graphs is investigated rigorously. The focus is on the worst-, average-, and semi-average-case behaviors. In semi-random planar graph models, an adversary is allowed to moderately modify a random planar graph, where a graph is chosen uniformly at random among all planar graphs. With regard to the heuristics, particular interest is given to the influence of the following four popular strategies to overcome local optima: local vs. global search, single- vs. multi-start, small vs. large populations, and elitist vs. non-elitist selection. Finally, the black-box complexities of the planar graph models are analyzed.
Analysis of Computational Time of Simple Estimation of Distribution Algorithms
, 2010
Abstract

Cited by 3 (3 self)
Estimation of distribution algorithms (EDAs) are widely used in stochastic optimization. Impressive experimental results have been reported in the literature. However, little work has been done on analyzing the computation time of EDAs in relation to the problem size. It is still unclear how well EDAs (with a finite population size larger than two) will scale up when the dimension of the optimization problem (problem size) grows. This paper studies the computational time complexity of a simple EDA, i.e., the univariate marginal distribution algorithm (UMDA), in order to gain more insight into the complexity of EDAs. First, we discuss how to measure the computational time complexity of EDAs. A classification of problem hardness based on our discussions is then given. Second, we prove a theorem related to problem hardness and the probability conditions of
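The UMDA studied in this abstract can be sketched in a few lines: estimate an independent per-bit frequency vector from the µ best individuals and sample the next population from it. The parameter values and the margin clamp below are common illustrative choices, not the paper's exact setup.

```python
import random

def umda(fitness, n, pop_size=50, mu=25, generations=200, seed=0):
    """UMDA sketch: maintain a product distribution over {0,1}^n via a
    vector p of marginal probabilities. Each generation: sample pop_size
    individuals, select the mu best, and re-estimate each marginal.
    Frequencies are clamped to [1/n, 1 - 1/n] to avoid premature fixation."""
    rng = random.Random(seed)
    p = [0.5] * n
    best = None
    for _ in range(generations):
        pop = [[1 if rng.random() < p[i] else 0 for i in range(n)]
               for _ in range(pop_size)]
        pop.sort(key=fitness, reverse=True)
        selected = pop[:mu]
        if best is None or fitness(pop[0]) > fitness(best):
            best = pop[0]
        for i in range(n):  # univariate frequency re-estimation
            freq = sum(x[i] for x in selected) / mu
            p[i] = min(1 - 1 / n, max(1 / n, freq))
    return best

best = umda(lambda x: sum(x), n=20)
print(sum(best))
```

Note that the population here is finite and larger than two, which is exactly the regime whose scaling behavior the abstract says is poorly understood.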
Tight Bounds for Blind Search on the Integers
Abstract

Cited by 3 (2 self)
Abstract. We analyze a simple random process in which a token is moved in the interval A = {0, ..., n}: Fix a probability distribution µ over {1, ..., n}. Initially, the token is placed in a random position in A. In round t, a random value d is chosen according to µ. If the token is in position a ≥ d, then it is moved to position a − d. Otherwise it stays put. Let T be the number of rounds until the token reaches position 0. We show tight bounds for the expectation of T for the optimal distribution µ. More precisely, we show that min_µ E_µ(T) = Θ((log n)^2). For the proof, a novel potential function argument is introduced. The research is motivated by the problem of approximating the minimum of a continuous function over [0, 1] with a “blind” optimization strategy.
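The token process is simple enough to simulate directly. The sketch below uses a harmonic distribution µ(d) ∝ 1/d over {1, ..., n} as the step distribution; this is a natural candidate for a (log n)^2-order hitting time, chosen here for illustration rather than taken from the paper's construction.

```python
import random

def rounds_to_zero(n, support, weights, rng):
    """One run of the token process: start uniformly in {0, ..., n};
    each round draw d ~ mu and move from a to a - d whenever a >= d.
    Returns T, the number of rounds until position 0 is reached."""
    a = rng.randint(0, n)
    t = 0
    while a > 0:
        d = rng.choices(support, weights=weights)[0]
        if a >= d:
            a -= d
        t += 1
    return t

n = 1000
support = list(range(1, n + 1))
weights = [1.0 / d for d in support]  # harmonic mu, an illustrative choice
rng = random.Random(42)
avg = sum(rounds_to_zero(n, support, weights, rng) for _ in range(200)) / 200
print(avg)
```

The intuition behind a heavy-tailed µ: large steps close distant gaps quickly, while small steps remain frequent enough to finish the endgame near 0, balancing the two regimes at polylogarithmic cost.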