Results 1–6 of 6
Unbiased Black Box Search Algorithms
Abstract

Cited by 5 (1 self)
We formalize the concept of an unbiased black box algorithm, which generalises the idea previously introduced by Lehre and Witt. Our formalization of bias relates to the symmetry group of the problem class under consideration, establishing a connection with previous work on No Free Lunch. Our definition is motivated and justified by a series of results, including the outcome that given a biased algorithm, there exists a corresponding unbiased algorithm with the same expected behaviour (over the problem class) and equal or better worst-case performance. For the case of evolutionary algorithms, it is already known how to construct unbiased mutation and crossover operators, and we summarise those results.
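As a concrete illustration (my own sketch, not taken from the paper): for bitstrings, a unary variation operator that is unbiased with respect to bit permutations and XOR masks can only depend on the number of bits it flips. Any such operator therefore reduces to sampling a radius k and flipping k uniformly chosen distinct positions. The function names and the example radius distribution below are my own.

```python
import random

def unbiased_mutation(x, radius_dist):
    """Unary unbiased variation on bitstrings: sample a radius k from
    radius_dist, then flip k distinct positions chosen uniformly at
    random. Invariant under bit permutations and XOR masks."""
    k = radius_dist(len(x))
    positions = random.sample(range(len(x)), k)
    y = list(x)
    for i in positions:
        y[i] ^= 1  # flip the selected bit
    return y

def binomial_radius(n):
    """Radius distribution of standard bit mutation: Binomial(n, 1/n)."""
    return sum(random.random() < 1.0 / n for _ in range(n))

parent = [0] * 10
child = unbiased_mutation(parent, binomial_radius)
```

Standard bit mutation with rate 1/n is recovered as the special case where the radius is binomially distributed, which is one way to see that it is unbiased in this sense.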
Beyond No Free Lunch: Realistic Algorithms for Arbitrary Problem Classes, 2010
Abstract

Cited by 3 (0 self)
In this paper we present a simple and general new No Free Lunch-like result that applies to revisiting algorithms searching arbitrary problem sets. We begin by unifying the assumptions of closure under permutation and non-revisiting algorithms. We then propose a new approach to reasoning about search algorithm performance, treating search algorithms as stochastic processes and thereby admitting revisiting; for this approach we need only make the simple assumption that search algorithms are applied for optimisation (i.e. maximisation or minimisation), rather than considering arbitrary performance measures. As a consequence of a proof that non-revisiting enumeration has the best possible expected performance for arbitrary distributions over arbitrary problem sets, we can show that better-than-enumeration performance by some algorithm on some problem set predicts nothing about performance on a second problem set when nothing is known about the relation between the two. Finally, we note that enumeration typically has excellent time and space complexities. Our approach is both simple and powerful; our new No Free Lunch-like result is stronger than the originals in its predictions and, apart from applying only to performance measures that reflect the typical goal of search rather than arbitrary ones, weaker than the originals in its assumptions. It is also directly relevant to research into real-world search algorithms.
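The contrast the abstract draws can be seen in a small simulation (my own illustration, with a toy objective of my choosing): non-revisiting enumeration first evaluates the optimum after at most |X| steps, while uniform random search with revisiting needs |X| evaluations in expectation.

```python
import random

def steps_to_max(visits, f, fmax):
    """Count evaluations until the maximal value is first seen."""
    for t, x in enumerate(visits, 1):
        if f(x) == fmax:
            return t

space = list(range(10))
f = lambda x: -(x - 7) ** 2        # toy objective, single optimum at x = 7
fmax = max(map(f, space))

# Non-revisiting enumeration: visits each point once, finds the
# optimum at step 8 here, and never needs more than |space| steps.
enum_steps = steps_to_max(space, f, fmax)

# Revisiting uniform random search: hits the optimum with probability
# 1/|space| per step, so it needs |space| = 10 evaluations on average.
random.seed(0)
trials = 20000
rand_steps = sum(
    steps_to_max(iter(lambda: random.choice(space), None), f, fmax)
    for _ in range(trials)
) / trials
```

The simulated average for random search concentrates near 10, the worst case for enumeration, matching the expectation-optimality of non-revisiting enumeration claimed above.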
No Free Lunch Theorems: Limitations and Perspectives of Metaheuristics
Abstract

Cited by 1 (0 self)
[...] Thus not only our reason fails us in the discovery of the ultimate connexion of causes and effects, but even after experience has informed us of their constant conjunction, it is impossible for us to satisfy ourselves by our reason, why we should extend that experience beyond those particular instances, which have fallen under our observation. We suppose, but are never able to prove, that there must be a resemblance betwixt those objects, of which we have had experience, and those which lie beyond the reach of our discovery. (David Hume, 1739 [14, 10])

Abstract: The No Free Lunch (NFL) theorems for search and optimization are reviewed and their implications for the design of metaheuristics are discussed. The theorems state that any two search or optimization algorithms are equivalent when their performance is averaged across all possible problems, and even over subsets of problems fulfilling certain constraints. The NFL results show that if no assumption is made regarding the relation between visited and unseen search points, efficient search and optimization is impossible. There is no well-performing universal metaheuristic; heuristics must be tailored to the problem class at hand using prior knowledge. In practice, it is unlikely that the preconditions of the NFL theorems are fulfilled for a problem class, and thus differences between algorithms exist. Tailored algorithms can therefore exploit structure underlying the optimization problem. Given full knowledge about the problem class, it is in theory possible to construct an optimal algorithm.
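The averaging claim can be checked exhaustively on a toy permutation-closed class (my own illustration): the set of all functions from a four-point domain to {0, 1} is closed under permutation, and any two fixed non-revisiting search orders need the same average number of evaluations to first see a maximum. The search orders below are arbitrary choices of mine.

```python
from itertools import product

orders = ([0, 1, 2, 3], [3, 1, 0, 2])  # two non-revisiting search orders

def steps_to_first_max(order, f):
    """Evaluations until a maximal value of f is first observed."""
    fmax = max(f)
    return next(t for t, x in enumerate(order, 1) if f[x] == fmax)

# Average over the permutation-closed class of ALL 16 functions
# f: {0, 1, 2, 3} -> {0, 1}, represented as 4-tuples of values.
averages = [
    sum(steps_to_first_max(order, f) for f in product((0, 1), repeat=4)) / 16
    for order in orders
]
```

Both averages come out equal (to 27/16 on this class), as the NFL theorems predict; relabeling the domain maps the function class onto itself, so no fixed ordering can have an edge on average.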
No Free Lunch and Benchmarks
Abstract

Cited by 1 (0 self)
We extend previous results concerning black-box search algorithms, presenting new theoretical tools related to No Free Lunch (NFL) where functions are restricted to some benchmark (that need not be permutation closed), algorithms are restricted to some collection (that need not be permutation closed) or limited to some number of steps, or the performance measure is given. “Minimax distinctions” are considered from a geometric perspective, and basic results on performance matching are also presented.
A Measure-Theoretic Analysis of Stochastic Optimization
Abstract
This paper proposes a measure-theoretic framework to study iterative stochastic optimizers that provides theoretical tools to explore how the optimization methods may be improved. Within this framework, optimizers form a closed, convex subset of a normed vector space, implying the existence of a distance metric between any two optimizers and a meaningful and computable spectrum of new optimizers between them. It is shown how the formalism applies to evolutionary algorithms in general. The analytic property of continuity is studied in the context of genetic algorithms, revealing the conditions under which approximations such as metamodeling or surrogate methods may be effective. These results demonstrate the power of the proposed analytic framework, which can be used to propose and analyze new techniques such as controlled convex combinations of optimizers, meta-optimization of algorithm parameters, and more.
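One way to picture a "convex combination of optimizers" (a sketch of my own, not the paper's construction): at each step, draw the proposal from optimizer A with probability alpha and from optimizer B otherwise, so the mixture sits on the segment between A and B in the space of optimizers. The two component optimizers and the toy objective below are my own choices.

```python
import random

def hill_climb_step(x, f):
    """Elitist hill climber: try a random neighbour, keep the better point."""
    y = x + random.choice((-1, 1))
    return y if f(y) >= f(x) else x

def random_step(x, f, lo=-10, hi=10):
    """Elitist random search: try a uniform point, keep the better one."""
    y = random.randint(lo, hi)
    return y if f(y) >= f(x) else x

def convex_combination(x, f, alpha=0.7):
    """Mixture optimizer alpha*A + (1 - alpha)*B: follow the hill
    climber with probability alpha, random search otherwise."""
    if random.random() < alpha:
        return hill_climb_step(x, f)
    return random_step(x, f)

random.seed(1)
f = lambda x: -abs(x - 3)          # toy objective, maximum at x = 3
x = -10
for _ in range(200):
    x = convex_combination(x, f)
```

Varying alpha from 0 to 1 sweeps out the spectrum of optimizers between pure random search and pure hill climbing that the framework describes.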