Results 1–10 of 20
Theoretical aspects of evolutionary algorithms
Proc. of the 28th Int. Colloquium on Automata, Languages and Programming (ICALP), LNCS 2076, 2001
Cited by 34 (16 self)
Abstract:
Randomized search heuristics like simulated annealing and evolutionary algorithms are applied successfully in many different situations. However, the theory on these algorithms is still in its infancy. Here it is discussed how and why such a theory should be developed. Afterwards, some fundamental results on evolutionary algorithms are presented in order to show how theoretical results on randomized search heuristics can be proved and how they contribute to the understanding of evolutionary algorithms.
Optimal Throughput-Delay Scaling in Wireless Networks, Part I: The Fluid Model
Cited by 34 (0 self)
Abstract:
Gupta and Kumar (2000) introduced a random model to study throughput scaling in a wireless network with static nodes, and showed that the throughput per source-destination pair is Θ(1/√(n log n)). Grossglauser and Tse (2001) showed that when nodes are mobile it is possible to have a constant throughput scaling per source-destination pair. In most applications delay is also a key metric of network performance. It is expected that high throughput is achieved at the cost of high delay and that one can be improved at the cost of the other. The focus of this paper is on studying this tradeoff for wireless networks in a general framework. Optimal throughput-delay scaling laws for static and mobile wireless networks are established. For static networks, it is shown that the optimal throughput-delay tradeoff is given by D(n) = Θ(nT(n)), where T(n) and D(n) are the throughput and delay scaling, respectively. For mobile networks, a simple proof of the throughput scaling of Θ(1) for the Grossglauser-Tse scheme is given and the associated delay scaling is shown to be Θ(n log n). The optimal throughput-delay tradeoff for mobile networks is also established. To capture physical movement in the real world, a random walk model for node mobility is assumed. It is shown that for throughput of O(1/√(n log n)), which can also be achieved in static networks, the throughput-delay tradeoff is the same as in static networks, i.e., D(n) = Θ(nT(n)). Surprisingly, for almost any throughput of a higher order, the delay is shown to be Θ(n log n), which is the delay for throughput of Θ(1). Our result, thus, suggests that the use of mobility to increase throughput, even slightly, in real-world networks would necessitate an abrupt and very large increase in delay.
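The stated scaling laws can be evaluated numerically. The sketch below is my own illustration, with the constants hidden by the Θ(·) notation set to 1:

```python
import math

def max_static_throughput(n):
    # Gupta-Kumar per-pair throughput scaling, Theta(1 / sqrt(n log n)),
    # with the constant taken to be 1 for illustration.
    return 1.0 / math.sqrt(n * math.log(n))

def static_delay(n, throughput):
    # Static-network tradeoff from the paper: D(n) = Theta(n * T(n)).
    return n * throughput

for n in [100, 10_000, 1_000_000]:
    t = max_static_throughput(n)
    print(f"n={n:>9}: T(n)={t:.2e}, D(n)={static_delay(n, t):.2e}")
```

As n grows, the achievable per-pair throughput shrinks while the delay implied by the tradeoff grows, which is the qualitative tension the paper quantifies.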
Methods For The Analysis Of Evolutionary Algorithms On Pseudo-Boolean Functions
2000
Cited by 31 (0 self)
Abstract:
Many experiments have shown that evolutionary algorithms are useful randomized search heuristics for optimization problems. In order to learn more about the reasons for their efficiency and in order to obtain proven results on evolutionary algorithms it is necessary to develop a theory of evolutionary algorithms. Such a theory is still in its infancy. A major part of a theory is the analysis of different variants of evolutionary algorithms on selected functions. Several results of this kind have been obtained during the last years. Here important analytical tools are presented, discussed, and applied to well-chosen example functions.
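A standard example of this kind of analysis is the (1+1) EA on the pseudo-Boolean function OneMax, whose expected optimization time is known to be Θ(n log n). The sketch below is my own code, not the paper's:

```python
import random

def one_max(x):
    # OneMax: the number of ones in the bit string.
    return sum(x)

def one_plus_one_ea(n, seed=0):
    # (1+1) EA with standard bit mutation: flip each bit independently with
    # probability 1/n; keep the offspring if it is at least as good (elitism).
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    steps = 0
    while one_max(x) < n:
        y = [b ^ (rng.random() < 1.0 / n) for b in x]
        if one_max(y) >= one_max(x):
            x = y
        steps += 1
    return steps

print(one_plus_one_ea(50))  # mutation steps until the all-ones optimum
```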
A probabilistic language based upon sampling functions
In Conference Record of the 32nd Annual ACM Symposium on Principles of Programming Languages, 2005
Cited by 26 (1 self)
Abstract:
As probabilistic computations play an increasing role in solving various problems, researchers have designed probabilistic languages which treat probability distributions as primitive datatypes. Most probabilistic languages, however, focus only on discrete distributions and have limited expressive power. This paper presents a probabilistic language, called λ○, whose expressive power is beyond discrete distributions. The rich expressiveness of λ○ is due to its use of sampling functions, i.e., mappings from the unit interval (0.0, 1.0] to probability domains, in specifying probability distributions. As such, λ○ enables programmers to formally express and reason about sampling methods developed in simulation theory. The use of λ○ is demonstrated with three applications in robotics: robot localization, people tracking, and robotic mapping. All experiments have been carried out with real robots.
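The sampling-function idea can be sketched in plain Python. This is an illustrative analogue of mine, not the λ○ implementation, and Python's `random()` draws from [0.0, 1.0) rather than (0.0, 1.0]:

```python
import random

def uniform(a, b):
    # A distribution is a function from a seed in the unit interval to an outcome.
    return lambda u: a + (b - a) * u

def bernoulli(p):
    # Discrete distributions fit the same representation.
    return lambda u: u <= p

def sample(dist, rng=random.Random(1)):
    # Drawing a sample = applying the sampling function to a fresh random seed.
    return dist(rng.random())

print(sample(uniform(0.0, 10.0)))  # a continuous draw
print(sample(bernoulli(0.5)))      # a discrete draw, same machinery
```

Because continuous and discrete distributions share one representation, samplers compose without special-casing either kind.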
An efficient randomized algorithm for input-queued switch scheduling
In Proc. HOT Interconnects 9 Conf., 2001
Cited by 25 (5 self)
Abstract:
Many networking problems suffer from the so-called curse of dimensionality: that is, although excellent (even optimal) solutions exist for these problems, they do not scale well to high speeds or large systems. In various other situations where deterministic algorithms' scalability is poor, randomized versions of the same algorithms are easier to implement and provide surprisingly good performance. For example, recent work in load balancing [1,2] and document replacement in Web caches [3] provides compelling demonstrations of the effectiveness of these randomized algorithms. Motwani and Raghavan provide other examples and a good introduction to the theory of randomized algorithms [4]. Here, we focus on applying randomization to the design of input-queued (IQ) switch schedulers. We take for granted the effectiveness of the IQ architecture for very high-speed and for large-sized switches. Several references attribute this effectiveness to the IQ architecture's minimal memory bandwidth requirement compared with output-queued and shared-memory architectures. Figure 1 shows the logical structure of an N × N IQ packet switch. We assume the switch operates on fixed-size cells (or packets). Each input has N first-in first-out virtual output queues (VOQs), one for each output. This VOQ architecture avoids performance degradation from the head-of-the-line blocking phenomenon [5]. In each time slot, at most one cell arrives at each input and at most one cell can transfer to an output. When a cell with destination output j arrives at input i, the switch stores it in the VOQ denoted Q_ij. Let the average cell arrival rate at input i for output j be λ_ij. Incoming traffic is admissible if Σ_{i=1..N} λ_ij < 1 for all j, and Σ_{j=1..N} λ_ij < 1 for all i.
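The admissibility condition (every row sum and every column sum of the arrival-rate matrix strictly below 1) can be checked directly. This small helper is my own illustration, not the paper's code:

```python
def admissible(lam):
    # lam[i][j]: average arrival rate at input i for output j, an N x N matrix.
    # Admissible traffic loads no input and no output at rate 1 or more.
    n = len(lam)
    rows_ok = all(sum(lam[i][j] for j in range(n)) < 1 for i in range(n))
    cols_ok = all(sum(lam[i][j] for i in range(n)) < 1 for j in range(n))
    return rows_ok and cols_ok

print(admissible([[0.4, 0.5], [0.5, 0.4]]))  # True: all row/column sums are 0.9
print(admissible([[0.6, 0.5], [0.5, 0.4]]))  # False: input 0 is loaded at 1.1
```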
Real royal road functions for constant population size
In Proc. of GECCO 2003, Genetic and Evolutionary Computation Conference, no. 2724 in LNCS, 2003
Cited by 19 (0 self)
Abstract:
Evolutionary and genetic algorithms (EAs and GAs) are quite successful randomized function optimizers. This success is mainly based on the interaction of different operators like selection, mutation, and crossover. Since this interaction is still not well understood, one is interested in the analysis of the single operators. Jansen and Wegener (2001a) have described so-called real royal road functions where simple steady-state GAs have a polynomial expected optimization time while the success probability of mutation-based EAs is exponentially small even after an exponential number of steps. This success of the GA is based on the crossover operator and a population whose size is moderately increasing with the dimension of the search space. Here new real royal road functions are presented where crossover leads to a small optimization time, although the GA works with the smallest possible population size, namely 2.
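To illustrate the operator in question, here is a sketch of mine of uniform crossover, one common crossover choice, applied to a population of the minimal size 2:

```python
import random

def uniform_crossover(x, y, rng):
    # Each bit of the child is copied from one of the two parents,
    # chosen independently with probability 1/2.
    return [xb if rng.random() < 0.5 else yb for xb, yb in zip(x, y)]

rng = random.Random(0)
population = [[0, 1, 0, 1, 1], [1, 0, 1, 0, 1]]  # smallest possible size: 2
child = uniform_crossover(population[0], population[1], rng)
print(child)  # every bit agrees with at least one parent
```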
Query Strategies for Priced Information
2002
Cited by 15 (2 self)
Abstract:
This paper appeared in "Proceedings of the 32nd Annual ACM Symposium on Theory of Computing," Portland, OR, May 2000. [2] Current affiliation: Department of Computer Science, Princeton University, Princeton, NJ 08544. Most of this work was done while the author was at Stanford University and was visiting the IBM Almaden Research Center. Research at Stanford was supported by the Pierre and Christine Lamond Fellowship, NSF Grant IIS-9811904, and NSF Award CCR-9357849, with matching funds from IBM, Mitsubishi, Schlumberger Foundation, Shell Foundation, and Xerox Corporation. [3] Most of this work was done while the author was visiting the IBM Almaden Research Center. [4] Supported in part by a David and Lucile Packard Foundation Fellowship, an Alfred P. Sloan Research Fellowship, an ONR Young Investigator Award, and NSF Faculty Early Career Development Award CCR-9701399.
On Two Segmentation Problems
1999
Cited by 12 (0 self)
Abstract:
This paper is organized as follows. In Section 2 we consider the hypercube segmentation problem, describe our randomized approximation algorithm, and present its derandomization. In Section 3 we present
On Asymptotically Optimal Methods of Prediction and Adaptive Coding for Markov Sources
Journal of Complexity, 2002
Cited by 10 (9 self)
Abstract:
The problem of predicting a sequence x_1, x_2, ... generated by a discrete source with unknown statistics is considered. Each letter x_{t+1} is predicted using information on the word x_1 x_2 ··· x_t only. In fact, this problem is a classical problem which has received much attention. Its history can be traced back to Laplace. To estimate the efficiency of a method of prediction, three quantities are considered: the precision as given by the Kullback–Leibler divergence, the memory size of the program needed to implement the method on a computer, and the time required, measured by the number of binary operations needed at each time instant. A method is presented for which the memory size and the average time are close to the minimum. The results can readily be translated to results about adaptive coding. © 2001 Elsevier Science (USA)
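The classical baseline behind this line of work, Laplace's predictor, together with the Kullback–Leibler precision measure, can be sketched as follows. This is my own illustration for an i.i.d. binary source; the paper's methods handle Markov sources with near-minimal memory and time:

```python
import math
from collections import Counter

def laplace_predict(word, alphabet):
    # Laplace's rule: P(a) = (count(a) + 1) / (t + |alphabet|) after t letters.
    counts = Counter(word)
    t = len(word)
    return {a: (counts[a] + 1) / (t + len(alphabet)) for a in alphabet}

def kl_divergence(p, q):
    # Kullback-Leibler divergence, the precision measure mentioned above.
    return sum(p[a] * math.log(p[a] / q[a]) for a in p if p[a] > 0)

pred = laplace_predict("0010", "01")
print(pred)  # predicted probabilities for '0' and '1' after seeing 0010
```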
On-Line Paging against Adversarially Biased Random Inputs
Journal of Algorithms, 2002
Cited by 9 (0 self)
Abstract:
In evaluating an algorithm, worstcase analysis can be overly pessimistic.