Results 1–10 of 26
A general limit theorem for recursive algorithms and combinatorial structures
 ANN. APPL. PROBAB.
, 2004
"... Limit laws are proven by the contraction method for random vectors of a recursive nature as they arise as parameters of combinatorial structures such as random trees or recursive algorithms, where we use the Zolotarev metric. In comparison to previous applications of this method, a general transfer ..."
Abstract

Cited by 53 (25 self)
Limit laws are proven by the contraction method for random vectors of a recursive nature as they arise as parameters of combinatorial structures such as random trees or recursive algorithms, where we use the Zolotarev metric. In comparison to previous applications of this method, a general transfer theorem is derived which allows us to establish a limit law on the basis of the recursive structure and to use the asymptotics of the first and second moments of the sequence. In particular, a general asymptotic normality result is obtained by this theorem which typically cannot be handled by the more common ℓ2 metrics. As applications we derive quite automatically many asymptotic limit results ranging from the size of tries or m-ary search trees and path lengths in digital structures to mergesort and parameters of random recursive trees, which were previously shown by different methods one by one. We also obtain a related local density approximation result as well as a global approximation result. For the proofs of these results we establish that a smoothed density distance as well as a smoothed total variation distance can be estimated from above by the Zolotarev metric, which is the main tool in this article.
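The recursive setup treated by the contraction method can be sketched in a generic form (a schematic summary; the precise conditions on coefficients and tolls are in the paper):

```latex
% Generic divide-and-conquer recurrence (distributional identity):
X_n \stackrel{d}{=} \sum_{r=1}^{K} X^{(r)}_{I_r^{(n)}} + b_n,
% where (X^{(1)}_j), \dots, (X^{(K)}_j) are independent copies of (X_j),
% independent of the subproblem sizes (I_1^{(n)}, \dots, I_K^{(n)}) and the toll b_n.
% After normalization, Y_n = (X_n - \mathbb{E} X_n)/\sqrt{\operatorname{Var} X_n},
% the limit Y is characterized as a fixed point of
Y \stackrel{d}{=} \sum_{r=1}^{K} A_r\, Y^{(r)} + b,
% and convergence follows by showing that the induced map on probability
% measures is a contraction in the Zolotarev metric \zeta_s.
```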
Phase Change of Limit Laws in the Quicksort Recurrence Under Varying Toll Functions
, 2001
"... We characterize all limit laws of the quicksort type random variables defined recursively by Xn = X In + X # n1In + Tn when the "toll function" Tn varies and satisfies general conditions, where (Xn ), (X # n ), (I n , Tn ) are independent, Xn . . . , n 1}. When the "toll function" Tn ..."
Abstract

Cited by 44 (18 self)
We characterize all limit laws of the quicksort-type random variables defined recursively by X_n = X_{I_n} + X'_{n-1-I_n} + T_n when the "toll function" T_n varies and satisfies general conditions, where (X_n), (X'_n), (I_n, T_n) are independent, X_n is distributed as X'_n, and I_n is uniformly distributed over {0, ..., n-1}. When the toll function T_n (the cost needed to partition the original problem into smaller subproblems) is small (roughly lim sup_{n→∞} log E(T_n)/log n ≤ 1/2), X_n is asymptotically normally distributed; non-normal limit laws emerge when T_n becomes larger. We give many new examples ranging from the number of exchanges in quicksort to sorting on the broadcast communication model, from an in-situ permutation algorithm to tree traversal algorithms, etc.
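For the prototypical toll T_n = n − 1 (key comparisons in quicksort), the mean of this recurrence can be computed exactly and checked against the classical closed form 2(n+1)H_n − 4n. A minimal sketch (ours, not from the paper; function names are illustrative):

```python
from fractions import Fraction

def mean_recurrence(N, toll=lambda n: n - 1):
    # E(X_n) for the recurrence X_n = X_{I_n} + X'_{n-1-I_n} + T_n,
    # with I_n uniform on {0, ..., n-1}: the two subproblem terms
    # contribute symmetrically, hence the factor 2/n.
    E = [Fraction(0)] * (N + 1)
    for n in range(1, N + 1):
        E[n] = Fraction(toll(n)) + Fraction(2, n) * sum(E[:n])
    return E

E = mean_recurrence(10)
H = lambda n: sum(Fraction(1, k) for k in range(1, n + 1))
# Exact agreement with the classical quicksort mean for all computed n:
assert all(E[n] == 2 * (n + 1) * H(n) - 4 * n for n in range(1, 11))
```

Changing `toll` lets one experiment with the "phase change": larger tolls push the distribution away from normality, as the paper proves.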
Optimal Sampling Strategies in Quicksort and Quickselect
 PROC. OF THE 25TH INTERNATIONAL COLLOQUIUM ON AUTOMATA, LANGUAGES AND PROGRAMMING (ICALP '98), VOLUME 1443 OF LNCS
, 1998
"... It is well known that the performance of quicksort can be substantially improved by selecting the median of a sample of three elements as the pivot of each partitioning stage. This variant is easily generalized to samples of size s = 2k + 1. For large samples the partitions are better as the median ..."
Abstract

Cited by 28 (4 self)
It is well known that the performance of quicksort can be substantially improved by selecting the median of a sample of three elements as the pivot of each partitioning stage. This variant is easily generalized to samples of size s = 2k + 1. For large samples the partitions are better, as the median of the sample gives a more accurate estimate of the median of the array to be sorted, but the number of additional comparisons and exchanges to find the median of the sample also increases. We show that the optimal sample size to minimize the average total cost of quicksort (which includes both comparisons and exchanges) is s = a·√n + o(√n). We also give a closed expression for the constant factor a, which depends on the median-finding algorithm and the costs of elementary comparisons and exchanges. The result above holds in most situations, unless the cost of an exchange exceeds by far the cost of a comparison. In that particular case, it is better to select not the median of...
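The sampling strategy described above can be sketched as follows (an illustrative Python sketch, not the paper's cost-optimized implementation; names are ours, and `k` controls the sample size s = 2k + 1, with k = 1 giving the classic median-of-three variant):

```python
import random

def pivot_by_sampling(a, k=1):
    # Pick the median of a random sample of s = 2k + 1 elements as pivot.
    s = 2 * k + 1
    sample = random.sample(a, min(s, len(a)))
    return sorted(sample)[len(sample) // 2]

def quicksort(a, k=1):
    # Three-way partitioning quicksort around the sampled-median pivot.
    if len(a) <= 1:
        return list(a)
    p = pivot_by_sampling(a, k)
    less    = [x for x in a if x < p]
    equal   = [x for x in a if x == p]
    greater = [x for x in a if x > p]
    return quicksort(less, k) + equal + quicksort(greater, k)
```

The paper's point is quantitative: letting the sample size grow like a·√n (rather than keeping it fixed) minimizes the average total cost.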
Quickselect and the Dickman function
 Combinatorics, Probability and Computing
, 2000
"... We show that the limiting distribution of the number of comparisons used by Hoare's quickselect algorithm when given a random permutation of n elements for finding the mth smallest element, where m = o(n), is the Dickman function. The limiting distribution of the number of exchanges is also derived ..."
Abstract

Cited by 24 (1 self)
We show that the limiting distribution of the number of comparisons used by Hoare's quickselect algorithm when given a random permutation of n elements for finding the mth smallest element, where m = o(n), is the Dickman function. The limiting distribution of the number of exchanges is also derived.

1 Quickselect

Quickselect is one of the simplest and most efficient algorithms in practice for finding specified order statistics in a given sequence. It was invented by Hoare [19] and uses the usual partitioning procedure of quicksort: choose first a partitioning key, say x; regroup the given sequence into two parts corresponding to elements whose values are less than and larger than x, respectively; then decide, according to the size of the smaller subgroup, in which part to continue recursively, or stop if x is the desired order statistic; see Figure 1 for an illustration in terms of binary search trees. For more details, see Guibas [15] and Mahmoud [26]. This algorithm, although inefficient in the worst case, has linear mean when given a sequence of n independent and identically distributed continuous random variables, or equivalently, when given a random permutation of n elements, where, here and throughout this paper, all n! permutations are equally likely. Let C_{n,m} denote the number of comparisons used by quickselect for finding the mth smallest element in a random permutation, where the first partitioning stage uses n − 1 comparisons. Knuth [23] was the first to show, by a differencing argument, that E(C_{n,m}) = 2(n + 3 + (n + 1)H_n − (m + 2)H_m − (n + 3 − m)H_{n+1−m}) for 1 ≤ m ≤ n, where H_m = Σ_{1≤k≤m} 1/k. A more transparent asymptotic approximation is E(C_{n,m}) ~ n f(m/n), where f(α) := 2(1 − α log α − (1 − α) log(1 − α)).
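Knuth's exact formula above can be checked directly for small n by averaging the comparison count of quickselect over all n! permutations. A minimal sketch (ours; first element as pivot, which matches the random-permutation model):

```python
import math
from itertools import permutations
from fractions import Fraction

def quickselect_comparisons(a, m):
    # Comparisons used by quickselect (first element as pivot)
    # to find the m-th smallest element of a (1-based rank m).
    if len(a) <= 1:
        return 0
    pivot = a[0]
    less = [x for x in a[1:] if x < pivot]
    greater = [x for x in a[1:] if x > pivot]
    cost = len(a) - 1              # partitioning stage: n - 1 comparisons
    r = len(less) + 1              # rank of the pivot
    if m == r:
        return cost
    if m < r:
        return cost + quickselect_comparisons(less, m)
    return cost + quickselect_comparisons(greater, m - r)

def harmonic(k):
    return sum(Fraction(1, j) for j in range(1, k + 1))

def knuth_mean(n, m):
    # Knuth's exact formula for E(C_{n,m}).
    return 2 * (n + 3 + (n + 1) * harmonic(n)
                - (m + 2) * harmonic(m)
                - (n + 3 - m) * harmonic(n + 1 - m))

n, m = 6, 3
total = sum(quickselect_comparisons(list(p), m) for p in permutations(range(n)))
avg = Fraction(total, math.factorial(n))   # all n! permutations equally likely
assert avg == knuth_mean(n, m)
```

Exact rational arithmetic (`Fraction`) makes the agreement with the formula an identity rather than a floating-point approximation.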
On the probabilistic worst-case time of "FIND"
 ALGORITHMICA
, 2001
"... We analyze the worstcase number of comparisons Tn of Hoare’s selection algorithm find when the input is a random permutation, and worst case is measured with respect to the rank k. We give a new short proof that Tn/n tends to a limit distribution, and provide new bounds for the limiting distributi ..."
Abstract

Cited by 14 (0 self)
We analyze the worst-case number of comparisons T_n of Hoare's selection algorithm Find when the input is a random permutation, and the worst case is measured with respect to the rank k. We give a new short proof that T_n/n tends to a limit distribution, and provide new bounds for the limiting distribution.
Limit laws for partial match queries in quadtrees
 ANN. APPL. PROBAB
, 2001
"... It is proved that in an idealized uniform probabilistic model the cost of a partial match query in a multidimensional quadtree after normalization converges in distribution. The limiting distribution is given as a fixed point of a random affine operator. Also a firstorder asymptoticexpansion for th ..."
Abstract

Cited by 10 (4 self)
It is proved that in an idealized uniform probabilistic model the cost of a partial match query in a multidimensional quadtree, after normalization, converges in distribution. The limiting distribution is given as a fixed point of a random affine operator. A first-order asymptotic expansion for the variance of the cost is also derived, and results on exponential moments are given. The analysis is based on the contraction method.
Density Approximation and Exact Simulation of Random Variables that are Solutions of Fixed-Point Equations
 Adv. Appl. Probab
, 2002
"... An algorithm is developed for the exact simulation from distributions that are defined as fixedpoints of maps between spaces of probability measures. The fixedpoints of the class of maps under consideration include examples of limit distributions of random variables studied in the probabilistic an ..."
Abstract

Cited by 10 (6 self)
An algorithm is developed for exact simulation from distributions that are defined as fixed points of maps between spaces of probability measures. The fixed points of the class of maps under consideration include examples of limit distributions of random variables studied in the probabilistic analysis of algorithms. Approximating sequences for the densities of the fixed points, with explicit error bounds, are constructed. The sampling algorithm relies on a modified rejection method. AMS subject classifications. Primary: 65C10; secondary: 65C05, 68U20, 11K45.
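A concrete instance of such a fixed-point equation is Rösler's characterization of the normalized quicksort limit, given here as a standard example of the kind of target distribution such exact-simulation algorithms address:

```latex
% Normalized quicksort comparison count: the limit Y satisfies
Y \stackrel{d}{=} U\,Y + (1-U)\,Y' + C(U),
% where U is uniform on [0,1], Y' is an independent copy of Y,
% all three are independent, and the toll is
C(u) = 1 + 2u\ln u + 2(1-u)\ln(1-u).
% The paper constructs density approximations with explicit error bounds
% for such fixed points and samples exactly via a modified rejection method.
```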
Analysis of the expected number of bit comparisons required by Quickselect
 Algorithmica
"... When algorithms for sorting and searching are applied to keys that are represented as bit strings, we can quantify the performance of the algorithms not only in terms of the number of key comparisons required by the algorithms but also in terms of the number of bit comparisons. Some of the standard ..."
Abstract

Cited by 8 (4 self)
When algorithms for sorting and searching are applied to keys that are represented as bit strings, we can quantify the performance of the algorithms not only in terms of the number of key comparisons required by the algorithms but also in terms of the number of bit comparisons. Some of the standard sorting and searching algorithms have been analyzed with respect to key comparisons but not with respect to bit comparisons. In this paper, we investigate the expected number of bit comparisons required by Quickselect (also known as Find). We develop exact and asymptotic formulae for the expected number of bit comparisons required to find the smallest or largest key by Quickselect and show that the expectation is asymptotically linear with respect to the number of keys. Similar results are obtained for the average case. For finding keys of arbitrary rank, we derive an exact formula for the expected number of bit comparisons that (using rational arithmetic) requires only finite summation (rather than such operations as numerical integration) and use it to compute the expectation for each target rank. AMS 2000 subject classifications. Primary 68W40; secondary 68P10, 60C05. Key words and phrases. Quickselect, Find, searching algorithms, asymptotics, average-case analysis, key comparisons, bit comparisons.
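The bit-comparison cost model can be made concrete: deciding the order of two distinct keys, scanning from the most significant bit, costs one comparison per bit of common prefix plus one for the first differing bit. A small illustrative sketch (the function name and fixed word size are our assumptions):

```python
def bit_comparisons(x, y, bits=32):
    # Count bit comparisons to decide the order of x and y:
    # one comparison per bit, most significant first, stopping
    # at the first position where the keys differ.
    for i in range(bits):
        shift = bits - 1 - i
        if (x >> shift) & 1 != (y >> shift) & 1:
            return i + 1
    return bits  # equal keys: every bit was compared

# e.g. 0b1010 vs 0b1000 share the prefix "10" and differ at the third bit,
# so deciding their order takes 3 bit comparisons (but 1 key comparison).
```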
Distributional convergence for the number of symbol comparisons used by QuickSort
, 2012
"... Most previous studies of the sorting algorithm QuickSort have used the number of key comparisons as a measure of the cost of executing the algorithm. Here we suppose that the n independent and identically distributed (iid) keys are each represented as a sequence of symbols from a probabilistic sourc ..."
Abstract

Cited by 8 (3 self)
Most previous studies of the sorting algorithm QuickSort have used the number of key comparisons as a measure of the cost of executing the algorithm. Here we suppose that the n independent and identically distributed (iid) keys are each represented as a sequence of symbols from a probabilistic source and that QuickSort operates on individual symbols, and we measure the execution cost as the number of symbol comparisons. Assuming only a mild “tameness” condition on the source, we show that there is a limiting distribution for the number of symbol comparisons after normalization: first centering by the mean and then dividing by n. Additionally, under a condition that grows more restrictive as p increases, we have convergence of moments of orders p and smaller. In particular, we have convergence in distribution and convergence of moments of every order whenever the source is memoryless, i.e., whenever each key is generated as an infinite string of iid symbols. This is somewhat surprising: Even for the classical model that each key is an iid string of unbiased (“fair”) bits, the mean exhibits periodic fluctuations of order n.
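The symbol-comparison cost model can be illustrated with a quicksort whose comparison routine tallies symbol comparisons instead of key comparisons: comparing two distinct keys costs their common prefix length plus one. An illustrative sketch (names are ours; no probabilistic source or tameness condition is modeled here):

```python
def compare_counting_symbols(u, v, counter):
    # Lexicographic comparison of two symbol strings that tallies
    # the number of symbol comparisons performed (common prefix + 1).
    for i in range(min(len(u), len(v))):
        counter[0] += 1
        if u[i] != v[i]:
            return -1 if u[i] < v[i] else 1
    return (len(u) > len(v)) - (len(u) < len(v))

def quicksort_symbol_cost(keys):
    # Quicksort (first key as pivot) returning the sorted keys and the
    # total number of symbol comparisons spent.
    counter = [0]
    def qs(a):
        if len(a) <= 1:
            return a
        p = a[0]
        less, equal, greater = [], [p], []
        for x in a[1:]:
            c = compare_counting_symbols(x, p, counter)
            (less if c < 0 else greater if c > 0 else equal).append(x)
        return qs(less) + equal + qs(greater)
    return qs(list(keys)), counter[0]
```

Keys sharing long prefixes are more expensive to separate, which is exactly why the symbol-comparison cost is normalized by n (rather than n log n, as for key comparisons) in the result above.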