Results 1–8 of 8
Phase Change of Limit Laws in the Quicksort Recurrence Under Varying Toll Functions
, 2001
Abstract

Cited by 52 (17 self)
We characterize all limit laws of the quicksort-type random variables defined recursively by X_n = X_{I_n} + X'_{n-1-I_n} + T_n when the "toll function" T_n varies and satisfies general conditions, where (X_n), (X'_n), (I_n, T_n) are independent, X_n is distributed as X'_n, and I_n is uniformly distributed over {0, ..., n-1}. When the "toll function" T_n (the cost needed to partition the original problem into smaller subproblems) is small (roughly lim sup_{n→∞} log E(T_n)/log n ≤ 1/2), X_n is asymptotically normally distributed; non-normal limit laws emerge when T_n becomes larger. We give many new examples ranging from the number of exchanges in quicksort to sorting on the broadcast communication model, from an in-situ permutation algorithm to tree traversal algorithms, etc.
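Conditioning the recurrence above on the uniform index I_n gives the mean recursion E(X_n) = E(T_n) + (2/n) Σ_{0≤k<n} E(X_k). A minimal sketch of this computation (the function name and the choice of toll T_n = n − 1, the comparison cost of one quicksort partition, are illustrative assumptions, not taken from the paper):

```python
from fractions import Fraction

def mean_costs(n_max, toll):
    """Exact means E(X_n) of the recurrence X_n = X_{I_n} + X'_{n-1-I_n} + T_n,
    via E(X_n) = E(T_n) + (2/n) * sum_{k<n} E(X_k) with I_n uniform on
    {0, ..., n-1}.  Returns [E(X_0), ..., E(X_{n_max})] as exact fractions."""
    mu = [Fraction(0)]                      # E(X_0) = 0: the empty problem
    for n in range(1, n_max + 1):
        mu.append(Fraction(toll(n)) + Fraction(2, n) * sum(mu))
    return mu
```

With toll(n) = n − 1 this reproduces the classic quicksort comparison mean 2(n+1)H_n − 4n; other tolls in the paper's range can be plugged in the same way.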
On a multivariate contraction method for random recursive structures with applications to Quicksort
, 2001
Abstract

Cited by 35 (17 self)
The contraction method for recursive algorithms is extended to the multivariate analysis of vectors of parameters of recursive structures and algorithms. We prove a general multivariate limit law which also leads to an approach to asymptotic covariances and correlations of the parameters. As an application, the asymptotic correlations and a bivariate limit law for the number of key comparisons and exchanges of median-of-(2t+1) Quicksort are given. Moreover, for the Quicksort programs analyzed by Sedgewick, the exact order of the standard deviation and a limit law follow, considering all the parameters counted by Sedgewick.
Quickselect and the Dickman function
 Combinatorics, Probability and Computing
, 2000
Abstract

Cited by 25 (1 self)
We show that the limiting distribution of the number of comparisons used by Hoare's quickselect algorithm when given a random permutation of n elements for finding the mth smallest element, where m = o(n), is the Dickman function. The limiting distribution of the number of exchanges is also derived. 1 Quickselect Quickselect is one of the simplest and most efficient algorithms in practice for finding specified order statistics in a given sequence. It was invented by Hoare [19] and uses the usual partitioning procedure of quicksort: first choose a partitioning key, say x; regroup the given sequence into two parts corresponding to elements whose values are less than and larger than x, respectively; then decide, according to the size of the smaller subgroup, in which part to continue recursively, or stop if x is the desired order statistic; see Figure 1 for an illustration in terms of binary search trees. For more details, see Guibas [15] and Mahmoud [26]. This algorithm, although inefficient in the worst case, has linear mean when given a sequence of n independent and identically distributed continuous random variables, or equivalently, when given a random permutation of n elements, where, here and throughout this paper, all n! permutations are equally likely. Let C_{n,m} denote the number of comparisons used by quickselect for finding the mth smallest element in a random permutation, where the first partitioning stage uses n − 1 comparisons. Knuth [23] was the first to show, by a differencing argument, that E(C_{n,m}) = 2(n + 3 + (n+1)H_n − (m+2)H_m − (n+3−m)H_{n+1−m}), 1 ≤ m ≤ n, where H_m = Σ_{1≤k≤m} 1/k. A more transparent asymptotic approximation is E(C_{n,m}) ~ n f(m/n), where f(α) := 2(1 − α log α − (1 − α) log(1 − α)). ...
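The partitioning procedure described in this abstract is easy to make concrete; the following is an illustrative sketch under the stated scheme (first element as pivot, recursion into the side containing the mth smallest), not code from the paper:

```python
def quickselect(seq, m):
    """Return the m-th smallest element (1-indexed) of seq using the
    scheme described above: partition around the first element, then
    recurse into the part that contains the desired order statistic."""
    pivot = seq[0]
    smaller = [x for x in seq[1:] if x < pivot]   # the n - 1 comparisons
    larger = [x for x in seq[1:] if x >= pivot]   # of the first stage
    if m <= len(smaller):
        return quickselect(smaller, m)            # statistic lies on the left
    if m == len(smaller) + 1:
        return pivot                              # pivot is the answer
    return quickselect(larger, m - len(smaller) - 1)
```

On a random permutation this uses a linear expected number of comparisons, in line with the mean formula quoted above.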
The Wiener index of random trees
, 2001
Abstract

Cited by 24 (3 self)
The Wiener index is analyzed for random recursive trees and random binary search trees in the uniform probabilistic models. We obtain the expectations, asymptotics for the variances, and limit laws for this parameter. The limit distributions are characterized as the projections of bivariate measures that satisfy certain fixed-point equations. Covariances, asymptotic correlations, and bivariate limit laws for the Wiener index and the internal path length are given.
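The Wiener index of a tree is the sum of the distances over all unordered pairs of nodes. A small illustrative sketch (helper names are assumptions, not from the paper) that builds a random recursive tree in the uniform model and evaluates the index directly:

```python
import random
from itertools import combinations

def random_recursive_tree(n, rng=random):
    """Uniform model: node j in 1..n-1 attaches to a uniformly
    random earlier node."""
    return {0: None, **{j: rng.randrange(j) for j in range(1, n)}}

def wiener_index(parent):
    """Sum of pairwise distances, computed via root paths
    (quadratic, fine for small trees)."""
    def root_path(v):
        path = []
        while v is not None:
            path.append(v)
            v = parent[v]
        return path
    paths = {v: root_path(v) for v in parent}
    total = 0
    for u, v in combinations(parent, 2):
        depth_u = {node: i for i, node in enumerate(paths[u])}
        # dist(u, v) = steps from v up to the lowest common
        # ancestor, plus steps from that ancestor down to u
        for j, node in enumerate(paths[v]):
            if node in depth_u:
                total += depth_u[node] + j
                break
    return total
```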
Multivariate Aspects of the Contraction Method
, 2003
Abstract

Cited by 8 (0 self)
We survey multivariate limit theorems in the framework of the contraction method for recursive sequences as arising in the analysis of algorithms, random trees or branching processes. We compare and improve various general conditions under which limit laws can be obtained, state related open problems and give applications to the analysis of algorithms and branching recurrences.
On Smoothed Analysis of Quicksort and Hoare’s Find
Abstract
We provide a smoothed analysis of Hoare’s find algorithm and we revisit the smoothed analysis of quicksort. Hoare’s find algorithm – often called quickselect – is an easy-to-implement algorithm for finding the kth smallest element of a sequence. While the worst-case number of comparisons that Hoare’s find needs is Θ(n^2), the average-case number is Θ(n). We analyze what happens between these two extremes by providing a smoothed analysis of the algorithm in terms of two different perturbation models: additive noise and partial permutations. In the first model, an adversary specifies a sequence of n numbers from [0, 1], and then each number is perturbed by adding a random number drawn from the interval [0, d]. We prove that Hoare’s find needs Θ((n/(d+1)) √(n/d) + n) comparisons in expectation if the adversary may also specify the element that we would like to find. Furthermore, we show that Hoare’s find needs fewer comparisons for finding the median. In the second model, each element is marked with probability p and then a random permutation is applied to the marked elements. We prove that the expected number of comparisons to find the median is in Ω((1 − p)(n/p) log n), which is again tight. Finally, we provide lower bounds for the smoothed number of comparisons of quicksort and Hoare’s find for the median-of-three pivot rule, which usually yields faster algorithms than always selecting the first element: the pivot is the median of the first, middle, and last element of the sequence. We show that median-of-three does not yield a significant improvement over the classic rule: the lower bounds for the classic rule carry over to median-of-three.
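The partial-permutations model described above is simple to state in code; a hypothetical sketch (names are illustrative) that marks each position independently with probability p and randomly permutes only the marked values:

```python
import random

def partial_permutation(seq, p, rng=random):
    """Perturb seq as in the second model above: mark each position
    independently with probability p, then apply a uniformly random
    permutation to the values at the marked positions only."""
    marked = [i for i in range(len(seq)) if rng.random() < p]
    values = [seq[i] for i in marked]
    rng.shuffle(values)                 # uniform permutation of marked values
    out = list(seq)
    for i, v in zip(marked, values):
        out[i] = v
    return out
```

With p = 0 the input is untouched (the worst-case regime); with p = 1 it becomes a uniformly random permutation (the average-case regime), matching the interpolation the abstract analyzes.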