Results 1 – 9 of 9
On Convergence Rates in the Central Limit Theorems for Combinatorial Structures
, 1998
"... Flajolet and Soria established several central limit theorems for the parameter "number of components" in a wide class of combinatorial structures. In this paper, we shall prove a simple theorem which applies to characterize the convergence rates in their central limit theorems. This theorem is a ..."
Abstract

Cited by 67 (8 self)
 Add to MetaCart
Flajolet and Soria established several central limit theorems for the parameter "number of components" in a wide class of combinatorial structures. In this paper, we shall prove a simple theorem which can be applied to characterize the convergence rates in their central limit theorems. This theorem is also applicable to arithmetical functions. Moreover, asymptotic expressions are derived for moments of integral order. Many examples from different applications are discussed.
On the Performance of WEAKHEAPSORT
, 2000
"... . Dutton #1993# presents a further HEAPSORT variant called WEAKHEAPSORT, which also contains a new data structure for priority queues. The sorting algorithm and the underlying data structure are analyzed showing that WEAKHEAPSORT is the best HEAPSORT variant and that it has a lot of nice propert ..."
Abstract

Cited by 6 (2 self)
 Add to MetaCart
Dutton (1993) presents a further HEAPSORT variant called WEAKHEAPSORT, which also contains a new data structure for priority queues. The sorting algorithm and the underlying data structure are analyzed, showing that WEAKHEAPSORT is the best HEAPSORT variant and that it has many nice properties. It is shown that the worst-case number of comparisons is n⌈log n⌉ − 2^⌈log n⌉ + n − ⌈log n⌉ ≤ n log n + 0.1n, and that weak heaps can be generated with n − 1 comparisons. A double-ended priority queue based on weak heaps can be generated in n + ⌈n/2⌉ − 2 comparisons. Moreover, examples for the worst and the best case of WEAKHEAPSORT are presented, the number of weak heaps on {1,...,n} is determined, and experiments on the average case are reported.
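The worst-case bound quoted in this abstract is easy to check numerically. The following sketch evaluates the formula for a few sizes; the formula is taken from the abstract, while the function name is ours:

```python
import math

def weakheapsort_worst_case(n: int) -> int:
    """Worst-case comparisons claimed for WEAKHEAPSORT:
    n*ceil(log2 n) - 2^ceil(log2 n) + n - ceil(log2 n)."""
    k = math.ceil(math.log2(n))
    return n * k - 2 ** k + n - k

# The abstract claims this never exceeds n log2 n + 0.1 n.
for n in [10, 100, 1000, 10 ** 6]:
    assert weakheapsort_worst_case(n) <= n * math.log2(n) + 0.1 * n
```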
On the Number of Heaps and the Cost of Heap Construction
, 2001
"... Heaps constitute a wellknown data structure allowing the implementation of an e#cient O(n log n) sorting algorithm as well as the design of fast priority queues. Although heaps have been known for long, their combinatorial properties are still partially worked out: exact summation formulae have be ..."
Abstract

Cited by 6 (2 self)
 Add to MetaCart
Heaps constitute a well-known data structure allowing the implementation of an efficient O(n log n) sorting algorithm as well as the design of fast priority queues. Although heaps have been known for a long time, their combinatorial properties are still only partially worked out: exact summation formulae have been stated, but most of the asymptotic behaviors are still unknown. In this paper, we present a number of general (not restricted to special subsequences) asymptotic results that give insight into the difficulties encountered in the asymptotic study of the number of heaps of a given size and of the cost of heap construction. In particular we exhibit the influence of arithmetic functions in the apparently chaotic behavior of these quantities. It is also shown that the distribution function of the cost of heap construction using Floyd's algorithm and other variants is asymptotically normal.
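As an aside, the exact count of heaps that this abstract refers to follows a classical product formula (the standard result, not anything specific to the paper): the number of max-heaps on {1,...,n} equals n! divided by the product of the subtree sizes. A small sketch, with helper names of our choosing:

```python
import math

def subtree_size(i: int, n: int) -> int:
    """Size of the subtree rooted at 1-based position i of an n-node binary heap."""
    size = 0
    level = [i]
    while level:
        size += len(level)
        level = [c for j in level for c in (2 * j, 2 * j + 1) if c <= n]
    return size

def num_heaps(n: int) -> int:
    """Number of max-heaps on {1,...,n}: n! / product of all subtree sizes."""
    prod = 1
    for i in range(1, n + 1):
        prod *= subtree_size(i, n)
    return math.factorial(n) // prod

# Small cases: one heap of size 1 or 2, two of size 3, three of size 4.
assert [num_heaps(n) for n in range(1, 5)] == [1, 1, 2, 3]
```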
Implementing HEAPSORT with n log n − 0.9n and QUICKSORT with n log n + 0.2n Comparisons
 ACM Journal of Experimental Algorithms
, 2002
"... With refinements to the WEAKHEAPSORT... ..."
Pushing the Limits in Sequential Sorting
 Proceedings of the 4th International Workshop on Algorithm Engineering (WAE 2000)
, 2000
"... With refinements to the WEAKHEAPSORT algorithm we establish the general and practical relevant sequential sorting algorithm RELAXEDWEAKHEAPSORT executing exactly ndlog ne#2 dlog ne + 1 # n log n # 0:9n comparisons on any given input. The number of transpositions is bounded by n plus the number of ..."
Abstract

Cited by 2 (0 self)
 Add to MetaCart
With refinements to the WEAKHEAPSORT algorithm we establish the general and practically relevant sequential sorting algorithm RELAXEDWEAKHEAPSORT, executing exactly n⌈log n⌉ − 2^⌈log n⌉ + 1 ≤ n log n − 0.9n comparisons on any given input. The number of transpositions is bounded by n plus the number of comparisons. Experiments show that RELAXEDWEAKHEAPSORT only requires O(n) extra bits. Even if this space is not available, with QUICKWEAKHEAPSORT we propose an efficient QUICKSORT variant with n log n + 0.2n + o(n) comparisons on average. Furthermore, we present data showing that WEAKHEAPSORT, RELAXEDWEAKHEAPSORT and QUICKWEAKHEAPSORT beat other performant QUICKSORT and HEAPSORT variants even for moderate values of n.
A New Data Structure for Heapsort with Improved Number of Comparisons (Extended Abstract)
"... Abstract. In this paper we present a new data structure for implementing heapsort algorithm for pairs of which can be simultaneously stored and processed in a single register. Since time complexity of Carlsson type variants of heapsort has already achieved a leading coefficient of 1, concretely nlg ..."
Abstract

Cited by 1 (0 self)
 Add to MetaCart
Abstract. In this paper we present a new data structure for implementing the heapsort algorithm for pairs of elements which can be simultaneously stored and processed in a single register. Since the time complexity of Carlsson-type variants of heapsort has already achieved a leading coefficient of 1, concretely n lg n + n lg lg n, and lower-bound theory asserts that no comparison-based in-place sorting algorithm can sort n data in fewer than ⌈lg(n!)⌉ ≈ n lg n − 1.44n comparisons on the average, any improvement in the number of comparisons can only be achieved in the lower-order terms. Our new data structure results in an improvement in the linear term of the time-complexity function irrespective of the variant of the heapsort algorithm used. This improvement is important in the context that some variants of the heapsort algorithm, for example weak heapsort (although not in-place), are near-optimal, being away from the theoretical bound on the number of comparisons by only 1.54n.
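The information-theoretic bound cited in this abstract, ⌈lg(n!)⌉ ≈ n lg n − 1.44n, can be reproduced with a few lines; the constant 1.44 is lg e ≈ 1.4427. A minimal sketch:

```python
import math

def lg_factorial(n: int) -> float:
    """lg(n!) computed as a sum of logs (exact enough for a sanity check)."""
    return sum(math.log2(k) for k in range(2, n + 1))

# Stirling-style approximation: lg(n!) ~ n lg n - n lg e, with lg e ~ 1.4427.
for n in [100, 1000, 10000]:
    approx = n * math.log2(n) - n * math.log2(math.e)
    assert abs(lg_factorial(n) - approx) / lg_factorial(n) < 0.02
```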
Comparative Performance Study of Improved Heap Sort Algorithm on Different Hardware
"... Abstract: Problem statement: Several efficient algorithms were developed to cope with the popular task of sorting. Improved heap sort is a new variant of heap sort. Basic idea of new algorithm is similar to classical Heap sort algorithm but it builds heap in another way. The improved heap sort algor ..."
Abstract

Cited by 1 (0 self)
 Add to MetaCart
Abstract: Problem statement: Several efficient algorithms have been developed to cope with the popular task of sorting. Improved heap sort is a new variant of heap sort. The basic idea of the new algorithm is similar to the classical heap sort algorithm, but it builds the heap in another way. The improved heap sort algorithm requires n log n − 0.788928n comparisons in the worst case and n log n − n comparisons in the average case. This algorithm uses only one comparison at each node. Hardware has an impact on the performance of an algorithm. Since improved heap sort is a new algorithm, its performance on different hardware needs to be measured. Approach: In this comparative study the mathematical results of improved heap sort were verified experimentally on different hardware. To have experimental data to sustain this comparison, five representative hardware platforms were chosen; the code was executed and the execution time was noted to verify and analyze the performance. Results: The impact of hardware on the performance of the improved heap sort algorithm was shown. The performance of the algorithm also varied for different datasets. Conclusion: The improved heap sort algorithm's performance was found to be better than traditional heap sort on different hardware, and on certain hardware it was found best. Key words: Complexity, performance of algorithms, sorting
Elementary average case analysis of Floyd's algorithms to construct heaps
"... We reanalyse the average number of comparisons and assignments made during a heap construction by two Floyd's algorithms: the original siftup algorithm and its more ecient version. These gures are derived from the average number of comparisons and assignments made on the path from the root of the he ..."
Abstract
 Add to MetaCart
We reanalyse the average number of comparisons and assignments made during heap construction by two of Floyd's algorithms: the original siftup algorithm and its more efficient version. These figures are derived from the average number of comparisons and assignments made on the path from the root of the heap to the final place of the root element, i.e., during one execution of the algorithms. We have two objectives in the analysis: first, we show that the analysis can be done with elementary calculations, involving only simple recursions and sums, and is therefore accessible to a wider audience; second, these techniques give precise answers which are slightly stronger than the existing ones.
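The bottom-up heap construction that this analysis concerns can be sketched as follows. This is the textbook form of Floyd's algorithm with a comparison counter bolted on, not the paper's own instrumentation; variable names are ours:

```python
def build_heap(a):
    """Floyd's bottom-up max-heap construction on list a; returns the number
    of element comparisons performed by the sift-down calls."""
    n = len(a)
    comparisons = 0

    def sift_down(i):
        nonlocal comparisons
        while 2 * i + 1 < n:
            child = 2 * i + 1
            if child + 1 < n:            # pick the larger of the two children
                comparisons += 1
                if a[child + 1] > a[child]:
                    child += 1
            comparisons += 1             # compare parent with that child
            if a[i] >= a[child]:
                break
            a[i], a[child] = a[child], a[i]
            i = child

    for i in range(n // 2 - 1, -1, -1):  # last internal node down to the root
        sift_down(i)
    return comparisons

a = [5, 1, 9, 3, 7, 2, 8]
build_heap(a)
# Heap property: every parent dominates both of its children.
assert all(a[i] >= a[2 * i + 1] for i in range(len(a)) if 2 * i + 1 < len(a))
assert all(a[i] >= a[2 * i + 2] for i in range(len(a)) if 2 * i + 2 < len(a))
```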
Analysis of Modified Heap Sort Algorithm on Different Environment
"... sorting algorithm is an algorithm that puts elements of a list in a certain order i.e. ascending or descending. Sorting is perhaps the most widely studied problem in computer science and is frequently used as a benchmark of a system’s performance. This paper presented the comparative performance stu ..."
Abstract
 Add to MetaCart
A sorting algorithm is an algorithm that puts the elements of a list in a certain order, i.e., ascending or descending. Sorting is perhaps the most widely studied problem in computer science and is frequently used as a benchmark of a system's performance. This paper presents a comparative performance study of four sorting algorithms on different platforms. For each machine, it was found that the relative performance of the algorithms depends upon the number of elements to be sorted. In addition, as expected, the results show that the relative performance of the algorithms differed on the various machines. So algorithm performance depends on data size, and there is an impact of hardware as well. Keywords—Algorithm, Analysis, Complexity, Sorting.