Results 1 – 6 of 6
A Lower Bound on the Average-Case Complexity of Shellsort
, 1999
"... We give a general lower bound on the average-case complexity of Shellsort: the average number of data movements (and comparisons) made by a p-pass Shellsort for any incremental sequence is Ω(p n^{1+1/p}) for every p. The proof is an example of the use of Kolmogorov complexity (the incompr ..."
Abstract

Cited by 11 (5 self)
We give a general lower bound on the average-case complexity of Shellsort: the average number of data movements (and comparisons) made by a p-pass Shellsort for any incremental sequence is Ω(p n^{1+1/p}) for every p. The proof is an example of the use of Kolmogorov complexity (the incompressibility method) in the analysis of algorithms. 1 Introduction The question of a nontrivial general lower bound (or upper bound) on the average complexity of Shellsort (due to D.L. Shell [14]) has been open for about four decades [5, 13]. We present such a lower bound for p-pass Shellsort for every p. Shellsort sorts a list of n elements in p passes using a sequence of increments h_1, …, h_p. In the k-th pass the main list is divided into h_k separate sublists of length ⌈n/h_k⌉, where the i-th sublist consists of the elements at positions j with j mod h_k = i − 1 of the main list (i = 1, …, h_k). Every sublist is sorted using a straightforward insertion sort. The effi...
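The pass structure just described can be sketched as follows (a minimal illustration, not the authors' code; the increment sequence [4, 2, 1] is an arbitrary 3-pass example):

```python
def shellsort(a, increments):
    """p-pass Shellsort: one pass per increment h_k. In the pass with
    increment h, the elements at positions i, i+h, i+2h, ... form the
    sublists, each sorted by straightforward insertion sort.
    Returns the total number of data movements made."""
    moves = 0
    for h in increments:
        for i in range(h, len(a)):
            x = a[i]
            j = i
            while j >= h and a[j - h] > x:
                a[j] = a[j - h]   # one data movement within a sublist
                j -= h
                moves += 1
            a[j] = x
    return moves

data = [5, 2, 9, 1, 7, 3, 8, 6, 4]
shellsort(data, [4, 2, 1])        # a 3-pass example; data is now sorted
```

The lower bound above concerns exactly the quantity `moves` returned here, averaged over all input permutations.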
On the Performance of WEAK-HEAPSORT
, 2000
"... Dutton (1993) presents a further HEAPSORT variant called WEAK-HEAPSORT, which also contains a new data structure for priority queues. The sorting algorithm and the underlying data structure are analyzed showing that WEAK-HEAPSORT is the best HEAPSORT variant and that it has a lot of nice propert ..."
Abstract

Cited by 8 (4 self)
Dutton (1993) presents a further HEAPSORT variant called WEAK-HEAPSORT, which also contains a new data structure for priority queues. The sorting algorithm and the underlying data structure are analyzed, showing that WEAK-HEAPSORT is the best HEAPSORT variant and that it has a lot of nice properties. It is shown that the worst-case number of comparisons is n⌈log n⌉ − 2^⌈log n⌉ + n − ⌈log n⌉ ≤ n log n + 0.1n, and that weak heaps can be generated with n − 1 comparisons. A double-ended priority queue based on weak heaps can be generated in n + ⌈n/2⌉ − 2 comparisons. Moreover, examples for the worst and the best case of WEAK-HEAPSORT are presented, the number of weak heaps on {1, …, n} is determined, and experiments on the average case are reported.
Asymptotic Complexity from Experiments? A Case Study for Randomized Algorithms
 In Proceedings of the 4th Workshop on Algorithm Engineering (WAE'00)
, 2000
"... In the analysis of algorithms we are usually interested in obtaining closed-form expressions for their complexity, or at least asymptotic expressions in O()-notation. Unfortunately, there are fundamental reasons why we cannot obtain such expressions from experiments. This paper explains how we can ..."
Abstract

Cited by 7 (2 self)
In the analysis of algorithms we are usually interested in obtaining closed-form expressions for their complexity, or at least asymptotic expressions in O()-notation. Unfortunately, there are fundamental reasons why we cannot obtain such expressions from experiments. This paper explains how we can at least come close to this goal using the scientific method. Besides the traditional role of experiments as a source of preliminary ideas for theoretical analysis, experiments can test falsifiable hypotheses obtained by incomplete theoretical analysis. Asymptotic behavior can also be deduced from stronger hypotheses which have been induced from experiments. As long as a complete mathematical analysis is impossible, well-tested hypotheses may have to take their place. Several ...
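The step of inducing a hypothesis from measurements can be made concrete with a small sketch (a hypothetical example, not taken from the paper): count the comparisons made by plain insertion sort at several input sizes, then fit the slope of log T against log n to estimate a power-law exponent.

```python
import math
import random

def insertion_sort_comparisons(a):
    """Count the comparisons made by plain insertion sort -- the
    quantity whose asymptotic growth we want to hypothesize about."""
    a = list(a)
    count = 0
    for i in range(1, len(a)):
        x, j = a[i], i
        while j > 0:
            count += 1                 # one comparison of a[j-1] with x
            if a[j - 1] > x:
                a[j] = a[j - 1]
                j -= 1
            else:
                break
        a[j] = x
    return count

def fit_exponent(samples):
    """Least-squares slope of log(T) versus log(n): if T(n) ~ c * n^e,
    this slope estimates the exponent e."""
    xs = [math.log(n) for n, _ in samples]
    ys = [math.log(t) for _, t in samples]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

random.seed(0)
samples = [(n, insertion_sort_comparisons(random.sample(range(n), n)))
           for n in (100, 200, 400, 800, 1600)]
fit_exponent(samples)   # close to 2, suggesting the hypothesis T(n) = Theta(n^2)
```

Such a fitted exponent is exactly the kind of induced hypothesis the paper argues must then be tested, not taken as a proof.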
Enhanced Shell Sorting Algorithm
 Computer Journal of Enformatika
, 2007
"... Abstract—Many algorithms are available for sorting unordered elements. The most important of them are Bubble sort, Heap sort, Insertion sort and Shell sort. These algorithms have their own pros and cons. Shell Sort, which is an enhanced version of insertion sort, reduces the number of swaps of the el ..."
Abstract

Cited by 5 (0 self)
Abstract—Many algorithms are available for sorting unordered elements. The most important of them are Bubble sort, Heap sort, Insertion sort and Shell sort. These algorithms have their own pros and cons. Shell Sort, which is an enhanced version of insertion sort, reduces the number of swaps of the elements being sorted to minimize the complexity and time as compared to insertion sort. Shell sort improves the efficiency of insertion sort by quickly shifting values to their destination. Average sort time is O(n^1.25), while worst-case time is O(n^1.5). It performs certain iterations; in each iteration it swaps some elements of the array in such a way that in the last iteration, when the value of h is one, the number of swaps will be reduced. Donald L. Shell invented a formula to calculate the value of ‘h’. This work focuses on identifying some improvements in the conventional Shell sort algorithm. The “Enhanced Shell Sort algorithm” is an improvement in the algorithm for calculating the value of ‘h’. It has been observed that by applying this algorithm, the number of swaps can be reduced by up to 60 percent as compared to the existing algorithm. In some other cases this enhancement was found to be faster than the existing algorithms available. Keywords—Algorithm, Computation, Shell, Sorting.
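The abstract does not reproduce the enhanced formula for ‘h’; for context, the conventional scheme it improves on is Shell's original formula, which halves the increment on each pass down to h = 1. A minimal sketch (illustrative only, not the paper's enhancement):

```python
def shell_gaps(n):
    """Shell's original formula for the increment 'h': start at n // 2
    and halve until the final pass uses h = 1.  (The paper's enhanced
    formula for 'h' is not given in the abstract and is not shown here.)"""
    gaps = []
    h = n // 2
    while h > 0:
        gaps.append(h)
        h //= 2
    return gaps

shell_gaps(100)   # [50, 25, 12, 6, 3, 1]
```

The claimed 60 percent reduction in swaps comes from replacing this gap formula, while the pass structure of Shell sort itself is unchanged.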
Kolmogorov Complexity and a Triangle Problem of the Heilbronn Type
"... From among the (n choose 3) triangles with vertices chosen from among n points in the unit square, U, let T be the one with the smallest area, and let A be the area of T. If the n points are chosen independently and at random (uniform distribution) then there exist positive c and C such tha ..."
Abstract

Cited by 1 (0 self)
From among the (n choose 3) triangles with vertices chosen from among n points in the unit square, U, let T be the one with the smallest area, and let A be the area of T. If the n points are chosen independently and at random (uniform distribution) then there exist positive c and C such that c/n^3 < μ_n < C/n^3 for all large enough n, where μ_n is the expectation of A. Moreover, with probability close to one, c/n^3 < A < C/n^3. Our proof uses the incompressibility method based on Kolmogorov complexity. The related Heilbronn problem asks for the maximum value assumed by A over all choices of n points. 1 Introduction From among the (n choose 3) triangles with vertices chosen from among n points in the unit circle, let T be the one of least area, and let A be the area of T. Let Δ_n be the maximum assumed by A over all choices of n points. H.A. Heilbronn (1908–1975) asked for the exact value or an approximation of Δ_n. The list [1, 2, 3, 5, 8, 9, 10, 11,...
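A tiny brute-force simulation (illustrative only, not from the paper) makes the quantity A concrete for random points in the unit square:

```python
import random
from itertools import combinations

def min_triangle_area(points):
    """Area A of the smallest of the (n choose 3) triangles with
    vertices among the given points, via the cross-product (shoelace)
    formula.  Brute force: fine for small n, illustrative only."""
    best = float("inf")
    for (x1, y1), (x2, y2), (x3, y3) in combinations(points, 3):
        area = abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)) / 2
        best = min(best, area)
    return best

random.seed(1)
pts = [(random.random(), random.random()) for _ in range(20)]
min_triangle_area(pts)   # typically tiny: the expectation mu_n is Theta(1/n^3)
```

Averaging this over many random point sets would estimate μ_n, the quantity bounded above by c/n^3 and C/n^3.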
5. Using Finite Experiments to Study Asymptotic Performance
"... In the analysis of algorithms we are interested in obtaining closed-form expressions for algorithmic complexity, or at least asymptotic expressions in O(·)-notation. It is often possible to use experimental results to make significant progress towards this goal, although there are fundamental reason ..."
Abstract
In the analysis of algorithms we are interested in obtaining closed-form expressions for algorithmic complexity, or at least asymptotic expressions in O(·)-notation. It is often possible to use experimental results to make significant progress towards this goal, although there are fundamental reasons why we cannot guarantee to obtain such expressions from experiments alone. This paper investigates two approaches relating to the problem of developing theoretical analyses based on experimental data. We first consider the scientific method, which views experimentation as part of a cycle alternating with theoretical analysis. This approach has been very successful in the natural sciences. Besides supplying preliminary ideas for theoretical analysis, experiments can test falsifiable hypotheses obtained by incomplete theoretical analysis. Asymptotic behavior can also sometimes be deduced from stronger hypotheses which have been induced from experiments. As long as complete mathematical analyses remain elusive, well-tested hypotheses may have to take their place. Several examples ...