Results 1 – 10 of 12
On the Performance of WEAKHEAPSORT
2000
"... . Dutton #1993# presents a further HEAPSORT variant called WEAKHEAPSORT, which also contains a new data structure for priority queues. The sorting algorithm and the underlying data structure are analyzed showing that WEAKHEAPSORT is the best HEAPSORT variant and that it has a lot of nice propert ..."
Abstract

Cited by 8 (3 self)
Dutton (1993) presents a further HEAPSORT variant called WEAKHEAPSORT, which also contains a new data structure for priority queues. The sorting algorithm and the underlying data structure are analyzed, showing that WEAKHEAPSORT is the best HEAPSORT variant and that it has a lot of nice properties. It is shown that the worst-case number of comparisons is n⌈log n⌉ − 2^⌈log n⌉ + n − ⌈log n⌉ ≤ n log n + 0.1n, and that weak heaps can be generated with n − 1 comparisons. A double-ended priority queue based on weak heaps can be generated in n + ⌈n/2⌉ − 2 comparisons. Moreover, examples for the worst and the best case of WEAKHEAPSORT are presented, the number of weak heaps on {1, ..., n} is determined, and experiments on the average case are reported.
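As a rough illustration of the mechanics summarized in this abstract, here is a minimal Python sketch of a WEAKHEAPSORT-style sort: an element array `a` paired with an array `r` of reverse bits. The helper names and index conventions are our own, not Dutton's original formulation, but the construction phase performs exactly n − 1 comparisons, matching the bound quoted above.

```python
def weak_heap_sort(a):
    """Sort a in place (ascending) using a max weak heap: array a plus reverse bits r."""
    n = len(a)
    if n < 2:
        return a
    r = [0] * n  # r[i] = 1 means the two subtrees of node i are conceptually swapped

    def d_ancestor(j):
        # climb while j is a "left" child; its distinguished ancestor lies higher up
        while (j & 1) == r[j >> 1]:
            j >>= 1
        return j >> 1

    def join(i, j):
        # one comparison: enforce a[i] >= a[j], where i is j's distinguished ancestor
        if a[j] > a[i]:
            a[i], a[j] = a[j], a[i]
            r[j] ^= 1

    # construction: exactly n - 1 join calls, i.e. n - 1 comparisons
    for j in range(n - 1, 0, -1):
        join(d_ancestor(j), j)

    # selection: repeatedly move the current maximum (at a[0]) to its final slot
    for m in range(n - 1, 1, -1):
        a[0], a[m] = a[m], a[0]
        x = 1
        while 2 * x + r[x] < m:   # descend the special path (no comparisons)
            x = 2 * x + r[x]
        while x > 0:              # merge the path back into the root
            join(0, x)
            x >>= 1
    a[0], a[1] = a[1], a[0]       # the weak-heap order left a[0] >= a[1]; swap for ascending
    return a
```

The selection phase spends its comparisons only on the upward merge pass, which is the source of the low comparison counts analyzed in the paper.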
Relaxed weak queues: an alternative to run-relaxed heaps
2005
"... Abstract. A simplification of a runrelaxed heap, called a relaxed weak queue, is presented. This new priorityqueue implementation supports all operations as efficiently as the original: findmin, insert, and decrease (also called decreasekey) in O(1) worstcase time, and delete in O(lg n) worstc ..."
Abstract

Cited by 6 (5 self)
Abstract. A simplification of a run-relaxed heap, called a relaxed weak queue, is presented. This new priority-queue implementation supports all operations as efficiently as the original: find-min, insert, and decrease (also called decrease-key) in O(1) worst-case time, and delete in O(lg n) worst-case time, n denoting the number of elements stored prior to the operation. These time bounds are valid on a pointer machine as well as on a random-access machine. A relaxed weak queue is a collection of at most ⌊lg n⌋ + 1 perfect weak heaps, where there are in total at most ⌊lg n⌋ + 1 nodes that may violate weak-heap order. In a pointer-based representation of a perfect weak heap, which is a binary tree, it is enough to use two pointers per node to record parent-child relationships. Due to decrease, each node must store one additional pointer. The auxiliary data structures maintained to keep track of perfect weak heaps and potential violation nodes only require O(lg n) words of storage. That is, excluding the space used by the elements themselves, the total space usage of a relaxed weak queue can be as low as 3n + O(lg n) words. ACM CCS Categories and Subject Descriptors. E.1 [Data Structures]: Lists, stacks, and queues; E.2 [Data Storage Representations]: Linked representations
Implementing HEAPSORT with n log n − 0.9n and QUICKSORT with n log n + 0.2n Comparisons
ACM Journal of Experimental Algorithms, 2002
"... With refinements to the WEAKHEAPSORT... ..."
Pushing the Limits in Sequential Sorting
Proceedings of the 4th International Workshop on Algorithm Engineering (WAE 2000)
"... With refinements to the WEAKHEAPSORT algorithm we establish the general and practical relevant sequential sorting algorithm RELAXEDWEAKHEAPSORT executing exactly ndlog ne#2 dlog ne + 1 # n log n # 0:9n comparisons on any given input. The number of transpositions is bounded by n plus the number of ..."
Abstract

Cited by 2 (0 self)
With refinements to the WEAKHEAPSORT algorithm we establish the general and practically relevant sequential sorting algorithm RELAXEDWEAKHEAPSORT, executing exactly n⌈log n⌉ − 2^⌈log n⌉ + 1 ≤ n log n − 0.9n comparisons on any given input. The number of transpositions is bounded by n plus the number of comparisons. Experiments show that RELAXEDWEAKHEAPSORT only requires O(n) extra bits. Even if this space is not available, with QUICKWEAKHEAPSORT we propose an efficient QUICKSORT variant with n log n + 0.2n + o(n) comparisons on the average. Furthermore, we present data showing that WEAKHEAPSORT, RELAXEDWEAKHEAPSORT and QUICKWEAKHEAPSORT beat other performant QUICKSORT and HEAPSORT variants even for moderate values of n.
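The exact comparison count quoted in this abstract can be checked against its closed-form bound numerically. The snippet below is our own sanity check, not code from the paper; note the inequality only takes hold once n is moderately large (for very small n the exact count can exceed n log n − 0.9n), so we test from n = 100 upward.

```python
import math

def relaxed_whs_comparisons(n):
    """Exact comparison count claimed for RELAXEDWEAKHEAPSORT: n*ceil(lg n) - 2^ceil(lg n) + 1."""
    k = math.ceil(math.log2(n))
    return n * k - 2 ** k + 1

# the closed form n log n - 0.9n dominates the exact count for moderately large n
for n in range(100, 20_000):
    assert relaxed_whs_comparisons(n) <= n * math.log2(n) - 0.9 * n
```

The gap between the exact count and n log n − 0.9n is smallest when n is about 1.386 · 2^k, which is why a purely asymptotic reading of the −0.9n term can be misleading for small inputs.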
Rank-Relaxed Weak Queues: Faster than Pairing and Fibonacci Heaps?
2009
"... A runrelaxed weak queue by Elmasry et al. (2005) is a priority queue data structure with insert and decreasekey in O(1) as well as delete and deletemin in O(log n) worstcase time. One further advantage is the small space consumption of 3n + O(log n) pointers. In this paper we propose rankrelaxe ..."
Abstract

Cited by 1 (1 self)
A run-relaxed weak queue by Elmasry et al. (2005) is a priority queue data structure with insert and decrease-key in O(1) as well as delete and delete-min in O(log n) worst-case time. One further advantage is the small space consumption of 3n + O(log n) pointers. In this paper we propose rank-relaxed weak queues, reducing the number of rank-violation nodes for each level to a constant, while providing amortized constant time for decrease-key. Compared to run-relaxed weak queues, the new structure additionally gains one pointer per node. An empirical evaluation shows that the implementation can outperform Fibonacci and pairing heaps in practice even on rather simple data types.
A New Data Structure for Heapsort with Improved Number of Comparisons (Extended Abstract)
"... Abstract. In this paper we present a new data structure for implementing heapsort algorithm for pairs of which can be simultaneously stored and processed in a single register. Since time complexity of Carlsson type variants of heapsort has already achieved a leading coefficient of 1, concretely nlg ..."
Abstract

Cited by 1 (0 self)
Abstract. In this paper we present a new data structure for implementing the heapsort algorithm for pairs of elements which can be simultaneously stored and processed in a single register. Since the time complexity of Carlsson-type variants of heapsort has already achieved a leading coefficient of 1, concretely n lg n + n lg lg n, and lower-bound theory asserts that no comparison-based in-place sorting algorithm can sort n data in fewer than ⌈lg(n!)⌉ ≈ n lg n − 1.44n comparisons on the average, any improvement in the number of comparisons can only be achieved in the lower-order terms. Our new data structure results in an improvement in the linear term of the time complexity function irrespective of the variant of the heapsort algorithm used. This improvement is important in the context that some variants of the heapsort algorithm, for example weak heapsort (although not in-place), are near-optimal, being away from the theoretical bound on the number of comparisons by only 1.54n.
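The information-theoretic bound cited in this abstract can be reproduced directly: lg(n!) expands via Stirling's approximation to n lg n − (lg e)·n + O(lg n), and lg e ≈ 1.4427 supplies the −1.44n linear term. A quick check of that arithmetic, using Python's `lgamma` to get ln(n!) (this is our own worked example, not the paper's):

```python
import math

def sort_lower_bound(n):
    """ceil(lg(n!)): worst-case comparisons needed by any comparison-based sort of n keys."""
    return math.ceil(math.lgamma(n + 1) / math.log(2))

n = 1000
stirling = n * math.log2(n) - n * math.log2(math.e)  # n lg n - 1.4427...n
# the exact bound exceeds the two-term Stirling estimate only by O(lg n)
assert 0 < sort_lower_bound(n) - stirling < 2 * math.log2(n)
```

For n = 1000 the exact bound is 8530 comparisons, while the two-term estimate gives about 8523, so the O(lg n) correction is already visible but small.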
Quickheapsort: Modifications and improved analysis
Lecture Notes in Computer Science, 2013
"... ar ..."
(Show Context)
Potentials and Limitations of Visual Methods for the Exploration of Complex Data Structures
2007
"... ..."
The Weak-Heap Data Structure: Variants and Applications
"... The weak heap is a priority queue that was introduced as a competitive structure for sorting. Its arraybased form supports the operations findmin in O(1) worstcase time, and insert and deletemin in O(lg n) worstcase time using at most dlg ne element comparisons. Additionally, its pointerbased ..."
Abstract
 Add to MetaCart
(Show Context)
The weak heap is a priority queue that was introduced as a competitive structure for sorting. Its array-based form supports the operations find-min in O(1) worst-case time, and insert and delete-min in O(lg n) worst-case time using at most ⌈lg n⌉ element comparisons. Additionally, its pointer-based form supports delete and decrease in O(lg n) worst-case time using at most ⌈lg n⌉ element comparisons. In this paper we enhance this data structure as follows: 1. We improve the array-based form to support insert in O(1) amortized time. The main idea is to temporarily store the inserted elements in a buffer and, once the buffer is full, to move its elements to the heap using an efficient bulk-insertion procedure. As an application, we use this variant in the implementation of adaptive heapsort. Accordingly, we guarantee, for several measures of disorder, that the formula expressing the number of element comparisons performed by the algorithm is optimal up to the constant factor of the high-order term. Unlike other previous constant ...
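A minimal array-based min weak heap along the lines described in this abstract might look as follows in Python. This is our own simplified sketch: it omits the paper's insertion buffer, so insert here costs O(lg n) comparisons rather than the O(1) amortized time the paper achieves; only delete-min matches the at most roughly ⌈lg n⌉-comparison behavior quoted above.

```python
class WeakHeap:
    """Array-based min weak heap: a[0] holds the minimum, r[j] marks swapped subtrees."""

    def __init__(self):
        self.a = []   # elements
        self.r = []   # one reverse bit per node

    def __len__(self):
        return len(self.a)

    def _d_ancestor(self, j):
        # climb while j is a "left" child of its parent
        while (j & 1) == self.r[j >> 1]:
            j >>= 1
        return j >> 1

    def find_min(self):
        return self.a[0]              # O(1)

    def insert(self, x):
        # sift up along distinguished ancestors: O(lg n) comparisons in this sketch
        j = len(self.a)
        self.a.append(x)
        self.r.append(0)
        while j > 0:
            i = self._d_ancestor(j)
            if self.a[i] <= self.a[j]:
                break
            self.a[i], self.a[j] = self.a[j], self.a[i]
            j = i

    def delete_min(self):
        # move the last element to the root, then merge the special path back up
        m = self.a[0]
        last = len(self.a) - 1
        self.a[0] = self.a[last]
        self.a.pop()
        self.r.pop()
        if last > 1:
            x = 1
            while 2 * x + self.r[x] < last:   # descend (comparison-free)
                x = 2 * x + self.r[x]
            while x > 0:                      # one comparison per node on the path
                if self.a[x] < self.a[0]:
                    self.a[0], self.a[x] = self.a[x], self.a[0]
                    self.r[x] ^= 1
                x >>= 1
        return m
```

Repeated delete-min calls on a populated heap extract the elements in sorted order, which is exactly how the structure doubles as a sorting device.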