A Survey of Adaptive Sorting Algorithms, 1992
Cited by 65 (3 self)
Abstract:
Categories and Subject Descriptors: F.2.2 [Analysis of Algorithms and Problem Complexity]: Nonnumerical Algorithms and Problems - Sorting and Searching; E.5 [Data]: Files - Sorting/searching; G.3 [Mathematics of Computing]: Probability and Statistics - Probabilistic algorithms; E.2 [Data Storage Representation]: Composite structures, linked representations. General Terms: Algorithms, Theory. Additional Key Words and Phrases: Adaptive sorting algorithms, comparison trees, measures of disorder, nearly sorted sequences, randomized algorithms. Contents: Introduction (I.1 Optimal adaptivity; I.2 Measures of disorder; I.3 Organization of the paper); 1. Worst-case adaptive (internal) sorting algorithms (1.1 Generic Sort; 1.2 Cook-Kim division; 1.3 Partition Sort; 1.4 Exponential Search; 1.5 Adaptive Merging); 2. Expected-case adaptive ...
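The survey's central notions, measures of disorder such as Runs and Inv, are easy to make concrete. As an illustration only (the function names below are mine, not the survey's), two classic measures can be computed as:

```python
def runs(seq):
    """Runs(X): number of descents, i.e. boundaries between ascending runs.
    A sorted sequence has Runs = 0; a strictly decreasing one has len(seq) - 1."""
    return sum(1 for a, b in zip(seq, seq[1:]) if a > b)

def inv(seq):
    """Inv(X): number of inversions (pairs out of order), counted naively
    in O(n^2) time; a sorted sequence has Inv = 0."""
    n = len(seq)
    return sum(1 for i in range(n) for j in range(i + 1, n) if seq[i] > seq[j])
```

An algorithm is adaptive with respect to such a measure when its running time degrades gracefully as the measure grows, e.g. the well-known O(n(1 + log(1 + Inv/n))) bound for Inv-adaptive sorting.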
Space-efficient planar convex hull algorithms. Proc. Latin American Theoretical Informatics, 2002
Cited by 20 (1 self)
Abstract:
A space-efficient algorithm is one in which the output is given in the same location as the input and only a small amount of additional memory is used by the algorithm. We describe four space-efficient algorithms for computing the convex hull of a planar point set.
Asymptotically Efficient In-Place Merging. Theoretical Computer Science
Cited by 14 (3 self)
Abstract:
Two linear-time algorithms for in-place merging are presented. Both algorithms perform at most m(t+1) + n/2^t + o(m) comparisons, where m and n are the sizes of the input sequences, m ≤ n, and t = ⌊log₂(n/m)⌋. The first algorithm is for unstable merging and it carries out no more than 3(n+m) + o(m) element moves. The second algorithm is for stable merging and it accomplishes at most 5n + 12m + o(m) moves. Key words: in-place algorithms, merging, sorting. A preliminary and weaker version of this work appeared in Proceedings of the 20th Symposium on Mathematical Foundations of Computer Science, Lecture Notes in Computer Science 969, Springer-Verlag, Berlin/Heidelberg (1995), 211-220.
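To make the problem setting concrete, here is a deliberately naive in-place merge that inserts each out-of-place element by a one-step rotation. It is quadratic in the worst case, nowhere near the linear-time bounds claimed above, and purely illustrative (the function name is mine):

```python
def merge_in_place(a, mid):
    """Merge sorted a[:mid] and a[mid:] in place, one rotation at a time.
    O((m + n)^2) element moves in the worst case -- illustration only."""
    i, j = 0, mid
    while i < j < len(a):
        if a[i] <= a[j]:
            i += 1
        else:
            # rotate a[i:j+1] right by one, bringing a[j] down to position i
            # (the slice assignment briefly uses O(j - i) scratch space;
            # a loop of pairwise swaps would avoid even that)
            a[i:j + 1] = [a[j]] + a[i:j]
            i += 1
            j += 1
    return a
```

The papers in this listing exist precisely because doing better than this, linear moves with near-optimal comparisons and O(1) extra space, is hard.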
Practical In-Place Mergesort, 1996
Cited by 10 (3 self)
Abstract:
Two in-place variants of the classical mergesort algorithm are analysed in detail. The first, straightforward variant performs at most N log₂ N + O(N) comparisons and 3N log₂ N + O(N) moves to sort N elements. The second, more advanced variant requires at most N log₂ N + O(N) comparisons and εN log₂ N moves, for any fixed ε > 0 and any N > N(ε). In theory, the second one is superior to advanced versions of heapsort. In practice, due to the overhead in the index manipulation, our fastest in-place mergesort still runs about 50 per cent slower than bottom-up heapsort. However, our implementations are practical compared to mergesort algorithms based on in-place merging. Key words: sorting, mergesort, in-place algorithms. CR Classification: F.2.2
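For reference, the classical (not in-place) bottom-up mergesort that both variants start from performs close to N log₂ N comparisons. A sketch with a comparison counter (names and instrumentation are mine, not the paper's):

```python
def bottom_up_mergesort(a):
    """Classical bottom-up mergesort with a comparison counter.
    Uses O(N) scratch space per merge -- the in-place variants analysed
    in the paper remove exactly this buffer."""
    n, width, comparisons = len(a), 1, 0
    while width < n:
        for lo in range(0, n, 2 * width):
            mid = min(lo + width, n)
            hi = min(lo + 2 * width, n)
            left, right = a[lo:mid], a[mid:hi]   # the O(N) scratch buffer
            i = j = 0
            k = lo
            while i < len(left) and j < len(right):
                comparisons += 1
                if left[i] <= right[j]:
                    a[k] = left[i]; i += 1
                else:
                    a[k] = right[j]; j += 1
                k += 1
            a[k:hi] = left[i:] + right[j:]       # copy the leftover run
        width *= 2
    return a, comparisons
```

For N = 8 the comparison count stays below N log₂ N = 24 on any input.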
Fast Stable Merging And Sorting In Constant Extra Space, 1990
Cited by 8 (0 self)
Abstract:
In an earlier research paper [HL1], we presented a novel yet straightforward linear-time algorithm for merging two sorted lists in a fixed amount of additional space. Constant-of-proportionality estimates and empirical testing reveal that this procedure is reasonably competitive with merge routines free to squander unbounded additional memory, making it particularly attractive whenever space is a critical resource. In this paper, we devise a relatively simple strategy by which this efficient merge can be made stable, and extend our results in a nontrivial way to the problem of stable sorting by merging. We also derive upper bounds on our algorithms' constants of proportionality, suggesting that in some environments (most notably external file processing) their modest run-time premiums may be more than offset by the dramatic space savings achieved.
Sorting in-place with a worst case complexity of n log n − 1.3n + O(log n) comparisons and εn log n + O(1) transports. LNCS, 1992
Cited by 8 (0 self)
Abstract:
First we present a new variant of Mergesort which needs only 1.25n space, because it reuses space that becomes free within the current stage. It does not need more comparisons than classical Mergesort. The main result is an easy-to-implement method of iterating the procedure in-place, starting by sorting 4/5 of the elements. In this way the additional transport costs stay linear and only very few comparisons are lost, so that n log n − 0.8n comparisons are needed. We show that the number of comparisons can be improved if blocks of constant length are sorted with Merge-Insertion before starting the algorithm. Another improvement is to start the iteration with a better version, which needs only (1+ε)n space and again additional O(n) transports. The result is that this can be improved theoretically up to n log n − 1.3289n comparisons in the worst case, close to the theoretical lower bound of n log n − 1.443n. The total number of transports in all these versions can be reduced to εn log n + O(1) for any ε > 0.
Optimal in-place planar convex hull algorithms. Proceedings of Latin American Theoretical Informatics (LATIN 2002), volume 2286 of Lecture Notes in Computer Science, 2002
Cited by 5 (2 self)
Abstract:
An in-place algorithm is one in which the output is given in the same location as the input and only a small amount of additional memory is used by the algorithm. In this paper we describe three in-place algorithms for computing the convex hull of a planar point set. All three algorithms are optimal, some more so than others...
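The flavour of such algorithms can be sketched by a Graham-scan variant that keeps the partial hull in a prefix of the input array itself. This is only an illustration of the idea, not the paper's optimal algorithms (Python's list sort is not O(1)-space, and all names below are mine):

```python
def lower_hull_in_place(pts):
    """Monotone-chain lower hull, storing the hull in the prefix pts[:h].
    Discarded points end up in pts[h:]; returns h."""
    def cross(o, a, b):
        # z-component of (a - o) x (b - o); <= 0 means no strict left turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    pts.sort()  # lexicographic; note this uses more than O(1) extra space
    h = 0
    for i in range(len(pts)):
        p = pts[i]
        while h >= 2 and cross(pts[h - 2], pts[h - 1], p) <= 0:
            h -= 1  # pop a point that is not on the lower hull
        pts[h], pts[i] = pts[i], pts[h]  # move p into the hull prefix
        h += 1
    return h  # pts[:h] is the lower hull, left to right
```

The upper hull can be computed symmetrically on the remaining points, which is roughly how the in-place hull papers arrange the complete hull inside the one input array.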
A General-Purpose Parallel Sorting Algorithm, 1995
Cited by 3 (0 self)
Abstract:
A parallel sorting algorithm is presented for general-purpose internal sorting on MIMD machines. The algorithm initially sorts the elements within each node using a serial sorting algorithm, then proceeds with a two-phase parallel merge.
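The scheme, local sort per node followed by a merge of the sorted runs, can be sketched with threads standing in for MIMD nodes. The single k-way merge below replaces the paper's two-phase parallel merge, and all names are mine:

```python
from concurrent.futures import ThreadPoolExecutor
from heapq import merge

def parallel_sort(data, nodes=4):
    """Split data across 'nodes', sort each chunk independently (the serial
    phase), then k-way merge the sorted runs (a sequential stand-in for
    the two-phase parallel merge)."""
    step = max(1, -(-len(data) // nodes))  # ceiling division
    chunks = [data[i:i + step] for i in range(0, len(data), step)]
    with ThreadPoolExecutor(max_workers=nodes) as pool:
        runs = list(pool.map(sorted, chunks))
    return list(merge(*runs))
```

On a real MIMD machine the merge phase would itself be parallelised; here `heapq.merge` only demonstrates that sorted runs combine in linear total time.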
Stable minimum storage merging by symmetric comparisons. Algorithms - ESA 2004, Volume 3221 of Lecture Notes in Computer Science, 2004
Cited by 2 (2 self)
Abstract:
We introduce a new stable minimum storage algorithm for merging that needs O(m log(n/m + 1)) element comparisons, where m and n are the sizes of the input sequences with m ≤ n. According to the lower bound for merging, our algorithm is asymptotically optimal regarding the number of comparisons. The presented algorithm rearranges the elements to be merged by rotations, where the areas to be rotated are determined by a simple principle of symmetric comparisons. This style of minimum storage merging is novel and looks promising. Our algorithm has a short and transparent definition. Experimental work has shown that it is very efficient and so might be of high practical interest.
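Rotation-based minimum storage merging in general works by divide and conquer: pick a pivot, find its rank in the other sequence by binary search, rotate the two middle blocks past each other, and recurse on both halves. The sketch below follows that general style (close to classic merge-without-buffer routines), not the paper's sharper symmetric-comparison rule; names are mine:

```python
from bisect import bisect_left, bisect_right

def rotate(a, lo, mid, hi):
    # bring a[mid:hi] in front of a[lo:mid]; the slice assignment uses
    # temporary space for brevity -- a triple reversal would be O(1)-space
    a[lo:hi] = a[mid:hi] + a[lo:mid]

def merge_by_rotation(a, lo, mid, hi):
    """Stable divide-and-conquer merge of sorted a[lo:mid] and a[mid:hi]
    by block rotations."""
    n1, n2 = mid - lo, hi - mid
    if n1 == 0 or n2 == 0:
        return
    if n1 + n2 == 2:                       # base case: two elements
        if a[mid] < a[lo]:
            a[lo], a[mid] = a[mid], a[lo]
        return
    if n1 > n2:                            # pivot from the longer run,
        cut1 = lo + n1 // 2                # rank via binary search in the other
        cut2 = bisect_left(a, a[cut1], mid, hi)
    else:
        cut2 = mid + n2 // 2
        cut1 = bisect_right(a, a[cut2], lo, mid)  # bisect_right keeps stability
    rotate(a, cut1, mid, cut2)
    new_mid = cut1 + (cut2 - mid)
    merge_by_rotation(a, lo, cut1, new_mid)
    merge_by_rotation(a, new_mid, cut2, hi)
```

Each rotation boundary is chosen so everything left of `new_mid` is at most the pivot and everything right of it is at least the pivot, so the two recursive merges are independent.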