Results 1–10 of 15
A Survey of Adaptive Sorting Algorithms
, 1992
"... Introduction and Survey; F.2.2 [Analysis of Algorithms and Problem Complexity]: Nonnumerical Algorithms and Problems  Sorting and Searching; E.5 [Data]: Files  Sorting/searching; G.3 [Mathematics of Computing]: Probability and Statistics  Probabilistic algorithms; E.2 [Data Storage Represe ..."
Abstract

Cited by 65 (3 self)
Introduction and Survey; F.2.2 [Analysis of Algorithms and Problem Complexity]: Nonnumerical Algorithms and Problems – Sorting and Searching; E.5 [Data]: Files – Sorting/searching; G.3 [Mathematics of Computing]: Probability and Statistics – Probabilistic algorithms; E.2 [Data Storage Representation]: Composite structures, linked representations. General Terms: Algorithms, Theory. Additional Key Words and Phrases: Adaptive sorting algorithms, Comparison trees, Measures of disorder, Nearly sorted sequences, Randomized algorithms. Contents: Introduction (I.1 Optimal adaptivity; I.2 Measures of disorder; I.3 Organization of the paper); 1. Worst-Case Adaptive (Internal) Sorting Algorithms (1.1 Generic Sort; 1.2 Cook-Kim division; 1.3 Partition Sort; 1.4 Exponential Search; 1.5 Adaptive Merging); 2. Expected-Case Adaptiv...
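The survey's central objects are measures of disorder, the most common being Inv, the number of inversions. As a concrete illustration (a sketch, not code from the survey), Inv can be computed in O(n log n) time with a merge-sort pass:

```python
def count_inversions(seq):
    """Count Inv(X): the number of pairs (i, j) with i < j and seq[i] > seq[j].

    Piggybacks on merge sort, so it runs in O(n log n) instead of the
    naive O(n^2) pairwise scan.
    """
    def sort_count(a):
        if len(a) <= 1:
            return a, 0
        mid = len(a) // 2
        left, inv_l = sort_count(a[:mid])
        right, inv_r = sort_count(a[mid:])
        merged, i, j, inv = [], 0, 0, inv_l + inv_r
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
                inv += len(left) - i  # every remaining left element exceeds right[j]
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged, inv

    return sort_count(list(seq))[1]
```

A sorted sequence has Inv = 0 and a reversed n-sequence has Inv = n(n-1)/2, the two extremes an adaptive algorithm interpolates between.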
A New Framework for the Valuation of Algorithms for Black-Box Optimization
, 2001
"... Blackbox optimization algorithms cannot use the specific parameters of the problem instance, i.e., of the fitness function f. Their run time is measured as the number of fevaluations. This implies that the usual algorithmic complexity of a problem cannot be used in the blackbox scenario. Therefor ..."
Abstract

Cited by 26 (12 self)
Black-box optimization algorithms cannot use the specific parameters of the problem instance, i.e., of the fitness function f. Their run time is measured as the number of f-evaluations. This implies that the usual algorithmic complexity of a problem cannot be used in the black-box scenario. Therefore, a new framework for the valuation of algorithms for black-box optimization is presented, allowing the notion of the black-box complexity of a problem. For several problems, upper and lower bounds on their black-box complexity are presented. Moreover, it can be concluded that randomized search heuristics whose (worst-case) expected optimization time for some problem is close to the black-box complexity of the problem are provably efficient (in the black-box scenario). The new approach is applied to several problems based on typical example functions and further interesting problems. Run times of general EAs (evolutionary algorithms) for these problems are compared with the black-box complexity of the problem.
A framework for adaptive algorithm selection in STAPL
 In Proc. ACM SIGPLAN Symp. on Principles and Practice of Parallel Programming (PPoPP), pp. 277–288
, 2005
"... Writing portable programs that perform well on multiple platforms or for varying input sizes and types can be very difficult because performance is often sensitive to the system architecture, the runtime environment, and input data characteristics. This is even more challenging on parallel and distr ..."
Abstract

Cited by 21 (5 self)
Writing portable programs that perform well on multiple platforms or for varying input sizes and types can be very difficult because performance is often sensitive to the system architecture, the runtime environment, and input data characteristics. This is even more challenging on parallel and distributed systems due to the wide variety of system architectures. One way to address this problem is to adaptively select the best parallel algorithm for the current input data and system from a set of functionally equivalent algorithmic options. Toward this goal, we have developed a general framework for adaptive algorithm selection for use in the Standard Template Adaptive Parallel Library (STAPL). Our framework uses machine learning techniques to analyze data collected by STAPL installation benchmarks and to determine tests that will select among algorithmic options at runtime. We apply a prototype implementation of our framework to two important parallel operations, sorting and matrix multiplication, on multiple platforms and show that the framework determines runtime tests that correctly select the best performing algorithm from among several competing algorithmic options in 86–100% of the cases studied, depending on the operation and the system.
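The install-time step of such a framework can be sketched in miniature: time each functionally equivalent candidate on representative inputs and keep the winner. The function names below are hypothetical, and STAPL fits learned decision models to the benchmark data rather than taking a raw argmin over timings; this only conveys the spirit.

```python
import random
import time

def insertion_sort(a):
    """Quadratic sort: competitive only on tiny or nearly sorted inputs."""
    for i in range(1, len(a)):
        x, j = a[i], i - 1
        while j >= 0 and a[j] > x:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = x
    return a

def pick_fastest(candidates, training_inputs):
    """Benchmark every candidate on the same training inputs and return the
    one with the lowest total wall-clock time (an install-time decision)."""
    def total_time(algo):
        t0 = time.perf_counter()
        for data in training_inputs:
            algo(list(data))  # copy, so every algorithm sees identical input
        return time.perf_counter() - t0
    return min(candidates, key=total_time)

random.seed(42)
training = [[random.random() for _ in range(1000)] for _ in range(5)]
best = pick_fastest([insertion_sort, sorted], training)
```

On inputs this size the library's Timsort (`sorted`) wins easily; a real framework would also learn the crossover point below which the quadratic algorithm is preferable.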
On the adaptiveness of quicksort
 In Workshop on Algorithm Engineering & Experiments (ALENEX), SIAM
, 2005
"... Quicksort was first introduced in 1961 by Hoare. Many variants have been developed, the best of which are among the fastest generic sorting algorithms available, as testified by the choice of Quicksort as the default sorting algorithm in most programming libraries. Some sorting algorithms are adapti ..."
Abstract

Cited by 8 (1 self)
Quicksort was first introduced in 1961 by Hoare. Many variants have been developed, the best of which are among the fastest generic sorting algorithms available, as testified by the choice of Quicksort as the default sorting algorithm in most programming libraries. Some sorting algorithms are adaptive, i.e., they have a complexity analysis which is better for inputs that are nearly sorted, according to some specified measure of presortedness. Quicksort is not among these, as it uses Ω(n log n) comparisons even when the input is already sorted. However, in this paper we demonstrate empirically that the actual running time of Quicksort is adaptive with respect to the presortedness measure Inv. Differences close to a factor of two are observed between instances with low and high Inv value. We then show that for the randomized version of Quicksort, the number of element swaps performed is provably adaptive with respect to the measure Inv. More precisely, we prove that randomized Quicksort performs expected O(n(1 + log(1 + Inv/n))) element swaps, where Inv denotes the number of inversions in the input sequence. This result provides a theoretical explanation for the observed behavior, and gives new insights into the behavior of the Quicksort algorithm. We also give some empirical results on the adaptive behavior of Heapsort and Mergesort.
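The quantity the paper studies is easy to observe experimentally. The sketch below (not the authors' code) counts element swaps in a randomized Lomuto-partition quicksort and compares a zero-inversion input against a heavily shuffled one:

```python
import random

def quicksort_swaps(a):
    """Randomized in-place quicksort (Lomuto partition) returning the number
    of element swaps; the paper bounds their expectation by
    O(n(1 + log(1 + Inv/n)))."""
    swaps = 0

    def partition(lo, hi):
        nonlocal swaps
        p = random.randint(lo, hi)
        a[p], a[hi] = a[hi], a[p]      # move the random pivot out of the way
        pivot, i = a[hi], lo
        for j in range(lo, hi):
            if a[j] < pivot:
                if i != j:
                    a[i], a[j] = a[j], a[i]
                    swaps += 1
                i += 1
        a[i], a[hi] = a[hi], a[i]      # place the pivot at its final position
        swaps += 1
        return i

    def qs(lo, hi):
        while lo < hi:                 # loop on the right side to limit recursion
            m = partition(lo, hi)
            qs(lo, m - 1)
            lo = m + 1

    qs(0, len(a) - 1)
    return swaps

random.seed(0)
nearly_sorted = list(range(2000))             # Inv = 0
shuffled = random.sample(range(2000), 2000)   # Inv = Theta(n^2) on average
low = quicksort_swaps(nearly_sorted)
high = quicksort_swaps(shuffled)
```

On the sorted input the swap count stays linear in n, while the shuffled input incurs Θ(n log n) swaps, matching the adaptive bound.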
Sorting and/by Merging Finger Trees
 In Algorithms and Computation: Third International Symposium, ISAAC ’92
, 1992
"... : We describe a sorting algorithm that is optimally adaptive with respect to several important measures of presortedness. In particular, the algorithm requires O(n+k log k) time on nsequences X that have a longest ascending subsequence of length n \Gamma k and for which Rem(X) = k; O(n log(k=n)) ti ..."
Abstract

Cited by 6 (0 self)
We describe a sorting algorithm that is optimally adaptive with respect to several important measures of presortedness. In particular, the algorithm requires O(n + k log k) time on n-sequences X that have a longest ascending subsequence of length n − k and for which Rem(X) = k; O(n log(k/n)) time on sequences with k inversions; and O(n log k) time on sequences that can be decomposed into k monotone shuffles. The algorithm makes use of an adaptive merging operation that can be implemented using finger search trees.
1 Introduction
An adaptive algorithm is one which requires fewer resources to solve 'easy' problem instances than it does to solve 'hard' ones. For sorting, an adaptive algorithm should run in O(n) time if presented with a sorted n-sequence, and in O(n log n) time for all n-sequences, with the time for any particular sequence depending upon the 'nearness' of the sequence to being sorted. Mannila [7] established the notion of a measure of presortedness to quantify the disord...
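The adaptive merging operation mentioned above rests on exponential (doubling) search: when a long block of one input belongs before the head of the other, it is located in logarithmic time and copied wholesale. A simplified out-of-place sketch, standing in for the finger-search-tree implementation, looks like:

```python
from bisect import bisect_right

def gallop(a, x, lo):
    """Exponential search: smallest index >= lo such that a[index] > x.
    Doubles the step to bracket the boundary, then binary-searches inside."""
    step, hi = 1, lo
    while hi < len(a) and a[hi] <= x:
        lo, hi = hi, min(hi + step, len(a))
        step *= 2
    return bisect_right(a, x, lo, hi)

def adaptive_merge(a, b):
    """Merge two sorted lists, finding each maximal block with exponential
    search and copying it as one slice. With k interleaved blocks the
    comparison cost drops toward O(k log(n/k)) instead of Theta(n)."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            nxt = gallop(a, b[j], i)   # prefix of a with values <= b[j]
            out.extend(a[i:nxt])
            i = nxt
        else:
            nxt = gallop(b, a[i], j)   # prefix of b with values <= a[i]
            out.extend(b[j:nxt])
            j = nxt
    out.extend(a[i:])
    out.extend(b[j:])
    return out
```

The same galloping idea is what makes Timsort's merges adaptive; the finger-tree version additionally keeps the output in a balanced search structure.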
Deterministic algorithm for the t-threshold set problem
 Lecture Notes in Computer Science
, 2003
"... Abstract. Given k sorted arrays, the tThreshold problem, which is motivated by indexed search engines, consists of finding the elements which are present in at least t of the arrays. We present a new deterministic algorithm for it and prove that, asymptotically in the sizes of the arrays, it is opt ..."
Abstract

Cited by 4 (4 self)
Given k sorted arrays, the t-Threshold problem, which is motivated by indexed search engines, consists of finding the elements which are present in at least t of the arrays. We present a new deterministic algorithm for it and prove that, asymptotically in the sizes of the arrays, it is optimal in the alternation model used to study adaptive algorithms. We define the Opt-Threshold problem as finding the smallest non-empty t-threshold set, which is equivalent to finding the largest t such that the t-threshold set is non-empty, and propose a naive algorithm to solve it.
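Naive baselines for both problems are easy to state (a sketch, not the paper's adaptive alternation-model algorithm): count, for every element, how many of the arrays contain it.

```python
from collections import Counter

def t_threshold(arrays, t):
    """Naive t-Threshold: elements present in at least t of the k sorted
    arrays, returned in sorted order. Scans every array in full, so it is
    linear in the total input size with no adaptive guarantee."""
    counts = Counter()
    for arr in arrays:
        for x in set(arr):          # count each array at most once per element
            counts[x] += 1
    return sorted(x for x, c in counts.items() if c >= t)

def opt_threshold(arrays):
    """Largest t whose t-threshold set is non-empty (the Opt-Threshold
    problem): simply the maximum multiplicity over all elements."""
    counts = Counter()
    for arr in arrays:
        for x in set(arr):
            counts[x] += 1
    return max(counts.values(), default=0)
```

With t = k this degenerates to the classic intersection of k sorted arrays, the setting where adaptive algorithms were first studied.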
Presorting Algorithms: An AverageCase Point of View
, 1998
"... We introduce the concept of presorting algorithms, quantifying and evaluating the performance of such algorithms with the average reduction in number of inversions. Stages of wellknown algorithms such as Shellsort and quicksort are evaluated in such a framework and shown to cause a meaning drop in ..."
Abstract

Cited by 3 (1 self)
We introduce the concept of presorting algorithms, quantifying and evaluating the performance of such algorithms by the average reduction in the number of inversions. Stages of well-known algorithms such as Shellsort and Quicksort are evaluated in this framework and shown to cause a meaningful drop in the inversion statistic. The expected value, variance, and generating function for the decrease in the number of inversions are computed. The possibility of "presorting" a sorting algorithm is also investigated under a similar framework.
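The drop in the inversion statistic caused by a single stage is easy to witness: run one Shellsort gap pass and count inversions before and after (a small illustrative experiment, not the paper's average-case analysis).

```python
import random

def inversions(a):
    """O(n^2) inversion count; adequate for a small demonstration."""
    return sum(a[i] > a[j] for i in range(len(a)) for j in range(i + 1, len(a)))

def shell_pass(a, gap):
    """One gap-insertion pass of Shellsort, viewed here as a presorting step.
    Every shift it performs removes at least one inversion, so a pass can
    never increase the inversion count."""
    a = list(a)
    for i in range(gap, len(a)):
        x, j = a[i], i
        while j >= gap and a[j - gap] > x:
            a[j] = a[j - gap]
            j -= gap
        a[j] = x
    return a

random.seed(1)
data = [random.randint(0, 999) for _ in range(200)]
presorted = shell_pass(data, 5)
```

A gap of 1 is a full insertion sort, taking the statistic all the way to zero; intermediate gaps are the "presorting" stages the paper evaluates.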
LRM-Trees: Compressed Indices, Adaptive Sorting, and Compressed Permutations
"... Abstract. LRMTrees are an elegant way to partition a sequence of values into sorted consecutive blocks, and to express the relative position of the first element of each block within a previous block. They were used to encode ordinal trees and to index integer arrays in order to support range minim ..."
Abstract

Cited by 3 (1 self)
LRM-Trees are an elegant way to partition a sequence of values into sorted consecutive blocks, and to express the relative position of the first element of each block within a previous block. They were used to encode ordinal trees and to index integer arrays in order to support range minimum queries on them. We describe how they yield many other convenient results in a variety of areas: compressed succinct indices for range minimum queries on partially sorted arrays; a new adaptive sorting algorithm; and a compressed succinct data structure for permutations supporting direct and inverse application in time inversely proportional to the permutation's compressibility.
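The first step LRM-Trees build on, partitioning a sequence into maximal sorted consecutive blocks, can be sketched as follows. This is only the block-partition step; the tree itself additionally records where each block's first element falls inside an earlier block.

```python
def sorted_blocks(seq):
    """Partition seq into maximal non-decreasing consecutive blocks, the run
    decomposition that underlies runs-adaptive sorting: a sequence with few
    blocks is cheap to sort by merging them."""
    if not seq:
        return []
    blocks, start = [], 0
    for i in range(1, len(seq)):
        if seq[i] < seq[i - 1]:     # a descent ends the current block
            blocks.append(seq[start:i])
            start = i
    blocks.append(seq[start:])
    return blocks
```

A sorted input yields a single block and a reversed input yields n singleton blocks, the two extremes of this presortedness measure.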
Sorting a Low-Entropy Sequence
, 2005
"... We give the first sorting algorithm with bounds in terms of higherorder entropies: let S be a sequence of length m containing n distinct elements and let H # (S) be the #thorder empirical entropy of S, log n # O(m); our algorithm sorts S using (H # (S) + O(1))m comparisons. ..."
Abstract

Cited by 1 (1 self)
We give the first sorting algorithm with bounds in terms of higher-order entropies: let S be a sequence of length m containing n distinct elements, and let H_k(S) be the k-th-order empirical entropy of S, with log n ∈ O(m); our algorithm sorts S using (H_k(S) + O(1))m comparisons.