Results 1–6 of 6
A Survey of Adaptive Sorting Algorithms
, 1992
Abstract

Cited by 65 (3 self)
Introduction and Survey; F.2.2 [Analysis of Algorithms and Problem Complexity]: Nonnumerical Algorithms and Problems – Sorting and Searching; E.5 [Data]: Files – Sorting/searching; G.3 [Mathematics of Computing]: Probability and Statistics – Probabilistic algorithms; E.2 [Data Storage Representation]: Composite structures, linked representations. General Terms: Algorithms, Theory. Additional Key Words and Phrases: Adaptive sorting algorithms, Comparison trees, Measures of disorder, Nearly sorted sequences, Randomized algorithms. CONTENTS: Introduction (I.1 Optimal adaptivity; I.2 Measures of disorder; I.3 Organization of the paper); 1. Worst-Case Adaptive (Internal) Sorting Algorithms (1.1 Generic Sort; 1.2 Cook-Kim division; 1.3 Partition Sort; 1.4 Exponential Search; 1.5 Adaptive Merging); 2. Expected-Case Adaptive ...
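The survey's "measures of disorder" can be made concrete with the classic inversion count Inv(X): the number of pairs that appear in the wrong order, which is zero for a sorted sequence and maximal for a reversed one. The following sketch counts inversions with a merge-sort pass in O(n log n); it is an illustrative example, not code from the survey, and the function name is my own.

```python
def count_inversions(seq):
    """Return (sorted copy of seq, number of inversions in seq)."""
    n = len(seq)
    if n <= 1:
        return list(seq), 0
    mid = n // 2
    left, inv_l = count_inversions(seq[:mid])
    right, inv_r = count_inversions(seq[mid:])
    merged, inv = [], inv_l + inv_r
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            # every remaining element of left is greater than right[j]:
            # each such pair is one inversion
            inv += len(left) - i
            merged.append(right[j]); j += 1
    merged.extend(left[i:]); merged.extend(right[j:])
    return merged, inv
```

An adaptive (inversion-sensitive) sort aims to run faster the smaller this count is relative to the worst case n(n-1)/2.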
Upper and Lower Bounds for Randomized Search Heuristics ...
 ELECTRONIC COLLOQUIUM ON COMPUTATIONAL COMPLEXITY (ECCC)
, 2004
Abstract

Cited by 42 (6 self)
Randomized search heuristics like local search, tabu search, simulated annealing or all kinds of evolutionary algorithms have many applications. However, for most problems the best worst-case expected run times are achieved by more problem-specific algorithms. This raises ...
A framework for speeding up priority-queue operations
, 2004
Abstract

Cited by 8 (8 self)
Abstract. We introduce a framework for reducing the number of element comparisons performed in priority-queue operations. In particular, we give a priority queue which guarantees the worst-case cost of O(1) per minimum finding and insertion, and the worst-case cost of O(log n) with at most log n + O(1) element comparisons per minimum deletion and deletion, improving the bound of 2 log n + O(1) on the number of element comparisons known for binomial queues. Here, n denotes the number of elements stored in the data structure prior to the operation in question, and log n equals max{1, log₂ n}. We also give a priority queue that provides, in addition to the above-mentioned methods, the priority-decrease (or decrease-key) method. This priority queue achieves the worst-case cost of O(1) per minimum finding, insertion, and priority decrease; and the worst-case cost of O(log n) with at most log n + O(log log n) element comparisons per minimum deletion and deletion. CR Classification. E.1 [Data Structures]: Lists, stacks, and queues; E.2 [Data ...
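To see the kind of baseline this abstract improves on, consider a plain binary heap instrumented to count element comparisons: its delete-min performs up to two comparisons per level (pick the smaller child, then compare it with the sifted element), i.e. roughly 2 log₂ n in total. This sketch is an illustrative baseline only, an assumption on my part, not the paper's framework or its log n + O(1) structure.

```python
class CountingHeap:
    """Minimal binary min-heap that counts element comparisons."""

    def __init__(self):
        self.a = []
        self.comparisons = 0

    def _less(self, x, y):
        self.comparisons += 1
        return x < y

    def insert(self, x):
        # sift up: at most one comparison per level
        a = self.a
        a.append(x)
        i = len(a) - 1
        while i > 0:
            p = (i - 1) // 2
            if self._less(a[i], a[p]):
                a[i], a[p] = a[p], a[i]
                i = p
            else:
                break

    def delete_min(self):
        # sift down: up to two comparisons per level, ~2 log2 n total
        a = self.a
        top = a[0]
        last = a.pop()
        if a:
            a[0] = last
            i = 0
            while True:
                l, r = 2 * i + 1, 2 * i + 2
                if l >= len(a):
                    break
                c = l
                if r < len(a) and self._less(a[r], a[l]):
                    c = r
                if self._less(a[c], a[i]):
                    a[i], a[c] = a[c], a[i]
                    i = c
                else:
                    break
        return top
```

Cutting the per-level work from two comparisons to one is, in spirit, the gap between the 2 log n + O(1) and log n + O(1) bounds the abstract discusses.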
Adaptive techniques to find optimal planar boxes
 In CCCG
, 2012
Abstract

Cited by 1 (1 self)
Given a set P of n planar points, two axes and a real-valued score function f() on subsets of P, the Optimal Planar Box problem consists in finding a box (i.e. an axis-aligned rectangle) H maximizing f(H ∩ P). We consider the case where f() is monotone decomposable, i.e. there exists a composition function g() monotone in its two arguments such that f(A) = g(f(A1), f(A2)) for every subset A ⊆ P and every partition {A1, A2} of A. In this context we propose a solution for the Optimal Planar Box problem which performs in the worst case O(n² lg n) score compositions and coordinate comparisons, and much less on other classes of instances defined by various measures of difficulty. A side result of its own interest is a fully dynamic MCS Splay tree data structure supporting insertions and deletions with the dynamic finger property, improving upon previous results [Cortés et al., J. Alg. 2009].
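The monotone-decomposable condition f(A) = g(f(A1), f(A2)) is easy to illustrate with the simplest possible score: f counts the points in a subset and g is addition, which is monotone in both arguments. The point set and the partition below are made-up examples, not data from the paper.

```python
def f(points):
    """Score of a subset: here, just the number of points."""
    return len(points)

def g(s1, s2):
    """Composition function, monotone in both arguments."""
    return s1 + s2

P = [(0, 0), (1, 2), (3, 1), (2, 4), (5, 5)]
A1, A2 = P[:2], P[2:]            # an arbitrary partition of A = P

# decomposability: the score of the whole equals the composition
# of the scores of the two parts
assert f(P) == g(f(A1), f(A2))
```

Other instances of the same pattern include f = sum of point weights with g = addition, or f = maximum weight with g = max.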
Abstract
We study the performance of the most practical inversion-sensitive internal sorting algorithms. Experimental results illustrate that adaptive AVL sort performs the fewest comparisons unless the number of inversions is less than 1%; in that case Splaysort performs the fewest comparisons. On the other hand, the running time of Quicksort is superior unless the number of inversions is less than 1.5%; in that case Splaysort has the shortest running time. Another interesting result is that although the cache-optimal Greedysort algorithm incurred the fewest cache misses among the adaptive sorting algorithms under investigation, it was outperformed by Quicksort.
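Experiments like these need inputs with a controlled number of inversions. One simple generator, an assumption on my part rather than the paper's actual setup, starts from a sorted array and swaps k disjoint adjacent pairs: each such swap contributes exactly one inversion, so the result has exactly k.

```python
import random

def nearly_sorted(n, k, seed=None):
    """Return a permutation of range(n) containing exactly k inversions."""
    assert 2 * k <= n, "need room for k disjoint adjacent pairs"
    rng = random.Random(seed)
    a = list(range(n))
    # pairs (0,1), (2,3), (4,5), ... are disjoint; pick k of them and
    # swap each, adding exactly one inversion per swap
    for p in rng.sample(range(n // 2), k):
        i = 2 * p
        a[i], a[i + 1] = a[i + 1], a[i]
    return a
```

Sweeping k from 0 up to a few percent of n(n-1)/2 would reproduce the kind of low-disorder regime where the abstract reports Splaysort pulling ahead.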
Upper and Lower Bounds for Randomized Search Heuristics in Black-Box Optimization
 Electronic Colloquium on Computational Complexity, Report No. 48 (2003)
Abstract
Randomized search heuristics like local search, tabu search, simulated annealing or all kinds of evolutionary algorithms have many applications. However, for most problems the best worst-case expected run times are achieved by more problem-specific algorithms. This raises the question about the limits of general randomized search heuristics. Here a framework called black-box optimization is developed. The essential point is that the problem, but not the problem instance, is known to the algorithm, which can collect information about the instance only by asking for the value of points in the search space. All known randomized search heuristics fit into this scenario. Lower bounds on the black-box complexity of problems are derived without complexity-theoretical assumptions and are compared to upper bounds in this scenario.
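A textbook example of a heuristic fitting this black-box model is the (1+1) evolutionary algorithm on OneMax: the algorithm sees the instance only through fitness queries, never the instance itself. The sketch below is a standard illustration with parameters of my own choosing, not an algorithm or experiment from the report.

```python
import random

def one_plus_one_ea(fitness, n, max_queries=100_000, seed=0):
    """Maximize fitness over {0,1}^n via black-box queries only.

    Returns (best bitstring found, number of fitness queries used).
    Assumes the optimum value is n, as for OneMax.
    """
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    fx, queries = fitness(x), 1
    while fx < n and queries < max_queries:
        # standard bit mutation: flip each bit independently with prob. 1/n
        y = [b ^ (rng.random() < 1 / n) for b in x]
        fy = fitness(y)
        queries += 1
        if fy >= fx:             # accept the offspring if it is no worse
            x, fx = y, fy
    return x, queries

onemax = sum                      # OneMax fitness: number of one-bits
best, used = one_plus_one_ea(onemax, 20, seed=42)
```

On OneMax the (1+1) EA needs Θ(n log n) queries in expectation, while the black-box complexity of the problem class is lower; gaps of exactly this kind are what the report's upper and lower bounds quantify.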