Results 1–10 of 12
Load Balancing, Selection and Sorting on the Hypercube
 In Proceedings of the 1989 ACM Symposium on Parallel Algorithms and Architectures
, 1989
Abstract

Cited by 26 (4 self)
This paper presents novel load balancing, selection and sorting algorithms for the hypercube with 1-port communication. The main result is an algorithm for sorting n values on p processors, SmoothSort, that runs asymptotically faster (in the worst case) than any previously known algorithm over a wide range of the ratio n/p. The load balancing and selection algorithms upon which SmoothSort is based are expected to be of independent interest. Although the analysis of our algorithms is limited to obtaining asymptotic bounds, the constant factors being ignored are quite small. 1 Introduction This paper presents novel load balancing, selection and sorting algorithms for the hypercube. The following model of computation is assumed. Each processor has an infinite local memory configured in O(log p)-bit words and can perform the usual set of ALU operations in constant time on word-sized operands. Processors communicate with one another by sending packets over the bidirectional channel...
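The paper's SmoothSort itself is not reproduced in this extract, but the classic dimension-exchange scheme (an illustrative stand-in, not the paper's algorithm) gives a feel for hypercube load balancing: in round d, each processor exchanges load with the neighbor whose address differs in bit d, and the pair averages its load.

```python
# Dimension-exchange load balancing on a simulated hypercube.
# NOT the paper's algorithm -- just the classic scheme, shown to
# illustrate the setting: p = 2^dim processors, one neighbor per
# address bit.

def dimension_exchange(load):
    """Balance integer loads across p = 2^dim processors."""
    p = len(load)
    dim = p.bit_length() - 1
    assert 1 << dim == p, "p must be a power of two"
    load = list(load)
    for d in range(dim):          # one round per hypercube dimension
        for i in range(p):
            j = i ^ (1 << d)      # neighbor across dimension d
            if i < j:             # each pair balances exactly once
                total = load[i] + load[j]
                load[i] = total // 2
                load[j] = total - total // 2
    return load

print(dimension_exchange([16, 0, 0, 0]))  # -> [4, 4, 4, 4]
```

After dim rounds every processor holds the average load, up to integer rounding.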
Lower bounds for external memory dictionaries
 In Proceedings of the 14th Annual ACM-SIAM Symposium on Discrete Algorithms (SODA)
, 2003
Abstract

Cited by 9 (0 self)
We study tradeoffs between the update time and the query time for comparison-based external memory dictionaries. The main contributions of this paper are two lower bound tradeoffs between the I/O complexity of member queries and insertions: if N > M insertions perform at most δ·N/B I/Os, then (1) there exists a query requiring N/(M·(M/B)^O(δ)) I/Os, and (2) there exists a query requiring Ω(log_{δ·log² N}(N/M)) I/Os when δ is O(B/log³ N) and N is at least M². For both lower bounds we describe data structures which give matching upper bounds for a wide range of parameters, thereby showing the lower bounds to be tight within these ranges.
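A possible restatement of the two tradeoffs in display notation (reading the extract's garbled "ffi" as δ, a reconstruction rather than a quotation):

```latex
% If N > M insertions perform at most \delta N/B I/Os, then:
\begin{align*}
\text{(1)}\quad & \exists\ \text{a query requiring } \frac{N}{M\,(M/B)^{O(\delta)}} \text{ I/Os},\\
\text{(2)}\quad & \exists\ \text{a query requiring } \Omega\!\left(\log_{\delta \log^2 N} \frac{N}{M}\right) \text{ I/Os},
  \quad \text{when } \delta = O(B/\log^3 N) \text{ and } N \ge M^2.
\end{align*}
```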
The complexity of constructing evolutionary trees using experiments
, 2001
Abstract

Cited by 8 (1 self)
We present tight upper and lower bounds for the problem of constructing evolutionary trees in the experiment model. We describe an algorithm which constructs an evolutionary tree of n species in time O(nd log_d n) using at most n⌈d/2⌉(log_{2⌈d/2⌉−1} n + O(1)) experiments for d > 2, and at most n(log n + O(1)) experiments for d = 2, where d is the degree of the tree. This improves the previous best upper bound by a factor Θ(log d). For d = 2 the previously best algorithm with running time O(n log n) had a bound of 4n log n on the number of experiments. By an explicit adversary argument, we show an Ω(nd log_d n) lower bound, matching our upper bounds and improving the previous best lower bound by a factor Θ(log_d n). Central to our algorithm is the construction and maintenance of separator trees of small height, which may be of independent interest.
The Randomized Complexity of Maintaining the Minimum
, 1996
Abstract

Cited by 6 (5 self)
The complexity of maintaining a set under the operations Insert, Delete and FindMin is considered. In the comparison model it is shown that any randomized algorithm with expected amortized cost t comparisons per Insert and Delete has expected cost at least n/(e·2^{2t}) − 1 comparisons for FindMin. If FindMin is replaced by a weaker operation, FindAny, then it is shown that a randomized algorithm with constant expected cost per operation exists; in contrast, it is shown that no deterministic algorithm can have constant cost per operation. Finally, a deterministic algorithm with constant amortized cost per operation for an offline version of the problem is given. CR Classification: F.2.2 1 Introduction We consider the complexity of maintaining a set S of elements from a totally ordered universe under the following operations: Insert(x): inserts the element x into S; Delete(x): removes from S the element x, provided it is known where x is stored; and ...
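The paper's randomized construction is not shown in this extract; the toy comparison counter below merely illustrates one classical endpoint of the update/query tradeoff it studies: an unsorted list spends no order comparisons on Insert or Delete but n − 1 comparisons per FindMin (a sorted structure sits at the opposite endpoint).

```python
# Toy illustration of the Insert/Delete vs. FindMin tradeoff
# (NOT the paper's construction): an unsorted list does 0 order
# comparisons per update but n - 1 comparisons per FindMin.

class UnsortedSet:
    def __init__(self):
        self.items = []
        self.comparisons = 0          # order comparisons performed

    def insert(self, x):
        self.items.append(x)          # no comparisons

    def delete(self, x):
        self.items.remove(x)          # position assumed known, as in the model

    def find_min(self):
        m = self.items[0]
        for y in self.items[1:]:      # n - 1 comparisons
            self.comparisons += 1
            if y < m:
                m = y
        return m

s = UnsortedSet()
for x in [5, 3, 8, 1]:
    s.insert(x)
print(s.find_min(), s.comparisons)    # -> 1 3
```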
On the Comparison Cost of Partial Orders
, 1992
Abstract

Cited by 1 (0 self)
A great deal of effort has been directed towards determining the minimum number of binary comparisons sufficient to produce various partial orders given some partial order. For example, the sorting problem considers the minimum number of comparisons sufficient to construct a total order starting from n elements. The merging problem considers the minimum number of comparisons sufficient to construct a total order from two total orders. The searching problem can be seen as a special case of the merging problem in which one of the total orders is a singleton. The selection problem considers the minimum number of comparisons sufficient to select the i-th largest of n elements. Little, however, is known about the minimum number of comparisons sufficient to produce an arbitrary partial order. In this paper we briefly survey the known results on this problem and we present some first results on partial orders which can be produced using either restricted types of comparisons or a limited n...
AN EFFICIENT SEARCH ALGORITHM FOR PARTIALLY ORDERED SETS
, 2006
Abstract

Cited by 1 (0 self)
Consider the problem of membership query for a given partially ordered set. We devise a greedy algorithm which can produce near-optimal search strategies. A rigorous analysis shows that our algorithm can use fewer comparisons than the best known solution by at least a factor of 0.27 under the random graph model. Experimental results suggest that the algorithm retains this advantage under other models.
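The abstract does not spell the strategy out; the sketch below shows one natural greedy of this flavor under an assumed probe model (not necessarily the paper's): probing an element v reveals whether the unknown target lies in v's down-set, and the greedy always probes the element whose answer splits the remaining candidates most evenly.

```python
# Hedged sketch of a greedy poset search strategy. The probe model
# (a probe at v answers "is the target <= v?") is an assumption made
# for illustration, not taken from the paper.

def greedy_probes(candidates, leq):
    """Worst-case number of greedy probes to isolate one target.

    candidates: set of elements; leq[u][v] is True iff u <= v in the poset.
    """
    if len(candidates) <= 1:
        return 0
    best = None
    for v in candidates:
        inside = sum(1 for u in candidates if leq[u][v])  # candidates in v's down-set
        split = max(inside, len(candidates) - inside)     # worst side of the split
        if best is None or split < best[0]:
            best = (split, v)
    _, v = best
    yes = {u for u in candidates if leq[u][v]}
    no = candidates - yes
    if not yes or not no:              # probe uninformative; fall back to scanning
        return len(candidates) - 1
    return 1 + max(greedy_probes(yes, leq), greedy_probes(no, leq))

# On a 3-element chain a < b < c the greedy behaves like binary search:
chain = {"a": {"a": True, "b": True, "c": True},
         "b": {"a": False, "b": True, "c": True},
         "c": {"a": False, "b": False, "c": True}}
print(greedy_probes({"a", "b", "c"}, chain))  # -> 2
```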
AN EFFICIENT SEARCH ALGORITHM FOR PARTIALLY ORDERED SETS
Abstract
Consider the problem of membership query for a given partially ordered set. We devise a greedy algorithm which can produce near-optimal search strategies. A rigorous analysis shows that our algorithm can use fewer comparisons than the best known solution by at least a factor of 0.27 under the random graph model. Experimental results suggest that the algorithm retains this advantage under other models. KEY WORDS: algorithms and computation theories, membership query, partial order, greedy algorithm
On Searching a Table Consistent with Division Poset
Abstract
Suppose Pn = {1, 2, ..., n} is a partially ordered set with the partial order defined by divisibility, that is, for any two elements i, j ∈ Pn satisfying i divides j, we have i ≤Pn j. A table An = {ai | i = 1, 2, ..., n} of real numbers is said to be consistent with Pn provided that for any two elements i, j ∈ {1, 2, ..., n} satisfying i divides j, ai ≤ aj. Given a real number x, we want to determine whether x ∈ An by comparing x with as few entries of An as possible. In this paper we investigate the complexity τ(n), measured in the number of comparisons, of the above search problem. We present a 55n/72 + O(ln² n) search algorithm for An and prove a lower bound of (…)n + O(1) on τ(n) by using an adversary argument.
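As a simple baseline (not the paper's 55n/72 + O(ln² n) algorithm), consistency alone already prunes the search: if ai > x then aj > x for every multiple j of i, and if ai < x then aj < x for every divisor j of i, so those indices need no further comparisons.

```python
# Baseline membership search in a table consistent with the division
# poset. A sketch only -- NOT the paper's algorithm -- illustrating
# the pruning that consistency (i divides j  =>  a[i] <= a[j]) allows.

def search_division_table(a, x):
    """a[i-1] is the value at index i (1..n), consistent with divisibility.
    Returns (found, number_of_three_way_comparisons)."""
    n = len(a)
    alive = set(range(1, n + 1))
    comparisons = 0
    while alive:
        i = min(alive)
        comparisons += 1                 # one three-way comparison of x with a[i]
        if a[i - 1] == x:
            return True, comparisons
        alive.discard(i)
        if a[i - 1] > x:                 # every multiple j of i also has a[j] > x
            alive -= set(range(i, n + 1, i))
        else:                            # every divisor j of i also has a[j] < x
            alive -= {j for j in alive if i % j == 0}
    return False, comparisons

a = [1, 2, 2, 4, 3, 6]                   # consistent: 1|j, 2|4, 2|6, 3|6 all respect <=
print(search_division_table(a, 6))       # -> (True, 6)
print(search_division_table(a, 0))       # -> (False, 1): a[1] > 0 kills every index
```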
The Randomized Complexity of Maintaining the Minimum
 Nordic Journal of Computing, Selected Papers of the 5th Scandinavian Workshop on Algorithm Theory (SWAT)
, 1996
Abstract
The complexity of maintaining a set under the operations Insert, Delete and FindMin is considered. In the comparison model it is shown that any randomized algorithm with expected amortized cost t comparisons per Insert and Delete has expected cost at least n/(e·2^{2t}) − 1 comparisons for FindMin. If FindMin is replaced by a weaker operation, FindAny, then it is shown that a randomized algorithm with constant expected cost per operation exists; in contrast, it is shown that no deterministic algorithm can have constant cost per operation. Finally, a deterministic algorithm with constant amortized cost per operation for an offline version of the problem is given.