Results 1 – 6 of 6
Improving Partial Rebuilding by Using Simple Balance Criteria
Cited by 21 (4 self)
Some new classes of balanced trees, defined by very simple balance criteria, are introduced. These trees can be maintained by partial rebuilding at lower update cost than previously used weight-balanced trees. The balance criteria also allow us to maintain a balanced tree without any balance information stored in the nodes.
Balanced search trees made simple
 In Proc. 3rd Workshop on Algorithms and Data Structures, 1993
Cited by 21 (0 self)
As a contribution to the recent debate on simple implementations of dictionaries, we present new maintenance algorithms for balanced trees. In terms of code simplicity, our algorithms compare favourably with those for deterministic and probabilistic skip lists.
General balanced trees
 Journal of Algorithms, 1999
Cited by 19 (0 self)
We show that, in order to achieve efficient maintenance of a balanced binary search tree, no shape restriction other than a logarithmic height is required. The obtained class of trees, general balanced trees, may be maintained at a logarithmic amortized cost with no balance information stored in the nodes. Thus, in the case when amortized bounds are sufficient, there is no need for sophisticated balance criteria. The maintenance algorithms use partial rebuilding. This is important for certain applications and has previously been used with weight-balanced trees. We show that the amortized cost incurred by general balanced trees is lower than what has been shown for weight-balanced trees.
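Maintenance by partial rebuilding, as these abstracts describe it, can be sketched with a scapegoat-style tree: no balance fields are stored in the nodes, and when an insertion lands too deep, the highest sufficiently unbalanced subtree on the search path is rebuilt to perfect balance. The class names, the constant `alpha`, and the exact rebuild trigger below are illustrative assumptions, not the papers' precise algorithms.

```python
import math

class Node:
    __slots__ = ("key", "left", "right")
    def __init__(self, key):
        self.key = key
        self.left = self.right = None

class ScapegoatTree:
    """Sketch of balancing purely by partial rebuilding (no rotations)."""
    def __init__(self, alpha=0.7):
        assert 0.5 < alpha < 1.0
        self.alpha = alpha   # weight-balance tolerance for the rebuild trigger
        self.root = None
        self.size = 0

    def _flatten(self, node, out):
        # In-order traversal collects the subtree's nodes in sorted order.
        if node is not None:
            self._flatten(node.left, out)
            out.append(node)
            self._flatten(node.right, out)

    def _build_balanced(self, nodes, lo, hi):
        # Rebuild a perfectly balanced subtree from sorted nodes[lo:hi].
        if lo >= hi:
            return None
        mid = (lo + hi) // 2
        root = nodes[mid]
        root.left = self._build_balanced(nodes, lo, mid)
        root.right = self._build_balanced(nodes, mid + 1, hi)
        return root

    def _rebuild(self, node):
        nodes = []
        self._flatten(node, nodes)
        return self._build_balanced(nodes, 0, len(nodes))

    def _node_size(self, node):
        # Subtree sizes are recomputed on demand: this is what lets the
        # nodes carry no balance information, at amortized (not worst-case) cost.
        if node is None:
            return 0
        return 1 + self._node_size(node.left) + self._node_size(node.right)

    def insert(self, key):
        path = []                       # ancestors of the new node, root first
        node = self.root
        while node is not None:
            path.append(node)
            node = node.left if key < node.key else node.right
        new = Node(key)
        if not path:
            self.root = new
        elif key < path[-1].key:
            path[-1].left = new
        else:
            path[-1].right = new
        self.size += 1
        # Height criterion: a node deeper than log_{1/alpha}(size) signals imbalance.
        if len(path) > math.log(self.size, 1.0 / self.alpha):
            # Walk up to find the scapegoat: the first ancestor whose child
            # subtree holds more than an alpha fraction of its nodes.
            child, child_size = new, 1
            for anc in reversed(path):
                anc_size = self._node_size(anc)
                if child_size > self.alpha * anc_size:
                    rebuilt = self._rebuild(anc)
                    i = path.index(anc)   # reattach rebuilt subtree to parent
                    if i == 0:
                        self.root = rebuilt
                    elif path[i - 1].left is anc:
                        path[i - 1].left = rebuilt
                    else:
                        path[i - 1].right = rebuilt
                    break
                child, child_size = anc, anc_size

    def search(self, key):
        node = self.root
        while node is not None and node.key != key:
            node = node.left if key < node.key else node.right
        return node is not None
```

Inserting keys in sorted order, which would degenerate a plain binary search tree into a list, keeps this tree at logarithmic height because each rebuild restores perfect balance in the offending subtree.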
Binary Search Trees of Almost Optimal Height
 Acta Informatica, 1990
Cited by 11 (1 self)
First we present a generalization of symmetric binary B-trees, SBB(k) trees. The obtained structure has a height of only ⌈(1 + 1/k) log(n + 1)⌉, where k may be chosen to be any positive integer. The maintenance algorithms require only a constant number of rotations per update operation in the worst case. These properties, together with the fact that the structure is relatively simple to implement, make it a useful alternative to other search trees in practical applications. Then, by using an SBB(k)-tree with a varying k, we achieve a structure with a logarithmic amortized cost per update and a height of log n + o(log n). This result is an improvement of the upper bound on the height of a dynamic binary search tree. By maintaining two trees simultaneously, the amortized cost is transformed into a worst-case cost. Thus, we have improved the worst-case complexity of the dictionary problem.
Optimal Bounds on the Dictionary Problem
 In Proc. Symp. on Optimal Algorithms, Varna, volume 401 of LNCS, 1989
Cited by 4 (1 self)
A new data structure for the dictionary problem is presented. Updates are performed in Θ(log n) time in the worst case and the number of comparisons per operation is ⌈log n + 1 + ε⌉, where ε is an arbitrary positive constant.

1 Introduction

One of the fundamental and most studied problems in computer science is the dictionary problem, that is, the problem of how to maintain a set of data under the operations search, insert, and delete. It is well known that in a comparison-based model the lower bound on these operations is ⌈log(n + 1)⌉ comparisons, both in the average and in the worst case. This bound can be achieved by storing the set in an array or in a perfectly balanced binary search tree. However, for both these data structures the overhead cost per update is high, Θ(n) in the worst case. An efficient dynamic data structure for the dictionary problem should have a worst-case cost of Θ(log n) per operation. The first efficient solution was presented by AdelsonVel...
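The baseline the abstract alludes to, a sorted array, illustrates the trade-off: search meets the ⌈log(n + 1)⌉-comparison lower bound, but an insertion must shift later elements, costing Θ(n) in the worst case. A minimal sketch, with an illustrative class name not taken from the paper:

```python
import bisect

class SortedArrayDict:
    """Dictionary as a sorted array: optimal search, linear-time updates."""
    def __init__(self):
        self.keys = []

    def search(self, key):
        # Binary search: at most about ceil(log2(n + 1)) comparisons.
        i = bisect.bisect_left(self.keys, key)
        return i < len(self.keys) and self.keys[i] == key

    def insert(self, key):
        # Theta(n) worst case: every element after position i shifts one slot.
        i = bisect.bisect_left(self.keys, key)
        if i == len(self.keys) or self.keys[i] != key:
            self.keys.insert(i, key)
```

Balanced search trees exist precisely to bring that Θ(n) update cost down to Θ(log n) while keeping search logarithmic.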
Binary search trees: How low can you go?
 SWAT'96, LNCS, 1996
Cited by 1 (1 self)
We prove that no algorithm for balanced binary search trees performing insertions and deletions in amortized time O(f(n)) can guarantee a height smaller than ⌈log(n + 1) + 1/f(n)⌉ for all n. We improve the existing upper bound to ⌈log(n + 1) + log²(f(n))/f(n)⌉, thus almost matching our lower bound. We also improve the existing upper bound for worst-case algorithms, and give a lower bound for the semi-dynamic case.