Amortization, Lazy Evaluation, and Persistence: Lists with Catenation via Lazy Linking
Pages 646–654 of: IEEE Symposium on Foundations of Computer Science, 1995
Abstract

Cited by 6 (1 self)
Amortization has been underutilized in the design of persistent data structures, largely because traditional accounting schemes break down in a persistent setting. Such schemes depend on saving "credits" for future use, but a persistent data structure may have multiple "futures", each competing for the same credits. We describe how lazy evaluation can often remedy this problem, yielding persistent data structures with good amortized efficiency. In fact, such data structures can be implemented purely functionally in any functional language supporting lazy evaluation. As an example of this technique, we present a purely functional (and therefore persistent) implementation of lists that simultaneously supports catenation and all other usual list primitives in constant amortized time. This data structure is much simpler than the only existing data structure with comparable bounds, the recently discovered catenable lists of Kaplan and Tarjan, which support all operations in constant worst-case time.
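The key idea in the abstract, that memoized suspensions let multiple "futures" of a persistent structure share the cost of one deferred computation, can be illustrated with a minimal Python sketch. The class name `Susp` and its `force` method are illustrative, not the paper's implementation:

```python
class Susp:
    """Memoized suspension: the wrapped computation runs at most once,
    even if many 'futures' of a persistent structure force it."""
    def __init__(self, thunk):
        self._thunk = thunk
        self._forced = False
        self._value = None

    def force(self):
        if not self._forced:
            self._value = self._thunk()
            self._thunk = None   # drop the closure once evaluated
            self._forced = True
        return self._value

calls = 0
def expensive():
    # stands in for a deferred list-linking step
    global calls
    calls += 1
    return list(range(5))

s = Susp(expensive)
future_a = s.force()   # pays for the computation
future_b = s.force()   # a second "future" gets the memoized result
```

Because the result is memoized, the amortized cost can be charged once and shared by every version of the structure that later forces the same suspension, which is exactly what per-future credit accounting cannot do.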
Maximum Size of a Dynamic Data Structure: Hashing with Lazy Deletion Revisited
, 1992
Abstract

Cited by 4 (4 self)
We study the dynamic data structure management technique called Hashing with Lazy Deletion (HwLD). A table managed under HwLD is built via a sequence of insertions and deletions of items. When hashing with lazy deletion, one does not delete items as soon as possible, but keeps more items in the data structure than immediate-deletion strategies would. This deferral allows the use of a simpler deletion algorithm, leading to a lower overhead in space and time for the HwLD implementation. It is of interest to know how much extra space is used by HwLD. We investigate the maximum size and the excess space used by HwLD, under general probabilistic assumptions, using the methodology of queueing theory. In particular, we find that for Poisson arrivals and a general lifetime distribution of items, the excess space does not exceed the number of buckets in HwLD. As a byproduct of our analysis, we also derive the limiting distribution of the maximum queue length in an M/G/1 queueing system.
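The deletion policy described here can be sketched in a few lines of Python: an expired item is purged only when an insertion next touches its bucket, so the table transiently holds more items than an immediate-deletion table would. The function name `hwld_sizes` and the event format are assumptions for illustration, not the paper's code:

```python
def hwld_sizes(events, num_buckets):
    """Simulate Hashing with Lazy Deletion.

    events: time-ordered (arrival_time, key, lifetime) triples.
    Returns the maximum table size observed, including items that
    have expired but have not yet been lazily removed.
    """
    buckets = [[] for _ in range(num_buckets)]
    max_size = 0
    for t, key, lifetime in events:
        b = hash(key) % num_buckets
        # lazy deletion: purge expired items from this bucket only
        buckets[b] = [(exp, k) for exp, k in buckets[b] if exp > t]
        buckets[b].append((t + lifetime, key))
        max_size = max(max_size, sum(len(bk) for bk in buckets))
    return max_size

# three items, one bucket: the table peaks at 2 live items
print(hwld_sizes([(0, 1, 5), (1, 2, 5), (10, 3, 1)], num_buckets=1))  # → 2
```

With more buckets, expired items can linger longer in buckets that see no insertions, which is the "excess space" the paper bounds by the number of buckets under Poisson arrivals.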
Sorting using complete subintervals and the maximum number of runs in a randomly evolving sequence
Ann. Comb.
Abstract

Cited by 3 (3 self)
Abstract. We study the space requirements of a sorting algorithm where only items that at the end will be adjacent are kept together. This is equivalent to the following combinatorial problem: Consider a string of fixed length n that starts as a string of 0’s, and then evolves by changing each 0 to 1, with the n changes done in random order. What is the maximal number of runs of 1’s? We give asymptotic results for the distribution and mean. It turns out that, as in many problems involving a maximum, the maximum is asymptotically normal, with fluctuations of order n^{1/2}, and to the first order it is well approximated by the number of runs at the instant when the expectation is maximized, in this case when half the elements have changed to 1; there is also a second-order term of order n^{1/3}. We also treat some variations, including priority queues. The proofs use methods originally developed for random graphs.
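The combinatorial process in the abstract is easy to simulate: flipping a 0 either starts a new run of 1's (no 1-neighbors), extends one (one 1-neighbor), or merges two runs into one (both neighbors are 1). A short Python sketch, with the function name `max_runs_of_ones` chosen here for illustration:

```python
import random

def max_runs_of_ones(n, order=None):
    """Flip each of n zeros to 1 in the given (or a random) order;
    return the maximum number of maximal runs of 1's seen at any time."""
    if order is None:
        order = random.sample(range(n), n)
    s = [0] * n
    runs = 0
    best = 0
    for i in order:
        left = i > 0 and s[i - 1] == 1
        right = i < n - 1 and s[i + 1] == 1
        s[i] = 1
        if left and right:
            runs -= 1    # two runs merge into one
        elif not left and not right:
            runs += 1    # a new isolated run appears
        # exactly one 1-neighbor: an existing run extends, count unchanged
        best = max(best, runs)
    return best

# order [0, 2, 1, 3]: runs evolve 1, 2, 1, 1 → maximum 2
print(max_runs_of_ones(4, order=[0, 2, 1, 3]))  # → 2
```

Averaging this over many random orders gives an empirical check of the asymptotics claimed in the abstract, with the maximum typically attained near the point where half the positions have flipped.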