Results 1–10 of 23
Dynamical Sources in Information Theory: A General Analysis of Trie Structures
Algorithmica, 1999
Abstract

Cited by 50 (7 self)
Digital trees, also known as tries, are a general-purpose flexible data structure that implements dictionaries built on sets of words. An analysis is given of three major representations of tries in the form of array tries, list tries, and bst tries ("ternary search tries"). The size and the search costs of the corresponding representations are analysed precisely in the average case, while a complete distributional analysis of the height of tries is given. The unifying data model used is that of dynamical sources, and it encompasses classical models like those of memoryless sources with independent symbols, of finite Markov chains, and of nonuniform densities. The probabilistic behaviour of the main parameters, namely size, path length, or height, appears to be determined by two intrinsic characteristics of the source: the entropy and the probability of letter coincidence. These characteristics are themselves related in a natural way to spectral properties of specific transfer operators of the Ruelle type.
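To make the structure concrete, here is a minimal sketch of a trie dictionary (illustrative only, not taken from the paper; a Python dict of children stands in for the array-trie representation the abstract mentions):

```python
class Trie:
    """Minimal digital tree (trie) over strings: each node maps one
    character to a child node, and a flag marks complete words."""

    def __init__(self):
        self.children = {}   # character -> child Trie node
        self.is_word = False

    def insert(self, word):
        node = self
        for ch in word:
            # Create the child on demand, then descend one level.
            node = node.children.setdefault(ch, Trie())
        node.is_word = True

    def contains(self, word):
        node = self
        for ch in word:
            node = node.children.get(ch)
            if node is None:
                return False
        return node.is_word
```

Search cost is proportional to the length of the distinguishing prefix, which is exactly the path-length parameter analysed in the abstract.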
Analysis of the binary Euclidean algorithm
Directions and Recent Results in Algorithms and Complexity, 1976
Abstract

Cited by 28 (2 self)
The binary Euclidean algorithm is a variant of the classical Euclidean algorithm. It avoids multiplications and divisions, except by powers of two, so it is potentially faster than the classical algorithm on a binary machine. We describe the binary algorithm and consider its average-case behaviour. In particular, we correct some errors in the literature, discuss some recent results of Vallée, and describe a numerical computation which supports a conjecture of Vallée.
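The algorithm in question admits a short sketch. The version below is the standard binary gcd using only right shifts and subtraction, not necessarily the exact variant analysed in the paper:

```python
def binary_gcd(u, v):
    """Binary Euclidean algorithm: gcd(u, v) for non-negative ints,
    using only halving (shifts) and subtraction, no general division."""
    if u == 0:
        return v
    if v == 0:
        return u
    # Factor out the common power of two shared by u and v.
    shift = 0
    while (u | v) & 1 == 0:
        u >>= 1
        v >>= 1
        shift += 1
    # Make u odd; gcd is unchanged by removing factors of 2 from one side.
    while u & 1 == 0:
        u >>= 1
    while v != 0:
        while v & 1 == 0:
            v >>= 1
        # Both odd now: subtracting gives an even number, so progress resumes.
        if u > v:
            u, v = v, u
        v -= u
    return u << shift
```

Each iteration costs only shifts and a subtraction, which is the source of its advantage on binary machines.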
Dynamical Sources in Information Theory: Fundamental Intervals and Word Prefixes, 1998
Abstract

Cited by 28 (7 self)
A quite general model of source that comes from dynamical systems theory is introduced. Within this model, some important problems about prefixes that intervene in algorithmic information theory contexts are analysed. The main tool is a new object, the generalized Ruelle operator, which can be viewed as a "generating" operator. Its dominant spectral objects are linked with important parameters of the source, such as the entropy, and play a central role in all the results.

1 Introduction. In information theory contexts, data items are (infinite) words that are produced by a common mechanism, called a source. Realistic sources are often complex objects. We work here inside a quite general framework of sources related to dynamical systems theory which goes beyond the cases of memoryless and Markov sources. This model can describe non-Markovian processes, where the dependency on past history is unbounded, and as such it attains a high level of generality. A probabilistic dynamical source ...
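For orientation (this detail is added here and is not part of the abstract), the classical concrete instance of such an operator, for the continued-fraction source, is the Gauss-Kuzmin-Wirsing operator in its parameterised form:

```latex
(\mathbf{H}_s f)(x) \;=\; \sum_{m \ge 1} \frac{1}{(m+x)^{2s}} \, f\!\left(\frac{1}{m+x}\right)
```

At $s = 1$ this operator has dominant eigenvalue $1$, with the Gauss density $1/((1+x)\log 2)$ as its invariant function; spectral data of $\mathbf{H}_s$ at and near $s = 1$ encode source parameters such as the entropy, which is the role the generalized Ruelle operator plays in this framework.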
The Analysis of Hybrid Trie Structures, 1998
Abstract

Cited by 24 (2 self)
This paper provides a detailed analysis of various implementations of digital tries, including the “ternary search tries” of Bentley and Sedgewick. The methods employed combine symbolic uses of generating functions, Poisson models, and Mellin transforms. Theoretical results are matched against real-life data and justify the claim that ternary search tries are a highly efficient dynamic dictionary structure for strings and textual data.
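A minimal sketch of the ternary search trie itself may help (assumed standard Bentley-Sedgewick form, nonempty string keys; not code from the paper). Each node branches three ways, so scanning a list of children is replaced by BST-style comparisons:

```python
class TSTNode:
    """One node of a ternary search trie: a character plus three links."""
    __slots__ = ("ch", "lo", "eq", "hi", "is_word")

    def __init__(self, ch):
        self.ch = ch
        self.lo = self.eq = self.hi = None  # smaller / equal / larger
        self.is_word = False


class TernarySearchTrie:
    def __init__(self):
        self.root = None

    def insert(self, word):
        self.root = self._insert(self.root, word, 0)

    def _insert(self, node, word, i):
        ch = word[i]
        if node is None:
            node = TSTNode(ch)
        if ch < node.ch:
            node.lo = self._insert(node.lo, word, i)
        elif ch > node.ch:
            node.hi = self._insert(node.hi, word, i)
        elif i + 1 < len(word):
            node.eq = self._insert(node.eq, word, i + 1)  # consume one char
        else:
            node.is_word = True
        return node

    def contains(self, word):
        node, i = self.root, 0
        while node is not None:
            ch = word[i]
            if ch < node.ch:
                node = node.lo
            elif ch > node.ch:
                node = node.hi
            elif i + 1 < len(word):
                node, i = node.eq, i + 1
            else:
                return node.is_word
        return False
```

Only the `eq` links consume input characters, which is why the structure behaves like a trie along the word and like a binary search tree across the alphabet.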
Dynamical Analysis of a Class of Euclidean Algorithms
Abstract

Cited by 17 (4 self)
We develop a general framework for the analysis of algorithms of a broad Euclidean type. The average-case complexity of an algorithm is seen to be related to the analytic behaviour in the complex plane of the set of elementary transformations determined by the algorithm. The methods rely on properties of transfer operators suitably adapted from dynamical systems theory. As a consequence, we obtain precise average-case analyses of algorithms for evaluating the Jacobi symbol of computational number theory fame, thereby solving conjectures of Bach and Shallit. These methods also provide a unifying framework for the analysis of an entire class of gcd-like algorithms, together with new results regarding the probable behaviour of their cost functions.
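For concreteness, here is a standard gcd-like evaluation of the Jacobi symbol (a textbook version; the abstract does not spell out the exact algorithms analysed, so this is only a representative of the class):

```python
def jacobi(a, n):
    """Jacobi symbol (a/n) for odd n > 0, by a Euclidean-type loop:
    strip factors of two, swap via quadratic reciprocity, reduce mod n."""
    if n <= 0 or n % 2 == 0:
        raise ValueError("n must be a positive odd integer")
    a %= n
    result = 1
    while a != 0:
        while a % 2 == 0:          # pull out factors of two
            a //= 2
            if n % 8 in (3, 5):    # (2/n) = -1 exactly when n = 3, 5 mod 8
                result = -result
        a, n = n, a                # reciprocity swap (both odd here)
        if a % 4 == 3 and n % 4 == 3:
            result = -result       # flip when both are 3 mod 4
        a %= n
    return result if n == 1 else 0  # gcd > 1 means symbol is 0
```

The control flow is exactly Euclidean (reduce, swap, repeat), which is what places it inside the class of gcd-like algorithms the framework covers.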
Average Bit-Complexity of Euclidean Algorithms
Proceedings ICALP’00, Lecture Notes Comp. Science 1853, 373–387, 2000
Abstract

Cited by 15 (6 self)
We obtain new results regarding the precise average bit-complexity of five algorithms of a broad Euclidean type. We develop a general framework for the analysis of algorithms, where the average-case complexity of an algorithm is seen to be related to the analytic behaviour in the complex plane of the set of elementary transformations determined by the algorithms. The methods rely on properties of transfer operators suitably adapted from dynamical systems theory and provide a unifying framework for the analysis of an entire class of gcd-like algorithms. Keywords: Average-case Analysis of Algorithms, Bit-Complexity, Euclidean Algorithms, Dynamical Systems, Ruelle Operators, Generating Functions, Dirichlet Series, Tauberian Theorems.

1 Introduction. Motivations. Euclid's algorithm was analysed first in the worst case in 1733 by de Lagny, then in the average case around 1969 independently by Heilbronn [12] and Dixon [6], and finally in distribution by Hensley [13], who proved in 1994 that the Eu...
Digits and Continuants in Euclidean Algorithms. Ergodic versus Tauberian Theorems, 2000
Abstract

Cited by 14 (5 self)
We obtain new results regarding the precise average-case analysis of the main quantities that intervene in algorithms of a broad Euclidean type. We develop a general framework for the analysis of such algorithms, where the average-case complexity of an algorithm is related to the analytic behaviour in the complex plane of the set of elementary transformations determined by the algorithms. The methods rely on properties of transfer operators suitably adapted from dynamical systems theory and provide a unifying framework for the analysis of the main parameters (digits and continuants) that intervene in an entire class of gcd-like algorithms. We carry out a general transfer from the continuous case (Continued Fraction Algorithms) to the discrete case (Euclidean Algorithms), where Ergodic Theorems are replaced by Tauberian Theorems.
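The digits and continuants in question can be sketched directly from Euclid's algorithm: the digits are the quotient sequence of p/q, and the continuants are the denominators of the successive convergents (illustrative helper names, not the paper's notation):

```python
def cf_digits(p, q):
    """Continued-fraction digits of p/q (q > 0): the quotient sequence
    produced by Euclid's algorithm on (p, q)."""
    digits = []
    while q:
        d, r = divmod(p, q)   # one Euclidean division step
        digits.append(d)
        p, q = q, r
    return digits


def continuants(digits):
    """Denominators q_k of the successive convergents (the continuants),
    via the recurrence q_k = d_k * q_{k-1} + q_{k-2}."""
    prev, cur = 0, 1
    result = []
    for d in digits[1:]:      # digits[0] is the integer part
        prev, cur = cur, d * cur + prev
        result.append(cur)
    return result
```

For a reduced fraction the last continuant recovers the denominator, e.g. the digits of 103/24 are [4, 3, 2, 3] and the continuants are [3, 7, 24].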
Continued Fractions, Comparison Algorithms, and Fine Structure Constants, 2000
Abstract

Cited by 10 (2 self)
There are known algorithms based on continued fractions for comparing fractions and for determining the sign of 2×2 determinants. The analysis of such extremely simple algorithms leads to an incursion into a surprising variety of domains. We take the reader through a light tour of dynamical systems (symbolic dynamics), number theory (continued fractions), special functions (multiple zeta values), functional analysis (transfer operators), numerical analysis (series acceleration), and complex analysis (the Riemann hypothesis). These domains all eventually contribute to a detailed characterization of the complexity of comparison and sorting algorithms, either on average or in probability.
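A comparison algorithm of this kind can be sketched as an interleaved continued-fraction expansion: compare integer parts, then recurse on the reciprocals of the fractional parts, flipping the order at each level (an illustrative sketch assuming positive integer inputs, not the exact algorithm analysed):

```python
def compare_fractions(a, b, c, d):
    """Compare a/b and c/d (a, b, c, d positive integers) by expanding
    both continued fractions in lockstep. Returns -1, 0, or 1."""
    sign = 1
    while True:
        qa, ra = divmod(a, b)
        qc, rc = divmod(c, d)
        if qa != qc:                      # integer parts decide it
            return sign if qa > qc else -sign
        if ra == 0 and rc == 0:
            return 0                      # both expansions end: equal
        if ra == 0:
            return -sign                  # a/b ends first: smaller here
        if rc == 0:
            return sign                   # c/d ends first: a/b larger here
        # Recurse on reciprocals of the fractional parts:
        # ra/b vs rc/d compares as b/ra vs d/rc with the order reversed.
        a, b = b, ra
        c, d = d, rc
        sign = -sign
```

The cost is governed by how many continued-fraction digits the two inputs share, which is exactly the quantity the paper's analysis characterises.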
Size and Path Length of Patricia Tries: Dynamical Sources Context, 2001
Abstract

Cited by 8 (1 self)
Digital trees, also known as tries, and Patricia tries are flexible data structures that occur in a variety of computer and communication algorithms, including dynamic hashing, partial match retrieval, searching and sorting, conflict resolution algorithms for broadcast communication, data compression, and so forth. We consider here tries and Patricia tries built from n words emitted by a probabilistic dynamical source. Such sources encompass classical models, such as memoryless sources and finite Markov chains, and many more. The probabilistic behaviour of the main parameters, namely the size and path length, appears to be determined by some intrinsic characteristics of the source, namely the entropy and two other constants, themselves related in a natural way to spectral properties of specific transfer operators of Ruelle type. Keywords: Average-case Analysis of Data Structures, Information Theory, Trie, Mellin Analysis, Dynamical Systems, Ruelle Operator, Functional Analysis.
Universal Asymptotics for Random Tries and Patricia Trees
Algorithmica, 2004
Abstract

Cited by 7 (2 self)
We consider random tries and random Patricia trees constructed from n independent strings of symbols drawn from any distribution on any discrete space. We show that many parameters Z_n of these random structures are universally stable in the sense that Z_n/E{Z_n} tends to one in probability. This occurs, for example, when Z_n is the height, the size, the depth of the last node added, the number of nodes at a given depth (also called the profile), the search time for a partial match, the stack size, or the number of nodes with k children. These properties are valid without any conditions on the string distributions.