Synchronization and linearity: an algebra for discrete event systems
, 2001
Abstract

Cited by 372 (11 self)
The first edition of this book was published in 1992 by Wiley (ISBN 0 471 93609 X). Since this book is now out of print, and to answer the request of several colleagues, the authors have decided to make it available freely on the Web, while retaining the copyright, for the benefit of the scientific community. Copyright Statement: This electronic document is in PDF format. One needs Acrobat Reader (available freely for most platforms from the Adobe web site) to benefit from the full interactive machinery: using the package hyperref by Sebastian Rahtz, the table of contents and all LaTeX cross-references are automatically converted into clickable hyperlinks, bookmarks are generated automatically, etc. So, do not hesitate to click on references to equation or section numbers, on items of the table of contents and of the index, etc. One may freely use and print this document for one's own purposes, or even distribute it freely, but not commercially, provided it is distributed in its entirety and without modifications, including this preface and copyright statement. Any use of the contents should be acknowledged according to standard scientific practice.
The complexity of finite objects and the development of the concepts of information and randomness by means of the theory of algorithms
 Russian Math. Surveys
, 1970
Abstract

Cited by 240 (1 self)
In 1964 Kolmogorov introduced the concept of the complexity of a finite object (for instance, the words in a certain alphabet). He defined complexity as the minimum number of binary signs containing all the information about a given object that are sufficient for its recovery (decoding). This definition depends essentially on the method of decoding. However, by means of the general theory of algorithms, Kolmogorov was able to give an invariant (universal) definition of complexity. Related concepts were investigated by Solomonoff (U.S.A.) and Markov. Using the concept of complexity, Kolmogorov gave definitions of the quantity of information in finite objects and of the concept of a random sequence (which was then defined more precisely by Martin-Löf). Afterwards, this circle of questions developed rapidly. In particular, an interesting development took place of the ideas of Markov on the application of the concept of complexity to the study of quantitative questions in the theory of algorithms. The present article is a survey of the fundamental results connected with the brief remarks above.
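Kolmogorov complexity as defined above is uncomputable, but the idea of measuring an object by the length of a short description can be illustrated with a general-purpose compressor, which gives a crude upper bound on the true complexity. The following sketch (the function name `approx_complexity` is ours, not from the survey) shows how a regular string admits a much shorter description than random bytes:

```python
import os
import zlib

def approx_complexity(s: bytes) -> int:
    """Upper-bound the Kolmogorov complexity of s by its zlib-compressed
    length. True complexity is uncomputable; a general-purpose compressor
    merely stands in for the minimal decoding program described above."""
    return len(zlib.compress(s, 9))

# A highly regular string has a very short description...
print(approx_complexity(b"ab" * 500))
# ...while random bytes are essentially incompressible.
print(approx_complexity(os.urandom(1000)))
```

The gap between the two printed values is the compression-based analogue of the gap between "simple" and "random" objects in Kolmogorov's sense.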
Dimension in Complexity Classes
 SIAM Journal on Computing
, 2000
Abstract

Cited by 108 (16 self)
A theory of resource-bounded dimension is developed using gales, which are natural generalizations of martingales. When the resource bound (a parameter of the theory) is unrestricted, the resulting dimension is precisely the classical Hausdorff dimension (sometimes called "fractal dimension"). Other choices of the parameter yield internal dimension theories in E, E₂, ESPACE, and other complexity classes, and in the class of all decidable problems. In general, if C is such a class, then every set X of languages has a dimension in C, which is a real number dim(X | C) ∈ [0, 1]. Along with the elements of this theory, two preliminary applications are presented: 1. For every real number 0 ≤ α ≤ 1/2, the set FREQ(≤ α), consisting of all languages that asymptotically contain at most α of all strings, has dimension H(α), the binary entropy of α, in E and in E₂. 2. For every real number 0 ≤ α ≤ 1, the set SIZE(α·2ⁿ/n), consisting of all languages decidable by Boolean circuits of at most α·2ⁿ/n gates, has dimension α in ESPACE.
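The gales mentioned above satisfy a simple averaging law: an s-gale is a function d on binary strings with d(w) = 2⁻ˢ·(d(w0) + d(w1)), the s = 1 case being an ordinary martingale. A minimal sketch (our own helper names, assuming the standard definition) checks this condition on a few strings:

```python
def is_s_gale(d, s, strings, tol=1e-9):
    """Check the s-gale averaging condition
    d(w) = 2**(-s) * (d(w0) + d(w1)) on the given strings."""
    return all(
        abs(d(w) - 2 ** (-s) * (d(w + "0") + d(w + "1"))) <= tol
        for w in strings
    )

def make_gale(s):
    """d(w) = 2**((s-1)*|w|) is an s-gale for every s >= 0."""
    return lambda w: 2.0 ** ((s - 1) * len(w))

words = ["", "0", "1", "01", "110"]
print(is_s_gale(make_gale(1.0), 1.0, words))  # martingale case (s = 1)
print(is_s_gale(make_gale(0.5), 0.5, words))
```

Smaller values of s make the condition harder to satisfy while betting successfully, which is what lets the infimum of successful s values serve as a dimension.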
The Dimensions of Individual Strings and Sequences
 INFORMATION AND COMPUTATION
, 2003
Abstract

Cited by 101 (11 self)
A constructive version of Hausdorff dimension is developed using constructive supergales, which are betting strategies that generalize the constructive supermartingales used in the theory of individual random sequences. This constructive dimension is used to assign every individual (infinite, binary) sequence S a dimension, which is a real number dim(S) in the interval [0, 1]. Sequences that
Ergodic theory on Galton–Watson trees: speed of random walk and dimension of harmonic measure
 Ergodic Theory Dynam. Systems
, 1995
Kolmogorov Complexity and Hausdorff Dimension
 Inform. and Comput
, 1989
Abstract

Cited by 67 (21 self)
In this paper we are mainly interested in the first-order approximation (i.e., the linear growth) of K(β/Δ). We consider the functions
Positivity of entropy production in nonequilibrium statistical mechanics
, 1996
A Generalized Suffix Tree and Its (Un)Expected Asymptotic Behaviors
 SIAM J. Computing
, 1996
Abstract

Cited by 56 (30 self)
Suffix trees find several applications in computer science and telecommunications, most notably in algorithms on strings, data compression, and codes. Despite this, very little is known about their typical behavior. In a probabilistic framework, we consider a family of suffix trees, further called b-suffix trees, built from the first n suffixes of a random word. In this family a non-compact suffix tree (i.e., such that every edge is labeled by a single symbol) is represented by b = 1, and a compact suffix tree (i.e., without unary nodes) is asymptotically equivalent to b → ∞ as n → ∞. We study several parameters of b-suffix trees, namely: the depth of a given suffix, the depth of insertion, the height, and the shortest feasible path. Some new results concerning typical (i.e., almost sure) behaviors of these parameters are established. These findings are used to obtain several insights into certain algorithms on words, molecular biology, and universal data compression schemes.
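The b = 1 (non-compact) case described above is simply a suffix trie: every suffix is inserted symbol by symbol, and its height is the parameter the paper analyzes probabilistically. A minimal sketch (our own nested-dict representation, not the paper's data structure):

```python
def suffix_trie(word: str) -> dict:
    """Build a non-compact suffix trie (every edge labeled by a single
    symbol), i.e. the b = 1 member of the b-suffix tree family."""
    root: dict = {}
    for i in range(len(word)):
        node = root
        for ch in word[i:]:
            node = node.setdefault(ch, {})
    return node if False else root

def height(node: dict) -> int:
    """Height of the trie: length of the longest root-to-leaf path,
    i.e. the length of the longest inserted suffix."""
    return 1 + max(map(height, node.values())) if node else 0

t = suffix_trie("abab")
print(height(t))  # the longest suffix "abab" gives height 4
```

For a random word, the paper's results describe how quantities such as this height grow almost surely with n; the trie above is only the deterministic building block.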
Local stability of ergodic averages
 Transactions of the American Mathematical Society
Abstract

Cited by 48 (10 self)
We consider the extent to which one can compute bounds on the rate of convergence of a sequence of ergodic averages. It is not difficult to construct an example of a computable Lebesgue-measure-preserving transformation of [0, 1] and a characteristic function f = χA such that the ergodic averages Anf do not converge to a computable element of L2([0, 1]). In particular, there is no computable bound on the rate of convergence for that sequence. On the other hand, we show that, for any nonexpansive linear operator T on a separable Hilbert space, and any element f, it is possible to compute a bound on the rate of convergence of (Anf) from T, f, and the norm ‖f*‖ of the limit. In particular, if T is the Koopman operator arising from a computable ergodic measure-preserving transformation of a probability space X and f is any computable element of L2(X), then there is a computable bound on the rate of convergence of the sequence (Anf). The mean ergodic theorem is equivalent to the assertion that for every function K(n) and every ε > 0, there is an n with the property that the ergodic averages Amf are stable to within ε on the interval [n, K(n)]. Even in situations where the sequence (Anf) does not have a computable limit, one can give explicit bounds on such n in terms of K and ‖f‖/ε. This tells us how far one has to search to find an n so that the ergodic averages are "locally stable" on a large interval. We use these bounds to obtain a similarly explicit version of the pointwise ergodic theorem, and show that our bounds are qualitatively different from ones that can be obtained using upcrossing inequalities due to Bishop and Ivanov. Finally, we explain how our positive results can be viewed as an application of a body of general proof-theoretic methods falling under the heading of "proof mining."
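The ergodic averages Anf discussed above can be observed numerically. A minimal sketch (our own illustration, not the paper's construction) computes Birkhoff averages for an irrational rotation of [0, 1), where the averages of an indicator function converge to the measure of the set:

```python
import math

def ergodic_average(f, x, alpha, n):
    """Birkhoff average A_n f(x) = (1/n) * sum_{k<n} f(T^k x)
    for the circle rotation T(x) = (x + alpha) mod 1."""
    total = 0.0
    for _ in range(n):
        total += f(x)
        x = (x + alpha) % 1.0
    return total / n

# f is the indicator of [0, 1/2); for irrational alpha the averages
# converge to the Lebesgue measure of the set, 1/2, at every point.
f = lambda x: 1.0 if x < 0.5 else 0.0
alpha = math.sqrt(2) - 1  # an irrational rotation number
print(ergodic_average(f, 0.1, alpha, 100000))
```

The paper's point is about what can be *proved* computably regarding the rate at which such averages settle down, not merely observed; this rotation is a particularly well-behaved (uniquely ergodic) case.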