Results 1-10 of 13
Recurrence, Dimensions And Lyapunov Exponents
, 2001
Abstract

Cited by 9 (2 self)
We show that the Poincaré return time of a typical cylinder is at least its length. For one dimensional maps we express the Lyapunov exponent and dimension via return times.
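The return-time statement can be illustrated numerically. The following is a hedged sketch, not the paper's construction: for the doubling map x -> 2x mod 1, typical symbolic orbits behave like fair coin flips, so by Kac's lemma the first return time of an n-cylinder is of order 2^n, and log2(return time)/n approaches the entropy log 2, which is also the Lyapunov exponent.

```python
import math
import random

random.seed(1)

def symbolic_orbit(n_bits):
    # iid fair bits model the symbolic dynamics of a typical
    # orbit of the doubling map x -> 2x mod 1
    return [random.randint(0, 1) for _ in range(n_bits)]

def first_return_time(bits, n):
    """Smallest t >= 1 such that the n-block starting at position t
    repeats the initial n-block; None if no recurrence is seen."""
    block = bits[:n]
    for t in range(1, len(bits) - n + 1):
        if bits[t:t + n] == block:
            return t
    return None

bits = symbolic_orbit(200_000)
n = 10
tau = first_return_time(bits, n)
# Kac's lemma suggests tau is of order 2**n, so log2(tau)/n is near 1
print(n, tau, math.log2(tau) / n)
```

The cylinder length n and orbit length here are illustrative choices; the paper's result is the lower bound (return time at least the cylinder length), which the sketch only probes empirically.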
Orbit complexity, initial data sensitivity and weakly chaotic dynamical systems
Preprint
Abstract

Cited by 6 (3 self)
We give a definition of generalized indicators of sensitivity to initial conditions and orbit complexity (a measure of the information that is necessary to describe the orbit of a given point). The well known Ruelle-Pesin and Brin-Katok theorems, combined with Brudno's theorem, give a relation between initial data sensitivity and orbit complexity that is generalized in the present work. The generalized relation implies that the set of points where the sensitivity to initial conditions is more than exponential in all directions is a 0-dimensional set. The generalized relation is then applied to the study of an important example of weakly chaotic dynamics: the Manneville map.
Effective Hausdorff dimension
 In Logic Colloquium ’01
, 2005
Abstract

Cited by 5 (2 self)
We continue the study of effective Hausdorff dimension as it was initiated by Lutz. Whereas he uses a generalization of martingales on the Cantor space to introduce this notion, we give a characterization in terms of effective s-dimensional Hausdorff measures, similar to the effectivization of Lebesgue measure by Martin-Löf. It turns out that effective Hausdorff dimension allows one to classify sequences according to their 'degree' of algorithmic randomness, i.e., their algorithmic density of information. Earlier work of Staiger and Ryabko showed a deep connection between Kolmogorov complexity and Hausdorff dimension. We further develop this relationship and use it to give effective versions of some important properties of (classical) Hausdorff dimension. Finally, we determine the effective dimension of some objects arising in the context of computability theory, such as degrees and spans.
Symbolic dynamics: entropy = dimension = complexity
Abstract

Cited by 2 (0 self)
Let d be a positive integer. Let G be the additive monoid N^d or the additive group Z^d. Let A be a finite set of symbols. The shift action of G on A^G is given by S_g(x)(h) = x(g + h) for all g, h ∈ G and all x ∈ A^G. A G-subshift is defined to be a nonempty closed set X ⊆ A^G such that S_g(x) ∈ X for all g ∈ G and all x ∈ X. Given a G-subshift X, the topological entropy ent(X) is defined as usual [31]. The standard metric on A^G is defined by ρ(x,y) = 2^{-|F_n|} where n is as large as possible such that x↾F_n = y↾F_n. Here F_n = {0,1,...,n}^d if G = N^d, and F_n = {−n,...,−1,0,1,...,n}^d if G = Z^d. For any X ⊆ A^G the Hausdorff dimension dim(X) and the effective Hausdorff dimension effdim(X) are defined as usual [14, 26, 27] with respect to the standard metric. It is well known that effdim(X) = sup_{x∈X} liminf_n K(x↾F_n)/|F_n|, where K denotes Kolmogorov complexity [9]. If X is a G-subshift, we prove that ent(X) = dim(X) = effdim(X), and ent(X) ≥ limsup_n K(x↾F_n)/|F_n| for all x ∈ X, and ent(X) = lim_n K(x↾F_n)/|F_n| for some x ∈ X.
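A concrete instance of these definitions, for G = Z and d = 1, is the golden mean shift: the Z-subshift over A = {0,1} forbidding the word 11. For a shift of finite type, the topological entropy is the log of the spectral radius of its transition matrix; the matrix and numerical check below are an illustration, not taken from the paper.

```python
import numpy as np

# Golden mean shift: allowed transitions are 0->0, 0->1, 1->0
# (the word "11" is forbidden).  Its topological entropy is the
# log of the largest eigenvalue of the transition matrix, i.e.
# log((1 + sqrt(5)) / 2), the log of the golden ratio.
T = np.array([[1, 1],
              [1, 0]])

entropy = np.log(max(abs(np.linalg.eigvals(T))))
print(entropy)   # ~0.4812, the log of the golden ratio
```

By the paper's theorem this number is simultaneously the Hausdorff dimension and the effective Hausdorff dimension of the subshift (with respect to the standard metric, after normalizing logs to base 2 where appropriate).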
All Entropies Agree for an SFT
Abstract

Cited by 1 (1 self)
In this paper I discuss a number of "entropies" whose definitions are respectively probabilistic, topological, algebraic, and algorithmic. I shall explain how these entropies are all defined in the setting of shift dynamical systems. The main result of the paper, which should be regarded as part of the folklore, is the fact that for topologically transitive shifts of finite type (the definitions of these terms will be found below) all these entropies agree numerically.
What Is Information?
, 1995
Abstract

Cited by 1 (1 self)
this paper knows that Shannon did no such thing. It must not be forgotten that Shannon called his theory "a general theory of communication", not a theory of information. The distinction is crucial. As Shannon put it in [15]: The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is, they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages. It would be impossible to overstress the fact that all aspects of "information" other than statistical phenomena are completely irrelevant to communication theory.
COMPLEXITY CHARACTERIZATION OF DYNAMICAL SYSTEMS THROUGH PREDICTABILITY
, 2003
Abstract
Some aspects of the predictability problem in dynamical systems are reviewed. The deep relation among Lyapunov exponents, Kolmogorov-Sinai entropy, Shannon entropy and algorithmic complexity is discussed. In particular, we emphasize how a characterization of the unpredictability of a system gives a measure of its complexity. Special attention is devoted to finite-resolution effects on predictability, which can be accounted for with a suitable generalization of the standard indicators. The problems involved in systems with intrinsic randomness are discussed, with emphasis on the important problems of distinguishing chaos from noise and of modeling the system. PACS numbers: 45.05.+x, 05.45.-a. All simple systems are simple in the same way; each complex system has its own complexity (freely inspired by Anna Karenina by Lev N. Tolstoy).
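The relation between Lyapunov exponents and entropy mentioned in this abstract can be checked numerically in a few lines. This is a minimal sketch, assuming the textbook fully chaotic logistic map f(x) = 4x(1-x), whose Lyapunov exponent is analytically log 2 (and equals its Kolmogorov-Sinai entropy by Pesin's formula); it is not code from the paper.

```python
import math

def lyapunov_logistic(x0=0.2, n=100_000, burn_in=1_000):
    """Estimate the Lyapunov exponent of f(x) = 4x(1-x) as the
    orbit average of log|f'(x)|, with f'(x) = 4(1 - 2x)."""
    x = x0
    for _ in range(burn_in):          # discard the transient
        x = 4.0 * x * (1.0 - x)
    total = 0.0
    for _ in range(n):
        deriv = abs(4.0 * (1.0 - 2.0 * x))
        if deriv > 0.0:               # skip a possible visit to x = 1/2
            total += math.log(deriv)
        x = 4.0 * x * (1.0 - x)
    return total / n

print(lyapunov_logistic())            # close to log 2 ~ 0.693
```

The initial condition and iteration counts are illustrative; any typical orbit gives the same limit.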
Algorithmic information for intermittent systems with an indifferent fixed point
, 2008
Abstract
Measuring the average information that is necessary to describe the behaviour of a dynamical system leads to a generalization of the Kolmogorov-Sinai entropy. This is particularly interesting when the system has null entropy and the information increases less than linearly with respect to time. We consider two classes of maps of the interval with an indifferent fixed point at the origin and an infinite natural invariant measure. We calculate that the average information that is necessary to describe the behaviour of their orbits increases with time n approximately as n^α, where α < 1 depends only on the asymptotic behaviour of the map near the origin.
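The mechanism behind this sublinear growth can be seen in a short simulation. The sketch below uses the Manneville-Pomeau form x -> x + x^z (mod 1) with illustrative parameters (not necessarily the paper's normalization): orbits linger near the indifferent fixed point at 0 for long heavy-tailed "laminar" stretches, during which they generate almost no new information.

```python
def manneville(x, z=2.0):
    """One step of the Manneville-Pomeau map x -> x + x^z (mod 1),
    which has an indifferent fixed point at the origin."""
    x = x + x ** z
    return x - 1.0 if x >= 1.0 else x

def laminar_lengths(x0=0.3, n=100_000, cutoff=0.5):
    """Lengths of maximal runs the orbit spends below `cutoff`,
    i.e. in the laminar region near the indifferent fixed point."""
    runs, current, x = [], 0, x0
    for _ in range(n):
        if x < cutoff:
            current += 1
        elif current > 0:
            runs.append(current)
            current = 0
        x = manneville(x)
    if current > 0:
        runs.append(current)
    return runs

runs = laminar_lengths()
# the longest laminar phase dwarfs the typical one: heavy tails
print(len(runs), max(runs), sum(runs) / len(runs))
```

The cutoff 0.5 separating "laminar" from "chaotic" is an arbitrary illustrative choice; the heavy-tailed run-length distribution, not the cutoff, is what drives the n^α information growth.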
COMPLEXITY FOR EXTENDED DYNAMICAL SYSTEMS
, 2006
Abstract
We consider dynamical systems for which the spatial extension plays an important role. For these systems, the notions of attractor, ε-entropy and topological entropy per unit time and volume have been introduced previously. In this paper we use the notion of Kolmogorov complexity to introduce, for extended dynamical systems, a notion of complexity per unit time and volume which plays the same role as the metric entropy for classical dynamical systems. We introduce this notion as an almost sure limit on orbits of the system. Moreover, we prove a kind of variational principle for this complexity.
unknown title
, 2008
Abstract
Information, initial condition sensitivity and dimension in weakly chaotic dynamical systems