Results 1 – 10 of 24
Recursively Enumerable Reals and Chaitin Ω Numbers
"... A real is called recursively enumerable if it is the limit of a recursive, increasing, converging sequence of rationals. Following Solovay [23] and Chaitin [10] we say that an r.e. real dominates an r.e. real if from a good approximation of from below one can compute a good approximation of from b ..."
Abstract

Cited by 34 (3 self)
 Add to MetaCart
A real is called recursively enumerable (r.e.) if it is the limit of a recursive, increasing, converging sequence of rationals. Following Solovay [23] and Chaitin [10] we say that an r.e. real α dominates an r.e. real β if from a good approximation of α from below one can compute a good approximation of β from below. We shall study this relation and characterize it in terms of relations between r.e. sets. Solovay's [23] Ω-like numbers are the maximal r.e. real numbers with respect to this order. They are random r.e. real numbers. The halting probability of a universal self-delimiting Turing machine (Chaitin's Ω number, [9]) is also a random r.e. real. Solovay showed that any Chaitin Ω number is Ω-like. In this paper we show that the converse implication is true as well: any Ω-like real in the unit interval is the halting probability of a universal self-delimiting Turing machine.
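The abstract's opening definition can be made concrete with a small sketch: a toy prefix-free machine (an illustrative assumption, not a universal machine and not the paper's construction) whose halting probability is approached from below by a recursive, increasing sequence of rationals.

```python
from fractions import Fraction

def halts(program: str) -> bool:
    # Toy prefix-free "machine": a program halts iff it has the form 0^k 1.
    # The set {0^k 1 : k >= 0} is prefix-free, so the halting probability
    # sum over halting p of 2^-|p| is well defined (here it equals 1).
    return program.endswith("1") and "1" not in program[:-1]

def omega_approx(n: int) -> Fraction:
    """Stage-n lower approximation: sum 2^-|p| over halting programs of
    length <= n.  The resulting sequence of rationals is recursive,
    increasing, and converges to the machine's halting probability."""
    total = Fraction(0)
    for length in range(1, n + 1):
        for i in range(2 ** length):
            p = format(i, "b").zfill(length)
            if halts(p):
                total += Fraction(1, 2 ** length)
    return total

# The stage-wise approximations increase toward Omega = 1 for this toy
# machine: 1/2, 3/4, 7/8, ...
```

For a universal machine the same enumeration scheme works, but no computable bound on the convergence rate exists; that gap is exactly what makes Ω random.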
From Heisenberg to Gödel via Chaitin
, 2008
"... In 1927 Heisenberg discovered that the “more precisely the position is determined, the less precisely the momentum is known in this instant, and vice versa”. Four years later Gödel showed that a finitely specified, consistent formal system which is large enough to include arithmetic is incomplete. A ..."
Abstract

Cited by 11 (9 self)
 Add to MetaCart
In 1927 Heisenberg discovered that the “more precisely the position is determined, the less precisely the momentum is known in this instant, and vice versa”. Four years later Gödel showed that a finitely specified, consistent formal system which is large enough to include arithmetic is incomplete. As both results express some kind of impossibility it is natural to ask whether there is any relation between them, and, indeed, this question has been repeatedly asked for a long time. The main interest seems to have been in possible implications of incompleteness to physics. In this note we will take interest in the converse implication and will offer a positive answer to the question: Does uncertainty imply incompleteness? We will show that algorithmic randomness is equivalent to a “formal uncertainty principle” which implies Chaitin’s information-theoretic incompleteness. We also show that the derived uncertainty relation, for many computers, is physical. This fact supports the conjecture that uncertainty implies randomness not only in mathematics, but also in physics.
Relations between varieties of Kolmogorov complexity
 Mathematical Systems Theory
, 1996
"... Abstract. There are several sorts of Kolmogorov complexity, better to say several Kolmogorov complexities: decision complexity, simple complexity, prefix complexity, monotonic complexity, a priori complexity. The last three can and the first two cannot be used for defining randomness of an infinite ..."
Abstract

Cited by 6 (2 self)
 Add to MetaCart
Abstract. There are several sorts of Kolmogorov complexity, or better to say several Kolmogorov complexities: decision complexity, simple complexity, prefix complexity, monotonic complexity, a priori complexity. The last three can and the first two cannot be used for defining randomness of an infinite binary sequence. All those five versions of Kolmogorov complexity were considered, from a unified point of view, in a paper by the first author which appeared in Watanabe’s book [23]. Upper and lower bounds for those complexities and also for their differences were announced in that paper without proofs. (Some of those bounds are mentioned in Section 4.4.5 of [16].) The purpose of this paper (which can be read independently of [23]) is to give proofs for the bounds from [23]. The terminology used in this paper is somewhat nonstandard: we call “Kolmogorov entropy” what is usually called “Kolmogorov complexity.” This is a Moscow tradition suggested by Kolmogorov himself. By this tradition the term “complexity” relates to any mode of description and “entropy” is the complexity related to an optimal mode (i.e., to a mode that, roughly speaking, gives the shortest descriptions).
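Two standard bounds of the kind the paper proves, relating plain (simple) complexity C and prefix complexity K, may serve as orientation; they come from the general Kolmogorov-complexity literature and are not quoted from the paper itself:

```latex
% A prefix description is in particular a description, and a plain
% description can be made prefix-free at small logarithmic cost:
C(x) \le K(x) + O(1),
\qquad
K(x) \le C(x) + 2\log_2 C(x) + O(1).
```

The additive logarithmic slack is essential: the difference K(x) - C(x) is not bounded by any constant.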
Orbit complexity, initial data sensitivity and weakly chaotic dynamical systems
, 2008
"... We give a definition of generalized indicators of sensitivity to initial conditions and orbit complexity (a measure of the information that is necessary to describe the orbit of a given point). The well known RuellePesin and BrinKatok theorems, combined with Brudno’s theorem give a relation betwee ..."
Abstract

Cited by 6 (3 self)
 Add to MetaCart
We give a definition of generalized indicators of sensitivity to initial conditions and orbit complexity (a measure of the information that is necessary to describe the orbit of a given point). The well-known Ruelle-Pesin and Brin-Katok theorems, combined with Brudno’s theorem, give a relation between initial data sensitivity and orbit complexity that is generalized in the present work. The generalized relation implies that the set of points where the sensitivity to initial conditions is more than exponential in all directions is a 0-dimensional set. The generalized relation is then applied to the study of an important example of weakly chaotic dynamics: the Manneville map.
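The classical relation being generalized can be stated as follows (a standard formulation under ergodicity and suitable smoothness assumptions; the notation is ours, not the paper's):

```latex
% Brudno: for an ergodic system (X, T, \mu), the orbit complexity of
% \mu-almost every point equals the Kolmogorov-Sinai entropy
K(x, T) = h_\mu(T) \quad \text{for } \mu\text{-a.e. } x,
% Pesin's formula then ties the entropy to initial-data sensitivity
% through the positive Lyapunov exponents:
h_\mu(T) = \int_X \sum_{\lambda_i(x) > 0} \lambda_i(x) \, d\mu(x).
```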
Compression and diffusion: a joint approach to detect complexity
 Chaos, Solitons & Fractals
, 2003
"... The adoption of the KolmogorovSinai (KS) entropy is becoming a popular research tool among physicists, especially when applied to a dynamical system fitting the conditions of validity of the Pesin theorem. The study of time series that are a manifestation of system dynamics whose rules are either u ..."
Abstract

Cited by 5 (3 self)
 Add to MetaCart
The adoption of the Kolmogorov-Sinai (KS) entropy is becoming a popular research tool among physicists, especially when applied to a dynamical system fitting the conditions of validity of the Pesin theorem. The study of time series that are a manifestation of system dynamics whose rules are either unknown or too complex for a mathematical treatment is still a challenge, since the KS entropy is not computable, in general, in that case. Here we present a plan of action based on the joint action of two procedures, both related to the KS entropy, but compatible with computer implementation through fast and efficient programs. The former procedure, called Compression Algorithm Sensitive To Regularity (CASToRe), establishes the amount of order by the numerical evaluation of algorithmic compressibility. The latter, called Complex Analysis of Sequences via Scaling AND Randomness Assessment (CASSANDRA), establishes the complexity degree through the numerical evaluation of the
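CASToRe and CASSANDRA themselves are not reproduced here; as an illustrative sketch of the underlying idea, measuring order through algorithmic compressibility, one can use an off-the-shelf compressor (zlib, an assumption standing in for the paper's own algorithms):

```python
import random
import zlib

def compression_ratio(s: bytes) -> float:
    """Compressed size over original size: small for ordered (regular)
    data, close to 1 for incompressible (random-looking) data."""
    return len(zlib.compress(s, 9)) / len(s)

# An ordered (periodic) symbolic sequence compresses very well ...
periodic = b"01" * 5000

# ... while a pseudo-random sequence of the same length does not.
random.seed(0)
noisy = bytes(random.getrandbits(8) for _ in range(10000))

# compression_ratio(periodic) is far smaller than compression_ratio(noisy):
# the gap is a computable proxy for the (uncomputable) KS entropy contrast.
```

The ratio is only an upper-bound proxy for algorithmic information content, since no compressor attains Kolmogorov complexity.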
The Manneville map: topological, metric and algorithmic entropy
, 2001
"... We study the Manneville map f(x) = x + x z (mod 1), with z> 1, from a computational point of view, studying the behaviour of the Algorithmic Information Content. In particular, we consider a family of piecewise linear maps that gives examples of algorithmic behaviour ranging from the fully to th ..."
Abstract

Cited by 5 (4 self)
 Add to MetaCart
We study the Manneville map f(x) = x + x^z (mod 1), with z > 1, from a computational point of view, studying the behaviour of the Algorithmic Information Content. In particular, we consider a family of piecewise linear maps that gives examples of algorithmic behaviour ranging from the fully to the mildly chaotic, and show that the Manneville map is a member of this family.
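A minimal sketch of the map in the abstract, iterated numerically together with a binary symbolic itinerary (the coarse partition at 1/2 is our illustrative choice, not necessarily the paper's):

```python
def manneville_orbit(x0: float, z: float, n: int):
    """Iterate the Manneville map f(x) = x + x**z (mod 1), z > 1,
    returning the orbit and its binary symbolic itinerary
    (0 when x < 1/2, 1 otherwise)."""
    xs, symbols = [], []
    x = x0
    for _ in range(n):
        xs.append(x)
        symbols.append(0 if x < 0.5 else 1)
        x = (x + x ** z) % 1.0
    return xs, symbols

# Near x = 0 the map is almost the identity (x + x**z is close to x),
# so orbits starting near 0 stay trapped there for long stretches:
# the intermittent, weakly chaotic regime the paper studies.
orbit, itinerary = manneville_orbit(1e-3, 2.0, 100)
```

For z = 2 and x0 = 1e-3 the orbit creeps up so slowly that all 100 symbols are 0; the long runs of identical symbols are what makes the Algorithmic Information Content grow sub-linearly.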
and local complexity in weakly chaotic dynamical systems, arXiv:math.DS/0210378
"... In a topological dynamical system the complexity of an orbit is a measure of the amount of information (algorithmic information content) that is necessary to describe the orbit. This indicator is invariant up to topological conjugation. We consider this indicator of local complexity of the dynamics ..."
Abstract

Cited by 3 (2 self)
 Add to MetaCart
In a topological dynamical system the complexity of an orbit is a measure of the amount of information (algorithmic information content) that is necessary to describe the orbit. This indicator is invariant up to topological conjugation. We consider this indicator of local complexity of the dynamics and provide different examples of its behavior, showing how it can be useful to characterize various kinds of weakly chaotic dynamics. We also provide criteria to find systems with non-trivial orbit complexity (systems where the description of the whole orbit requires an infinite amount of information). We also consider a global indicator of the complexity of the system. This global indicator generalizes the topological entropy, taking into account systems where the number of essentially different orbits increases less than exponentially. Then we prove that if the system is constructive (roughly speaking: if the map can be defined up to any given accuracy using a
Measure of uncertainty and information
 Imprecise Probability Project, 1999 (http://ippserv.rug.ac.be/home/ipp.html)
, 1999
"... Abstract. This contribution overviews the approaches, results and history of attempts at measuring uncertainty and information in the various theories of imprecise probabilities. The main focus, however, is on the theory of belief functions (or the DempsterShafer theory) [62] and the possibility th ..."
Abstract

Cited by 3 (0 self)
 Add to MetaCart
Abstract. This contribution overviews the approaches, results and history of attempts at measuring uncertainty and information in the various theories of imprecise probabilities. The main focus, however, is on the theory of belief functions (or the Dempster-Shafer theory) [62] and possibility theory [7], as most of the development so far has happened there. Due to the limited space I am focusing on the main ideas and point to references for details. There are
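A minimal sketch of the belief-function machinery the survey covers: belief and plausibility induced by a mass assignment (the frame and numbers below are hypothetical, chosen for illustration, not taken from the survey):

```python
def belief(mass: dict, A: frozenset) -> float:
    """Bel(A): total mass of the focal sets contained in A."""
    return sum(m for B, m in mass.items() if B <= A)

def plausibility(mass: dict, A: frozenset) -> float:
    """Pl(A): total mass of the focal sets that intersect A."""
    return sum(m for B, m in mass.items() if B & A)

# Hypothetical frame {a, b, c}; mass split between the evidence "it is a"
# and complete ignorance (mass on the whole frame):
mass = {frozenset({"a"}): 0.6, frozenset({"a", "b", "c"}): 0.4}
A = frozenset({"a", "b"})
# Bel(A) = 0.6 (only {a} lies inside A), Pl(A) = 1.0 (both focal sets
# meet A); the gap Pl(A) - Bel(A) quantifies the imprecision.
```

Probability theory is recovered when every focal set is a singleton, in which case Bel and Pl coincide.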
Looking From the Inside and From the Outside
, 1998
"... Many times in mathematics there is a natural dichotomy between describing some object from the inside and from the outside. Imagine algebraic varieties for instance; they can be described from the outside as solution sets of polynomial equations, but one can also try to understand how it is for ..."
Abstract

Cited by 1 (0 self)
 Add to MetaCart
Many times in mathematics there is a natural dichotomy between describing some object from the inside and from the outside. Imagine algebraic varieties for instance; they can be described from the outside as solution sets of polynomial equations, but one can also try to understand how it is for actual points to move around inside them, perhaps to parameterize them in some way. The concept of formal proofs has the interesting feature that it provides opportunities for both perspectives. The inner perspective has been largely overlooked, but in fact lengths of proofs lead to new ways to measure the information content of mathematical objects. The disparity between minimal lengths of proofs with and without "lemmas" provides an indication of internal symmetry of mathematical objects and their descriptions.
Grigolini P., "Vortex Dynamics in evolutive flows: a weakly chaotic phenomenon"
 Physical Review E
, 2003
"... ..."