Results 1 – 8 of 8
Medvedev degrees of 2-dimensional subshifts of finite type. Ergodic Theory and Dynamical Systems
Abstract

Cited by 18 (9 self)
In this paper we apply some fundamental concepts and results from recursion theory in order to obtain an apparently new counterexample in symbolic dynamics. Two sets X and Y are said to be Medvedev equivalent if there exist partial recursive functionals from X into Y and vice versa. The Medvedev degree of X is the equivalence class of X under Medvedev equivalence. There is an extensive recursion-theoretic literature on the lattice of Medvedev degrees of nonempty Π^0_1 subsets of {0,1}^N. This lattice is known as P_s. We prove that P_s consists precisely of the Medvedev degrees of 2-dimensional subshifts of finite type. We use this result to obtain an infinite collection of 2-dimensional subshifts of finite type which are, in a certain sense, mutually incompatible. Definition 1. Let A be a finite set of symbols. The full 2-dimensional shift on A is the dynamical system consisting of the natural action of Z^2 on the compact set A^(Z^2). A 2-dimensional subshift is a nonempty closed set X ⊆ A^(Z^2) which is invariant under the action of Z^2. A 2-dimensional subshift X is said to be of finite type if it is defined by a finite set of forbidden configurations. An interesting paper on 2-dimensional subshifts of finite type is Mozes [22]. A standard reference for the 1-dimensional case is the book of Lind/Marcus [20], which also includes an appendix [20, §13.10] on the 2-dimensional case.
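The "finite set of forbidden configurations" in Definition 1 can be made concrete with a small sketch. The pattern encoding, helper function, and example constraint below are illustrative assumptions, not taken from the paper: a finite window of A^(Z^2) is a dict from coordinates to symbols, and a forbidden configuration is a dict from relative offsets to symbols.

```python
# Illustrative sketch (not from the paper): check whether a finite window
# of a 2-dimensional configuration avoids a finite set of forbidden
# configurations, the defining condition for a subshift of finite type.

def avoids_forbidden(pattern, forbidden):
    """pattern: dict mapping (i, j) -> symbol (a finite window of A^(Z^2)).
    forbidden: list of dicts mapping relative offsets (di, dj) -> symbol.
    Returns True iff no forbidden configuration occurs in the pattern."""
    for f in forbidden:
        for (i, j) in pattern:
            # Does f occur with its origin translated to (i, j)?
            if all(pattern.get((i + di, j + dj)) == s
                   for (di, dj), s in f.items()):
                return False
    return True

# Example constraint: forbid two horizontally adjacent 1's.
forbidden = [{(0, 0): 1, (0, 1): 1}]
ok = {(0, 0): 1, (0, 1): 0, (1, 0): 0, (1, 1): 1}
bad = {(0, 0): 1, (0, 1): 1}
print(avoids_forbidden(ok, forbidden))   # True
print(avoids_forbidden(bad, forbidden))  # False
```

A subshift of finite type is then the set of full configurations all of whose finite windows pass this check.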
Algorithmically Independent Sequences
, 2008
Abstract

Cited by 3 (2 self)
Two objects are independent if they do not affect each other. Independence is well understood in classical information theory, but less so in algorithmic information theory. Working in the framework of algorithmic information theory, the paper proposes two types of independence for arbitrary infinite binary sequences and studies their properties. Our two proposed notions of independence have some of the intuitive properties that one naturally expects. For example, for every sequence x, the set of sequences that are independent of x (in the weaker of the two senses) has measure one. For both notions of independence we investigate to what extent pairs of independent sequences can be effectively constructed via Turing reductions (from one or more input sequences). In this respect, we prove several impossibility results. For example, it is shown that there is no effective way of producing, from an arbitrary sequence with positive constructive Hausdorff dimension, two sequences that are independent (even in the weaker type of independence) and have superlogarithmic complexity. Finally, a few conjectures and open questions are discussed.
Symbolic dynamics: entropy = dimension = complexity
Abstract

Cited by 2 (0 self)
Let d be a positive integer. Let G be the additive monoid N^d or the additive group Z^d. Let A be a finite set of symbols. The shift action of G on A^G is given by S_g(x)(h) = x(g + h) for all g, h ∈ G and all x ∈ A^G. A G-subshift is defined to be a nonempty closed set X ⊆ A^G such that S_g(x) ∈ X for all g ∈ G and all x ∈ X. Given a G-subshift X, the topological entropy ent(X) is defined as usual [31]. The standard metric on A^G is defined by ρ(x, y) = 2^(−|F_n|) where n is as large as possible such that x↾F_n = y↾F_n. Here F_n = {0, 1, ..., n}^d if G = N^d, and F_n = {−n, ..., −1, 0, 1, ..., n}^d if G = Z^d. For any X ⊆ A^G the Hausdorff dimension dim(X) and the effective Hausdorff dimension effdim(X) are defined as usual [14, 26, 27] with respect to the standard metric. It is well known that effdim(X) = sup_{x∈X} liminf_n K(x↾F_n)/|F_n| where K denotes Kolmogorov complexity [9]. If X is a G-subshift, we prove that ent(X) = dim(X) = effdim(X), and ent(X) ≥ limsup_n K(x↾F_n)/|F_n| for all x ∈ X, and ent(X) = lim_n K(x↾F_n)/|F_n| for some x ∈ X.
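The standard metric ρ(x, y) = 2^(−|F_n|) from the abstract can be computed directly on finite truncations. The sketch below is my own illustration for the case G = N^d, with configurations given as Python callables and an arbitrary truncation bound; none of it is from the paper.

```python
# Illustrative sketch: the standard metric on A^G for G = N^d, where
# rho(x, y) = 2^(-|F_n|) with n as large as possible such that x and y
# agree on F_n = {0, ..., n}^d, so |F_n| = (n + 1)^d.

from itertools import product

def rho(x, y, d, max_n=50):
    """Distance between configurations x, y : N^d -> A, given as callables.
    If x and y disagree already on F_0 = {(0, ..., 0)}, return 1.0."""
    if any(x(g) != y(g) for g in product(range(1), repeat=d)):
        return 1.0
    n = 0
    while n + 1 <= max_n and all(x(g) == y(g)
                                 for g in product(range(n + 2), repeat=d)):
        n += 1
    return 2.0 ** (-((n + 1) ** d))

x = lambda g: 0                       # the all-zero configuration
y = lambda g: 0 if sum(g) < 3 else 1  # flips once the coordinates sum to 3
print(rho(x, y, d=2))  # 2**-4 = 0.0625: agree on F_1 but not on F_2
```

Identical configurations get a distance of 2^(−(max_n + 1)^d) rather than 0, since only finitely many windows are compared; a genuine metric on A^G would take the limit.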
Impossibility of independence amplification in Kolmogorov Complexity Theory
Abstract

Cited by 2 (2 self)
The paper studies randomness extraction from sources with bounded independence and the issue of independence amplification of sources, using the framework of Kolmogorov complexity. The dependency of strings x and y is dep(x, y) = max{C(x) − C(x | y), C(y) − C(y | x)}, where C(·) denotes the Kolmogorov complexity. It is shown that there exists a computable Kolmogorov extractor f such that, for any two n-bit strings with complexity s(n) and dependency α(n), it outputs a string of length s(n) with complexity s(n) − α(n) conditioned on either of the input strings. It is proven that these are the optimal parameters a Kolmogorov extractor can achieve. It is shown that independence amplification cannot be effectively realized. Specifically, if (after excluding a trivial case) there exist computable functions f1 and f2 such that dep(f1(x, y), f2(x, y)) ≤ β(n) for all n-bit strings x and y with dep(x, y) ≤ α(n), then β(n) ≥ α(n) − O(log n).
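The shape of dep(x, y) can be played with concretely, with a heavy caveat: Kolmogorov complexity C(·) is uncomputable, so the sketch below substitutes a crude compression-based proxy (zlib-compressed length) purely for illustration, and approximates the conditional C(x | y) by C(yx) − C(y). Nothing here is from the paper; the proxy and examples are my own.

```python
# Illustrative only: zlib length as a stand-in for the (uncomputable)
# Kolmogorov complexity C(.), mirroring the form of
#   dep(x, y) = max{C(x) - C(x|y), C(y) - C(y|x)}.

import zlib

def C(s: bytes) -> int:
    """Crude complexity proxy: length of the zlib-compressed string."""
    return len(zlib.compress(s, 9))

def dep(x: bytes, y: bytes) -> int:
    """Dependency, with C(x|y) approximated by C(yx) - C(y)."""
    c_x_given_y = C(y + x) - C(y)
    c_y_given_x = C(x + y) - C(x)
    return max(C(x) - c_x_given_y, C(y) - c_y_given_x)

a = b"abracadabra" * 40                               # highly compressible
c = bytes((i * 131 + 17) % 256 for i in range(440))   # unrelated-looking
print(dep(a, a), dep(a, c))  # a is maximally dependent on itself
```

With the real C(·) these quantities are defined only up to additive O(log n) terms, which is exactly the slack appearing in the impossibility bound β(n) ≥ α(n) − O(log n).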
Randomness, Computation and Mathematics
Abstract

Cited by 1 (0 self)
This article examines some of the recent advances in our understanding of algorithmic randomness. It also discusses connections with various areas of mathematics, computer science and other areas of science. Some questions and speculations will be discussed.
Propagation of partial randomness
Abstract

Cited by 1 (1 self)
Let f be a computable function from finite sequences of 0's and 1's to real numbers. We prove that strong f-randomness implies strong f-randomness relative to a PA-degree. We also prove: if X is strongly f-random and Turing reducible to Y where Y is Martin-Löf random relative to Z, then X is strongly f-random relative to Z. In addition, we prove analogous propagation results for other notions of partial randomness, including non-K-triviality and autocomplexity. We prove that f-randomness relative to a PA-degree implies strong f-randomness, but f-randomness does not imply f-randomness relative to a PA-degree. Keywords: partial randomness, effective Hausdorff dimension, Martin-Löf randomness, Kolmogorov complexity, models of arithmetic.
Computability Theory, Algorithmic Randomness and Turing’s Anticipation
Abstract
This article looks at the applications of Turing’s Legacy in computation, particularly to the theory of algorithmic randomness, where classical mathematical concepts such as measure could be made computational. It also traces Turing’s anticipation of this theory in an early manuscript.