Results 1–8 of 8
Two sources are better than one for increasing the Kolmogorov complexity of infinite sequences
, 2007
"... ..."
Error-correcting codes and phase transitions
"... Abstract. The theory of errorcorrecting codes is concerned with constructing codes that optimize simultaneously transmission rate and relative minimum distance. These conflicting requirements determine an asymptotic bound, which is a continuous curve in the space of parameters. The main goal of thi ..."
Abstract

Cited by 4 (2 self)
Abstract. The theory of error-correcting codes is concerned with constructing codes that simultaneously optimize transmission rate and relative minimum distance. These conflicting requirements determine an asymptotic bound, which is a continuous curve in the space of parameters. The main goal of this paper is to relate the asymptotic bound to phase diagrams of quantum statistical mechanical systems. We first identify the code parameters with Hausdorff and von Neumann dimensions, by considering fractals consisting of infinite sequences of code words. We then construct operator algebras associated to individual codes. These are Toeplitz algebras with a time evolution for which the KMS state at critical temperature gives the Hausdorff measure on the corresponding fractal. We extend this construction to algebras associated to limit points of codes, with non-uniform multifractal measures, and to tensor products over varying parameters.
Schnorr dimension
, Computational Complexity 2001, 210–217, IEEE Computer Society, 2001
"... ABSTRACT. Following Lutz’s approach to effective (constructive) dimension, we define a notion of dimension for individual sequences based on Schnorr’s concept(s) of randomness. In contrast to computable randomness and Schnorr randomness, the dimension concepts defined via computable martingales and ..."
Abstract

Cited by 3 (2 self)
ABSTRACT. Following Lutz’s approach to effective (constructive) dimension, we define a notion of dimension for individual sequences based on Schnorr’s concept(s) of randomness. In contrast to computable randomness and Schnorr randomness, the dimension concepts defined via computable martingales and Schnorr tests coincide, i.e. the Schnorr Hausdorff dimension of a sequence always equals its computable Hausdorff dimension. Furthermore, we give a machine characterization of Schnorr dimension, based on prefix-free machines whose domain has computable measure. Finally, we show that there exist computably enumerable sets which are Schnorr (computably) irregular: while every c.e. set has Schnorr Hausdorff dimension 0, there are c.e. sets of computable packing dimension 1, a property impossible in the case of effective (constructive) dimension, due to Barzdiņš’ Theorem. In fact, we prove that every hyperimmune Turing degree contains a set of computable packing dimension 1.
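The abstract above is built on computable martingales. As a minimal sketch of the standard definition (not code from the paper; the betting strategy here is purely illustrative), a martingale is a function d on binary strings with starting capital d(empty) and the fairness condition d(w) = (d(w0) + d(w1)) / 2:

```python
def d(w: str) -> float:
    """A simple computable martingale: starting capital 1, it bets half of
    its current capital that the next bit is 0, so the capital is
    multiplied by 1.5 on each 0 and by 0.5 on each 1."""
    capital = 1.0
    for bit in w:
        capital *= 1.5 if bit == "0" else 0.5
    return capital

# The defining fairness condition d(w) = (d(w0) + d(w1)) / 2:
w = "0110"
assert abs(d(w) - (d(w + "0") + d(w + "1")) / 2) < 1e-12

# d succeeds on the all-zeros sequence: its capital grows without bound.
print(d("0" * 20))  # 1.5 ** 20, about 3325.26
```

Effective dimension notions scale this success criterion: instead of asking whether some martingale's capital merely grows without bound, one asks how fast it can be made to grow along the sequence.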
Algorithmically Independent Sequences
, 2008
"... Two objects are independent if they do not affect each other. Independence is wellunderstood in classical information theory, but less in algorithmic information theory. Working in the framework of algorithmic information theory, the paper proposes two types of independence for arbitrary infinite bi ..."
Abstract

Cited by 2 (2 self)
Two objects are independent if they do not affect each other. Independence is well-understood in classical information theory, but less so in algorithmic information theory. Working in the framework of algorithmic information theory, the paper proposes two types of independence for arbitrary infinite binary sequences and studies their properties. Our two proposed notions of independence have some of the intuitive properties that one naturally expects. For example, for every sequence x, the set of sequences that are independent of x (in the weaker of the two senses) has measure one. For both notions of independence we investigate to what extent pairs of independent sequences can be effectively constructed via Turing reductions (from one or more input sequences). In this respect, we prove several impossibility results. For example, it is shown that there is no effective way of producing, from an arbitrary sequence with positive constructive Hausdorff dimension, two sequences that are independent (even in the weaker type of independence) and have super-logarithmic complexity. Finally, a few conjectures and open questions are discussed.
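The intuition behind algorithmic independence — two strings are independent when neither helps describe the other — has a crude, computable analogue using a real compressor as a stand-in for Kolmogorov complexity (this is the familiar normalized-compression-distance heuristic, not the paper's definition; all names below are illustrative):

```python
import random
import zlib

def clen(b: bytes) -> int:
    """Compressed length in bytes: a computable upper bound (up to the
    compressor's overhead) standing in for Kolmogorov complexity."""
    return len(zlib.compress(b, 9))

random.seed(0)
x = bytes(random.getrandbits(8) for _ in range(4096))  # incompressible-looking string
y = bytes(random.getrandbits(8) for _ in range(4096))  # drawn independently of x
z = x                                                  # maximally dependent on x

# If y shares no information with x, compressing them together saves
# almost nothing: clen(x + y) is close to clen(x) + clen(y).
saving_indep = clen(x) + clen(y) - clen(x + y)
# If z duplicates x, the joint description is far shorter than the sum.
saving_dep = clen(x) + clen(z) - clen(x + z)
print(saving_indep, saving_dep)
```

Running this shows saving_dep in the thousands of bytes while saving_indep stays near zero, mirroring (very loosely) the mutual-information view of dependence that the paper makes precise for infinite sequences.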
On Oscillation-free ε-random Sequences
, 2008
"... In this paper we discuss three notions of partial randomness or εrandomness. εrandomness should display all features of randomness in a scaled down manner. However, as Reimann and Stephan [15] proved, Tadaki [22] and Calude et al. [3] proposed at least three different concepts of partial randomnes ..."
Abstract

Cited by 1 (1 self)
In this paper we discuss three notions of partial randomness, or ε-randomness. ε-randomness should display all features of randomness in a scaled-down manner. However, as Reimann and Stephan [15] proved, the notions proposed by Tadaki [22] and Calude et al. [3] amount to at least three different concepts of partial randomness. We show that all of them satisfy the natural requirement that any ε-non-null set contains an ε-random infinite word. This allows us to focus our investigations on the strongest one, which is based on a priori complexity. We investigate this concept of partial randomness and show that it admits, similarly to the random infinite words, oscillation-free (w.r.t. a priori complexity) ε-random infinite words, provided ε is a computable number. The proof uses the dilution principle. Alternatively, for certain sets of infinite words (ω-languages) we show that their most complex infinite words are oscillation-free ε-random. Here the parameter ε is also computable and depends on the set chosen.