Results 1–10 of 24
Two sources are better than one for increasing the Kolmogorov complexity of infinite sequences
, 2007
EXTRACTING INFORMATION IS HARD: A TURING DEGREE OF NONINTEGRAL EFFECTIVE HAUSDORFF DIMENSION
Cited by 16 (0 self)
Abstract. We construct a Δ^0_2 infinite binary sequence with effective Hausdorff dimension 1/2 that does not compute a sequence of higher dimension. Introduced by Lutz, effective Hausdorff dimension can be viewed as a measure of the information density of a sequence. In particular, the dimension of A ∈ 2^ω is the lim inf of the ratio between the information content and length of initial segments of A. Thus the main result demonstrates that it is not always possible to extract information from a partially random source to produce a sequence that has higher information density.
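The lim inf characterization above suggests a crude but computable experiment. True Kolmogorov complexity is uncomputable, but the compressed length of a prefix is a computable upper bound on its information content, so compressed length over raw length gives an upper-bound proxy for information density. A minimal sketch, assuming zlib as the stand-in for C(·); the function names and the sampling step are illustrative, not from the paper:

```python
import random
import zlib


def density(prefix: bytes) -> float:
    """Compressed length over raw length: a computable upper-bound
    proxy for the information density C(prefix)/|prefix|."""
    return len(zlib.compress(prefix, 9)) / len(prefix)


def dimension_estimate(seq: bytes, step: int = 1024) -> float:
    """Finite stand-in for the lim inf over prefixes: the minimum
    density over sampled prefix lengths."""
    return min(density(seq[:n]) for n in range(step, len(seq) + 1, step))


# Example: an all-zero sequence has density near 0, while
# pseudorandom bytes are incompressible and score near 1.
random.seed(0)
LOW = dimension_estimate(b"\x00" * 8192)            # near 0
HIGH = dimension_estimate(random.randbytes(8192))   # near 1
```

The estimate only upper-bounds the true dimension: a sequence zlib cannot compress may still be highly compressible by a better algorithm, which mirrors the gap between compression and Kolmogorov complexity.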
Kolmogorov Complexity in Randomness Extraction
Cited by 13 (0 self)
We clarify the role of Kolmogorov complexity in the area of randomness extraction. We show that a computable function is an almost randomness extractor if and only if it is a Kolmogorov complexity extractor, thus establishing a fundamental equivalence between two forms of extraction studied in the literature: Kolmogorov extraction and randomness extraction. We present a distribution M_k based on Kolmogorov complexity that is complete for randomness extraction in the sense that a computable function is an almost randomness extractor if and only if it extracts randomness from M_k.
EXTRACTING THE KOLMOGOROV COMPLEXITY OF STRINGS AND SEQUENCES FROM SOURCES WITH LIMITED INDEPENDENCE
Cited by 12 (7 self)
An infinite binary sequence has randomness rate σ if, for almost every n, the Kolmogorov complexity of its prefix of length n is at least σn. It is known that for every rational σ ∈ (0, 1), on one hand, there exist sequences with randomness rate σ that cannot be effectively transformed into a sequence with randomness rate higher than σ and, on the other hand, any two independent sequences with randomness rate σ can be transformed into a sequence with randomness rate higher than σ. We show that the latter result holds even if the two input sequences have linear dependency (which, informally speaking, means that all prefixes of length n of the two sequences have in common a constant fraction of their information). A similar problem is studied for finite strings. It is shown that from any two strings with sufficiently large Kolmogorov complexity and sufficiently small dependence, one can effectively construct a string that is random even conditioned on either of the input strings.
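As a toy illustration of why a second source helps (not the paper's actual construction, which handles sources with linear dependency): if x is random only on its first half and y only on its second half, each alone has randomness rate about 1/2, but their bytewise XOR is random everywhere. A hedged sketch, again using zlib-compressed length as a computable proxy for the randomness rate; the helper names are illustrative:

```python
import random
import zlib


def rate(s: bytes) -> float:
    """Compression-based proxy for the randomness rate of s."""
    return len(zlib.compress(s, 9)) / len(s)


def xor(a: bytes, b: bytes) -> bytes:
    """Bytewise XOR of two equal-length byte strings."""
    return bytes(u ^ v for u, v in zip(a, b))


random.seed(1)
half = 4096
x = random.randbytes(half) + b"\x00" * half   # random on first half only
y = b"\x00" * half + random.randbytes(half)   # random on second half only
z = xor(x, y)                                 # random on both halves
```

Note the two toy sources here are fully independent; the point of the paper is that a combination step still works when the sources share a constant fraction of their information, which this sketch does not demonstrate.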
Impossibility of independence amplification in Kolmogorov Complexity Theory
Cited by 7 (4 self)
The paper studies randomness extraction from sources with bounded independence and the issue of independence amplification of sources, using the framework of Kolmogorov complexity. The dependency of strings x and y is dep(x, y) = max{C(x) − C(x | y), C(y) − C(y | x)}, where C(·) denotes the Kolmogorov complexity. It is shown that there exists a computable Kolmogorov extractor f such that, for any two n-bit strings with complexity s(n) and dependency α(n), it outputs a string of length s(n) with complexity s(n) − α(n) conditioned by any one of the input strings. It is proven that these are the optimal parameters a Kolmogorov extractor can achieve. It is shown that independence amplification cannot be effectively realized. Specifically, if (after excluding a trivial case) there exist computable functions f1 and f2 such that dep(f1(x, y), f2(x, y)) ≤ β(n) for all n-bit strings x and y with dep(x, y) ≤ α(n), then β(n) ≥ α(n) − O(log n).
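The dependency measure dep(x, y) can be estimated computably by replacing C(·) with compressed length and approximating the conditional complexity via symmetry of information, C(x | y) ≈ C(xy) − C(y); under that substitution both branches of the max coincide, and dep reduces to C(x) + C(y) − C(xy). A sketch under those assumptions, with zlib as a crude stand-in for the paper's (uncomputable) C:

```python
import random
import zlib


def C(s: bytes) -> int:
    """Compressed length: a computable upper bound for Kolmogorov complexity."""
    return len(zlib.compress(s, 9))


def dep_estimate(x: bytes, y: bytes) -> int:
    """max{C(x) - C(x|y), C(y) - C(y|x)} with C(x|y) approximated by
    C(xy) - C(y); both branches then equal C(x) + C(y) - C(xy)."""
    cxy = C(x + y)
    return max(C(x) - (cxy - C(y)), C(y) - (cxy - C(x)))


random.seed(2)
x = random.randbytes(4096)
y = random.randbytes(4096)  # generated independently of x
# dep_estimate(x, x) is large: y carries all of x's information.
# dep_estimate(x, y) is near zero: independent sources share little.
```

The approximation inherits zlib's blind spots (it only finds shared information that literal substring matching can see), so it understates dependency between, say, a string and its encryption.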
Counting dependent and independent strings
 In Proceedings 35th MFCS
, 2010
Cited by 4 (2 self)
Abstract. We derive quantitative results regarding sets of n-bit strings that have different dependency or independency properties. Let C(x) be the Kolmogorov complexity of the string x. A string y has α dependency with a string x if C(y) − C(y | x) ≥ α. A set of strings {x_1, ..., x_t} is pairwise α-independent if for all i ≠ j, C(x_i) − C(x_i | x_j) ≤ α. A tuple of strings (x_1, ..., x_t) is mutually α-independent if C(x_{π(1)} ... x_{π(t)}) ≥ C(x_1) + ... + C(x_t) − α, for every permutation π of [t]. We show that:
– For every n-bit string x with complexity C(x) ≥ α + 7 log n, the set of n-bit strings that have α dependency with x has size at least (1/poly(n)) · 2^(n−α). In case α is computable from n and C(x) ≥ α + 12 log n, the size of the same set is at least (1/C) · 2^(n−α) − poly(n) · 2^α, for some positive constant C.
– There exists a set of n-bit strings A of size poly(n) · 2^α such that any n-bit string has α-dependency with some string in A.
– If the set of n-bit strings {x_1, ..., x_t} is pairwise α-independent, then t ≤ poly(n) · 2^α. This bound is tight within a poly(n) factor, because, for every n, there exists a set of n-bit strings {x_1, ..., x_t} that is pairwise α-independent with t = (1/poly(n)) · 2^α (for all α ≥ 5 log n).
– If the tuple of n-bit strings (x_1, ..., x_t) is mutually α-independent, then t ≤ poly(n) · 2^α (for all α ≥ 7 log n + 6).
Dimension extractors
, 2006
Cited by 2 (1 self)
A dimension extractor is an algorithm designed to increase the effective dimension – i.e., the computational information density – of an infinite sequence. A constructive dimension extractor is exhibited by showing that every sequence of positive constructive dimension is Turing equivalent to a sequence of constructive strong dimension arbitrarily close to 1. Similar results are shown for computable dimension and truth-table equivalence, and for pispace dimension and pispace Turing equivalence, where pispace represents Lutz’s hierarchy of superpolynomial space bounds. Thus, with respect to constructive, computable, and pispace information density, any sequence in which almost every prefix has information density bounded away from zero can be used to compute a sequence in which infinitely many prefixes have information density that is nearly maximal. In the constructive dimension case, the reduction is uniform with respect to the input sequence: a single oracle Turing machine, taking as input a rational upper bound on the dimension of the input sequence, works for every input sequence of positive constructive dimension. As an application, the resource-bounded extractors are used to characterize the computable dimension of individual sequences in terms of compression via truth-table reductions and to characterize the pispace dimension of individual sequences in terms of compression via pispace-bounded Turing reductions, in analogy to previously known results connecting effective dimensions to compression with effective reductions.
Pushdown dimension
 Theoretical Computer Science
, 2007
Cited by 1 (0 self)
Abstract. Resource-bounded dimension is a notion of computational information density of infinite sequences based on computationally bounded gamblers. This paper develops the theory of pushdown dimension and explores its relationship with finite-state dimension. The pushdown dimension of any sequence is trivially bounded above by its finite-state dimension, since a pushdown gambler can simulate any finite-state gambler. We show that for every rational 0 < d < 1, there exists a sequence with finite-state dimension d whose pushdown dimension is at most d/2. This provides a stronger quantitative analogue of the well-known fact that pushdown automata decide strictly more languages than finite-state automata.
EFFECTIVE PACKING DIMENSION AND TRACEABILITY
The concern of this paper is with effective packing dimension. This concept can be traced back to the work of Borel and Lebesgue, who studied measure as a way of specifying the size of sets. Carathéodory later generalized Lebesgue measure to the n-dimensional Euclidean space, and this was taken further by Hausdorff [Hau19]