Results 1–10 of 15
The Dimensions of Individual Strings and Sequences
 Information and Computation
, 2003
Abstract

Cited by 99 (11 self)
A constructive version of Hausdorff dimension is developed using constructive supergales, which are betting strategies that generalize the constructive supermartingales used in the theory of individual random sequences. This constructive dimension is used to assign every individual (infinite, binary) sequence S a dimension, which is a real number dim(S) in the interval [0, 1]. Sequences that ...
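The gale condition behind this dimension assignment can be illustrated concretely. The following is a minimal sketch (not the paper's construction; the names `make_greedy_sgale`, `d`, and `s` are illustrative): an s-gale is a betting function d satisfying d(w) = 2^(−s)·(d(w0) + d(w1)), and it succeeds on a sequence if its capital is unbounded along the sequence's prefixes.

```python
# Toy s-gale that stakes all capital on the next bit being 0.
# Hedged sketch: illustrative names, not Lutz's actual construction.

def make_greedy_sgale(s):
    """Return an s-gale d(w) that bets everything on 0 at each step."""
    def d(w):
        if '1' in w:
            return 0.0            # all capital lost after any 1 appears
        return 2 ** (s * len(w))  # capital grows by factor 2**s per correct bit
    return d

s = 0.5
d = make_greedy_sgale(s)

# Verify the s-gale condition d(w) = 2**(-s) * (d(w0) + d(w1)) on a few strings.
for w in ['', '0', '00', '01', '1']:
    assert abs(d(w) - 2 ** (-s) * (d(w + '0') + d(w + '1'))) < 1e-9

# Even for s = 0.5 < 1, capital on prefixes of 000... grows without bound,
# consistent with the all-zero sequence having constructive dimension 0.
print(d('0' * 20))  # 2**10 = 1024.0
```

Since such a d succeeds for every s > 0, the infimum of succeeding s values (the dimension) is 0 for the all-zero sequence, matching the intuition that it carries no information.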
Equivalence of Measures of Complexity Classes
Abstract

Cited by 73 (23 self)
The resource-bounded measures of complexity classes are shown to be robust with respect to certain changes in the underlying probability measure. Specifically, for any real number δ > 0, any uniformly polynomial-time computable sequence β = (β_0, β_1, β_2, ...) of real numbers (biases) β_i ∈ [δ, 1 − δ], and any complexity class C (such as P, NP, BPP, P/Poly, PH, PSPACE, etc.) that is closed under positive, polynomial-time, truth-table reductions with queries of at most linear length, it is shown that the following two conditions are equivalent. (1) C has p-measure 0 (respectively, measure 0 in E, measure 0 in E_2) relative to the coin-toss probability measure given by the sequence β. (2) C has p-measure 0 (respectively, measure 0 in E, measure 0 in E_2) relative to the uniform probability measure. The proof introduces three techniques that may be useful in other contexts, namely, (i) the transformation of an efficient martingale for one probability measu...
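The biased coin-toss measure in condition (1) is easy to make concrete. A minimal sketch (the function name `cylinder_prob` is illustrative, not from the paper): the measure of the cylinder of a finite string w is the product of the per-bit probabilities, where β_i is the probability that bit i is 1.

```python
# Hedged sketch of a biased coin-toss measure on binary strings.

def cylinder_prob(w, beta):
    """Probability of the cylinder of string w under bias sequence beta,
    where beta[i] is the probability that bit i equals 1."""
    p = 1.0
    for i, bit in enumerate(w):
        p *= beta[i] if bit == '1' else 1.0 - beta[i]
    return p

# The uniform measure is the special case beta_i = 1/2 for every i.
uniform = [0.5] * 4
assert cylinder_prob('0110', uniform) == 0.5 ** 4

# A delta-bounded bias sequence with beta_i in [delta, 1 - delta], delta = 0.25:
beta = [0.25, 0.75, 0.25, 0.75]
print(cylinder_prob('11', beta))  # 0.25 * 0.75 = 0.1875
```

The δ-bound keeps every bias away from 0 and 1, which is what makes the biased and uniform measures mutually comparable in the equivalence above.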
The Fastest And Shortest Algorithm For All Well-Defined Problems
, 2002
Abstract

Cited by 43 (7 self)
An algorithm M is described that solves any well-defined problem p as quickly as the fastest algorithm computing a solution to p, save for a factor of 5 and low-order additive terms. M optimally distributes resources between the execution of provably correct p-solving programs and an enumeration of all proofs, including relevant proofs of program correctness and of time bounds on program runtimes. M avoids Blum's speedup theorem by ignoring programs without correctness proof. M has broader applicability and can be faster than Levin's universal search, the fastest method for inverting functions save for a large multiplicative constant. An extension of Kolmogorov complexity and two novel natural measures of function complexity are used to show that the most efficient program computing some function f is also among the shortest programs provably computing f.
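The Levin-style time-sharing that M refines can be sketched in a few lines. This is a hedged toy analogue only (real universal search enumerates all programs; the two hand-made generators and the names `universal_search`, `slow_solver`, `fast_solver` are purely illustrative): candidate program i is interleaved with the others so that it receives roughly a 2^(−(i+1)) share of the total computation time.

```python
# Toy analogue of Levin-style universal search over a finite, hand-made
# list of candidate "programs" (Python generators yielding None per step,
# or a result tuple when done).  Illustrative sketch only.

def slow_solver(target):
    for n in range(10 ** 9):      # counts up to the answer one step at a time
        yield None
        if n == target:
            yield ('answer', n)

def fast_solver(target):
    yield ('answer', target)      # solves immediately

def universal_search(gens):
    gens = list(gens)
    phase = 0
    while True:
        phase += 1
        for i in range(min(phase, len(gens))):
            if gens[i] is None:
                continue
            # program i gets 2**(phase-1-i) steps this phase, i.e. an
            # overall time share proportional to 2**-(i+1)
            for _ in range(2 ** (phase - 1 - i)):
                try:
                    out = next(gens[i])
                except StopIteration:
                    gens[i] = None   # program i halted without an answer
                    break
                if out is not None:
                    return out

print(universal_search([slow_solver(41), fast_solver(41)]))  # ('answer', 41)
```

Because each program keeps a fixed share of the time, the total running time exceeds that of the best candidate only by a multiplicative constant, which is the large constant the abstract says M improves to a factor of 5 plus additive terms.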
New Error Bounds for Solomonoff Prediction
 Journal of Computer and System Sciences
, 1999
Abstract

Cited by 22 (16 self)
Several new relations between universal Solomonoff sequence prediction and informed prediction and general probabilistic prediction schemes will be proved. Among others, they show that the number of errors in Solomonoff prediction is finite for computable prior probability, if finite in the informed case, where the prior is known. Deterministic variants will also be studied. The most interesting result is that the deterministic variant of Solomonoff prediction is optimal compared to any other probabilistic or deterministic prediction scheme apart from additive square root corrections only. This makes it well suited even for difficult prediction problems, where it does not suffice when the number of errors is minimal to within some factor greater than one. Solomonoff's original bound and the ones presented here complement each other in a useful way.
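The mechanism behind such error bounds is that a Bayes mixture's next-bit prediction converges to the true environment's. A hedged finite toy analogue (illustrative only: Solomonoff's mixture runs over all enumerable semimeasures with weights 2^(−K(·)), whereas here `thetas` and `weights` are a tiny hand-made Bernoulli class with a uniform prior):

```python
# Toy Bayes-mixture predictor over a finite class of Bernoulli(theta)
# models.  Illustrative analogue of the universal mixture, not it.

thetas = [0.1, 0.5, 0.9]          # hand-made model class
weights = [1 / 3] * 3             # prior weights (2**-K(model) in the real setting)

def mixture_predict(history):
    """Mixture probability that the next bit is 1, given observed bits."""
    post = []
    for th, w in zip(thetas, weights):
        like = 1.0
        for b in history:
            like *= th if b == 1 else 1 - th
        post.append(w * like)     # unnormalized posterior weight of this model
    z = sum(post)
    return sum(p * th for p, th in zip(post, thetas)) / z

# After observing many 1s, the posterior concentrates on theta = 0.9 and
# the mixture's prediction approaches the true model's.
print(round(mixture_predict([1] * 20), 3))
```

The gap between the mixture's prediction and the informed (true-model) prediction shrinks as evidence accumulates, which is what limits the number of extra errors to the additive square-root corrections mentioned above.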
Gales suffice for constructive dimension
 Information Processing Letters
, 2003
Abstract

Cited by 16 (3 self)
Supergales, generalizations of supermartingales, have been used by Lutz (2002) to define the constructive dimensions of individual binary sequences. Here it is shown that gales, the corresponding generalizations of martingales, can be equivalently used to define constructive dimension.
The Boltzmann Entropy and Randomness Tests
, 1994
Abstract

Cited by 6 (0 self)
In the context of the dynamical systems of classical mechanics, we introduce two new notions called "algorithmic fine-grain and coarse-grain entropy". The fine-grain algorithmic entropy is, on the one hand, a simple variant of the randomness tests of Martin-Löf (and others) and is, on the other hand, a connecting link between description (Kolmogorov) complexity, Gibbs entropy and Boltzmann entropy.
Kolmogorov Complexity Conditional to Large Integers
 Theoretical Computer Science
Abstract

Cited by 4 (2 self)
... this paper the general notion of an algorithmic problem (see [7] for such discussion), as our paper is devoted to very specific problems. The plain Kolmogorov complexity, K(x), is the Kolmogorov complexity of the problem "print x". Likewise the conditional Kolmogorov complexity, defined as K(x|y) = min{ l(p) : p(y) = x }, is the complexity of the problem "given y, print x"
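The definition K(x|y) = min{ l(p) : p(y) = x } can be rendered as a toy computation. A hedged sketch only: with a real universal machine K(x|y) is uncomputable, so here "programs" are bit-strings run by a tiny hand-made interpreter (the names `interpret` and `toy_conditional_K` and the machine model are purely illustrative, not the paper's).

```python
# Brute-force conditional complexity on a toy machine: the length of the
# shortest bit-string program p with p(y) = x.  Illustrative sketch only.

from itertools import product

def interpret(p, y):
    """Toy interpreter: program '0' copies y; programs '1b1b2...' append
    the bits b1b2... to y; everything else is invalid."""
    if p == '0':
        return y
    if p.startswith('1'):
        return y + p[1:]
    return None

def toy_conditional_K(x, y, max_len=8):
    """Length of the shortest program (up to max_len bits) mapping y to x."""
    for n in range(1, max_len + 1):
        for bits in product('01', repeat=n):
            p = ''.join(bits)
            if interpret(p, y) == x:
                return n
    return None

# "Given x, print x" is maximally easy: the 1-bit copy program suffices.
assert toy_conditional_K('10110', '10110') == 1
# Appending two bits costs a 3-bit program on this toy machine.
print(toy_conditional_K('10110' + '01', '10110'))  # 3
```

The same brute-force scheme with y fixed to the empty string gives a toy version of the plain complexity K(x) of "print x".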
Performance of Data Compression in terms of Hausdorff Dimension
Abstract

Cited by 2 (0 self)
Introduction. Let B = {0, 1}, B^n = {0, 1}^n, B* = ∪_{n≥0} B^n, B^∞ = {0, 1}^∞. Let μ be a probability measure on B^∞. x_1^∞ ∈ B^∞ is defined to be typical for μ if for any u ∈ B*, lim_{n→∞} ν_u(x_1^n) / (n − |u| + 1) = μ(u), where ν_u(x_1^n) is the number of occurrences of u ...
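The empirical frequency in this typicality condition is straightforward to compute. A minimal sketch (function names are illustrative): count the possibly overlapping occurrences of a pattern u in the first n bits and divide by the number of windows, n − |u| + 1; for a typical sequence this ratio tends to the measure of u.

```python
# Empirical pattern frequency in a binary-string prefix (hedged sketch).

def occurrences(x, u):
    """Number of (possibly overlapping) occurrences of u in x."""
    return sum(1 for i in range(len(x) - len(u) + 1) if x[i:i + len(u)] == u)

def empirical_freq(x, u):
    """Frequency of u over the n - |u| + 1 windows of prefix x."""
    return occurrences(x, u) / (len(x) - len(u) + 1)

# Under the uniform measure, a typical sequence has frequency 2**-|u| for
# every pattern u.  A very regular prefix violates this:
x = '0101' * 100
print(empirical_freq(x, '01'))  # 200/399, near 1/2 rather than 2**-2
```

The gap between such empirical frequencies and the measure is exactly what the paper relates to compression performance and Hausdorff dimension.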
Algorithmic randomness over general spaces
, 2012
Abstract

Cited by 1 (1 self)
The study of Martin-Löf randomness on a computable metric space with a computable measure has had much progress recently. In this paper we study Martin-Löf randomness on a more general space, that is, a computable topological space with a computable measure. On such a space, Martin-Löf randomness may not be a natural notion because there is not a universal test, and Martin-Löf randomness and complexity randomness (defined in this paper) do not coincide in general. We show that SCT3 is a sufficient condition for the existence and the coincidence and study how much we can weaken the condition.