Results 1–10 of 10
The Dimensions of Individual Strings and Sequences
 Information and Computation
, 2003
Abstract

Cited by 93 (10 self)
A constructive version of Hausdorff dimension is developed using constructive supergales, which are betting strategies that generalize the constructive supermartingales used in the theory of individual random sequences. This constructive dimension is used to assign every individual (infinite, binary) sequence S a dimension, which is a real number dim(S) in the interval [0, 1]. Sequences that …
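For orientation, the central definitions can be sketched as follows (a paraphrase of Lutz's framework; the notation is assumed, since the snippet does not spell it out):

```latex
% An s-supergale is a function d : \{0,1\}^* \to [0,\infty) satisfying,
% for every finite string w,
d(w) \;\ge\; 2^{-s}\bigl(d(w0) + d(w1)\bigr).
% It succeeds on a sequence S if its capital is unbounded along S, and
\dim(S) \;=\; \inf\{\, s \ge 0 : \text{some constructive $s$-supergale succeeds on } S \,\}.
```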
The Fastest And Shortest Algorithm For All Well-Defined Problems
, 2002
Abstract

Cited by 35 (7 self)
An algorithm M is described that solves any well-defined problem p as quickly as the fastest algorithm computing a solution to p, save for a factor of 5 and low-order additive terms. M optimally distributes resources between the execution of provably correct p-solving programs and an enumeration of all proofs, including relevant proofs of program correctness and of time bounds on program runtimes. M avoids Blum's speedup theorem by ignoring programs without correctness proof. M has broader applicability and can be faster than Levin's universal search, the fastest method for inverting functions save for a large multiplicative constant. An extension of Kolmogorov complexity and two novel natural measures of function complexity are used to show that the most efficient program computing some function f is also among the shortest programs provably computing f.
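Levin's universal search, which the abstract mentions as the baseline that M improves on, can be sketched in toy form. This is an illustrative assumption, not the paper's construction: the "programs" below are Python generators invented for the example, which yield None while working and an answer when finished; program i receives roughly a 2^-i share of the total time, and the first verified answer wins.

```python
# Toy Levin-style universal search: interleave candidate programs,
# giving program i about a 2^-i fraction of the total steps.

def levin_search(programs, verify, max_phases=20):
    running = [p() for p in programs]
    halted = [False] * len(programs)
    for phase in range(max_phases):
        # In phase k, program i (for i <= k) is advanced 2^(k - i) steps.
        for i in range(min(phase + 1, len(running))):
            if halted[i]:
                continue
            for _ in range(2 ** (phase - i)):
                try:
                    out = next(running[i])
                except StopIteration:
                    halted[i] = True
                    break
                if out is not None and verify(out):
                    return out
    return None

def spinner():            # a program that works forever without an answer
    while True:
        yield None

def counter():            # slowly searches for an x with x * x == 49
    x = 0
    while True:
        if x * x == 49:
            yield x
            return
        yield None
        x += 1

result = levin_search([spinner, counter], verify=lambda a: a * a == 49)
```

Note that the non-halting `spinner` does not prevent termination: because time shares shrink geometrically, the searcher still finds the answer produced by `counter`.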
New Error Bounds for Solomonoff Prediction
 Journal of Computer and System Sciences
, 1999
Abstract

Cited by 23 (16 self)
Several new relations between universal Solomonoff sequence prediction and informed prediction and general probabilistic prediction schemes will be proved. Among others, they show that the number of errors in Solomonoff prediction is finite for computable prior probability, if finite in the informed case, where the prior is known. Deterministic variants will also be studied. The most interesting result is that the deterministic variant of Solomonoff prediction is optimal compared to any other probabilistic or deterministic prediction scheme apart from additive square root corrections only. This makes it well suited even for difficult prediction problems, where it does not suffice when the number of errors is minimal to within some factor greater than one. Solomonoff's original bound and the ones presented here complement each other in a useful way.
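At the level of detail the abstract itself gives (the exact constants are in the paper, and the subscripts below are illustrative labels, not the paper's notation), the headline bound has the shape:

```latex
E_{\text{Solomonoff}} \;\le\; E_{\text{informed}} \;+\; O\!\left(\sqrt{E_{\text{informed}}}\right),
```

where each E denotes an expected number of prediction errors; in particular, finitely many errors for the informed predictor imply finitely many for the Solomonoff predictor.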
Gales suffice for constructive dimension
 Information Processing Letters
, 2003
Abstract

Cited by 19 (4 self)
Supergales, generalizations of supermartingales, have been used by Lutz (2002) to define the constructive dimensions of individual binary sequences. Here it is shown that gales, the corresponding generalizations of martingales, can be equivalently used to define constructive dimension.
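The two betting notions differ only in an equality versus an inequality; schematically (notation assumed, not given in this snippet):

```latex
% s-gale (exact fairness condition):
d(w) \;=\; 2^{-s}\bigl(d(w0) + d(w1)\bigr)
% s-supergale (the bettor may throw away capital):
d(w) \;\ge\; 2^{-s}\bigl(d(w0) + d(w1)\bigr)
```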
The Boltzmann Entropy and Randomness Tests
, 1994
Abstract

Cited by 7 (0 self)
In the context of the dynamical systems of classical mechanics, we introduce two new notions called "algorithmic fine-grain and coarse-grain entropy". The fine-grain algorithmic entropy is, on the one hand, a simple variant of the randomness tests of Martin-Löf (and others) and is, on the other hand, a connecting link between description (Kolmogorov) complexity, Gibbs entropy and Boltzmann entropy.
The Kolmogorov Complexity of Random Reals
 Ann. Pure Appl. Logic
, 2003
Abstract

Cited by 5 (1 self)
We investigate the initial segment complexity of random reals. Let K(...
Kolmogorov Complexity Conditional to Large Integers
 Theoretical Computer Science
Abstract

Cited by 3 (1 self)
… this paper the general notion of an algorithmic problem (see [7] for such a discussion), as our paper is devoted to very specific problems. The plain Kolmogorov complexity, K(x), is the Kolmogorov complexity of the problem "print x". Likewise the conditional Kolmogorov complexity, defined as K(x|y) = min{ l(p) : p(y) = x }, is the complexity of the problem "given y print x".
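The definition K(x|y) = min{ l(p) : p(y) = x } can be made concrete with a deliberately tiny, non-universal program language invented for this example (everything here is an assumption for illustration, not from the paper): the instruction 'i' adds 1 to the current value, 'd' doubles it, and a program is a string of such instructions run with initial value y.

```python
# Toy conditional complexity: length of the shortest 'i'/'d' program
# transforming y into x, searched by brute force in order of length.
from itertools import product

def run(program, y):
    v = y
    for op in program:
        v = v + 1 if op == 'i' else 2 * v
    return v

def toy_conditional_complexity(x, y, max_len=10):
    """min{ l(p) : p(y) = x } over the toy language, or None if not found."""
    for length in range(max_len + 1):
        for p in product('id', repeat=length):
            if run(p, y) == x:
                return length
    return None
```

For instance, from y = 3 the target x = 7 is reached by 'di' (3 → 6 → 7), so the toy conditional complexity is 2, even though the longer program 'iiii' (length 4) also works.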
Performance of Data Compression in terms of Hausdorff Dimension
Abstract

Cited by 2 (0 self)
Introduction. Let B = {0, 1}, B^n = {0, 1}^n, B* = ∪_{n≥0} B^n, B^∞ = {0, 1}^∞. Let μ be a probability measure on B^∞. A sequence x_1^∞ ∈ B^∞ is defined to be typical for μ if for any u ∈ B*, lim_{n→∞} #_u(x_1^n) / (n − |u| + 1) = μ(u), where #_u(x_1^n) is the number of occurrences of u …
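The empirical-frequency quantity in this typicality definition is easy to compute; a minimal sketch (the function name is made up for the example):

```python
# Number of (overlapping) occurrences of the word u in the prefix x_1..x_n,
# divided by the n - |u| + 1 possible starting positions.

def empirical_frequency(x, u):
    n, k = len(x), len(u)
    count = sum(1 for i in range(n - k + 1) if x[i:i + k] == u)
    return count / (n - k + 1)
```

Under the fair-coin measure, μ(u) = 2^{-|u|}, so on a μ-typical sequence the empirical frequency of, say, u = '01' should tend to 0.25 as n grows.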
Algorithmic Information Theory and Machine Learning
, 2000
Abstract
In this paper we only consider the context of concept learning: Let X be a set called the instance space. A concept is a subset of X. Usually concepts are identified with their indicator function (by abuse of notation, c(x) = 1 iff x ∈ c). A concept class is a set C ⊆ 2^X …
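This setup can be sketched concretely (the instance space and concepts below are toy choices for illustration, not from the paper):

```python
# A concept is a subset of the instance space X, identified with its
# indicator function; a concept class is a collection of such subsets.

X = range(10)                              # toy instance space
even = {x for x in X if x % 2 == 0}        # the concept "even numbers"

def c(x):
    """Indicator function of the concept: c(x) = 1 iff x is in the concept."""
    return 1 if x in even else 0

# Example concept class: all threshold concepts {x in X : x < t}.
concept_class = [set(range(t)) for t in range(len(X) + 1)]
```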
Uniform Randomness Test over a General Space
, 2003
Abstract
The algorithmic theory of randomness is well developed when the underlying space is the set of finite or infinite sequences and the underlying probability distribution is the uniform distribution or a computable distribution. These restrictions seem artificial. Some progress has been made to extend...