Results 1 – 10 of 11
The Dimensions of Individual Strings and Sequences
 Information and Computation
, 2003
"... A constructive version of Hausdorff dimension is developed using constructive supergales, which are betting strategies that generalize the constructive supermartingales used in the theory of individual random sequences. This constructive dimension is used to assign every individual (infinite, binary ..."
Abstract

Cited by 95 (10 self)
 Add to MetaCart
A constructive version of Hausdorff dimension is developed using constructive supergales, which are betting strategies that generalize the constructive supermartingales used in the theory of individual random sequences. This constructive dimension is used to assign every individual (infinite, binary) sequence S a dimension, which is a real number dim(S) in the interval [0, 1]. Sequences that
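To make the betting interpretation concrete, here is a minimal sketch (our illustration, not code from the paper): an s-gale is a function d on binary strings satisfying d(w0) + d(w1) = 2^s · d(w), which for s = 1 is the usual martingale fairness condition, and any martingale can be scaled into an s-gale. All function names below are ours.

```python
# Illustrative sketch: an s-gale d satisfies d(w0) + d(w1) = 2**s * d(w);
# for s = 1 this is the ordinary martingale condition.  A martingale d can
# be turned into an s-gale via d_s(w) = 2**((s - 1) * len(w)) * d(w).

def make_s_gale(martingale, s):
    """Scale a martingale into an s-gale (hypothetical helper)."""
    return lambda w: 2 ** ((s - 1) * len(w)) * martingale(w)

def constant_martingale(w):
    return 1.0  # bets nothing: d(w0) = d(w1) = d(w)

d = make_s_gale(constant_martingale, s=0.5)
w = "0110"
lhs = d(w + "0") + d(w + "1")
rhs = 2 ** 0.5 * d(w)
print(abs(lhs - rhs) < 1e-12)  # the s-gale condition holds with equality
```

Intuitively, the smaller the exponent s for which some constructive s-(super)gale can still grow unboundedly while betting on S, the lower the dimension dim(S).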
Effective strong dimension in algorithmic information and computational complexity
 SIAM Journal on Computing
, 2004
"... The two most important notions of fractal dimension are Hausdorff dimension, developed by Hausdorff (1919), and packing dimension, developed independently by Tricot (1982) and Sullivan (1984). Both dimensions have the mathematical advantage of being defined from measures, and both have yielded exten ..."
Abstract

Cited by 82 (30 self)
 Add to MetaCart
The two most important notions of fractal dimension are Hausdorff dimension, developed by Hausdorff (1919), and packing dimension, developed independently by Tricot (1982) and Sullivan (1984). Both dimensions have the mathematical advantage of being defined from measures, and both have yielded extensive applications in fractal geometry and dynamical systems. Lutz (2000) has recently proven a simple characterization of Hausdorff dimension in terms of gales, which are betting strategies that generalize martingales. Imposing various computability and complexity constraints on these gales produces a spectrum of effective versions of Hausdorff dimension, including constructive, computable, polynomial-space, polynomial-time, and finite-state dimensions. Work by several investigators has already used these effective dimensions to shed significant new light on a variety of topics in theoretical computer science. In this paper we show that packing dimension can also be characterized in terms of gales. Moreover, even though the usual definition of packing dimension is considerably more complex than that of Hausdorff dimension, our gale characterization of packing dimension is an exact dual
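The duality the abstract alludes to can be sketched as follows (our rendering in the notation of Lutz's gale framework, with S ↾ n the first n bits of S; the limsup/liminf success notions are assumptions of this sketch, not quoted from the paper):

```latex
\[
% Hausdorff dimension: capital unbounded in the limsup sense ("succeeds")
\dim(S) = \inf\bigl\{\, s : \exists\ s\text{-gale } d \text{ with }
  \limsup_{n\to\infty} d(S \upharpoonright n) = \infty \,\bigr\}
\]
\[
% Packing (strong) dimension: the exact dual, with liminf ("succeeds strongly")
\mathrm{Dim}(S) = \inf\bigl\{\, s : \exists\ s\text{-gale } d \text{ with }
  \liminf_{n\to\infty} d(S \upharpoonright n) = \infty \,\bigr\}
\]
```

The only change between the two characterizations is limsup versus liminf, which is the sense in which the gale characterization of packing dimension is an exact dual of that of Hausdorff dimension.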
Equivalence of Measures of Complexity Classes
"... The resourcebounded measures of complexity classes are shown to be robust with respect to certain changes in the underlying probability measure. Specifically, for any real number ffi ? 0, any uniformly polynomialtime computable sequence ~ fi = (fi 0 ; fi 1 ; fi 2 ; : : : ) of real numbers (biases ..."
Abstract

Cited by 73 (21 self)
 Add to MetaCart
The resource-bounded measures of complexity classes are shown to be robust with respect to certain changes in the underlying probability measure. Specifically, for any real number δ > 0, any uniformly polynomial-time computable sequence β = (β_0, β_1, β_2, ...) of real numbers (biases) β_i ∈ [δ, 1 − δ], and any complexity class C (such as P, NP, BPP, P/Poly, PH, PSPACE, etc.) that is closed under positive, polynomial-time, truth-table reductions with queries of at most linear length, it is shown that the following two conditions are equivalent. (1) C has p-measure 0 (respectively, measure 0 in E, measure 0 in E_2) relative to the coin-toss probability measure given by the sequence β. (2) C has p-measure 0 (respectively, measure 0 in E, measure 0 in E_2) relative to the uniform probability measure. The proof introduces three techniques that may be useful in other contexts, namely, (i) the transformation of an efficient martingale for one probability measu...
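For concreteness, the coin-toss probability measure given by a bias sequence assigns each cylinder (the set of sequences extending a finite string w) a product of per-bit probabilities; the uniform measure is the special case where every bias is 1/2. A minimal sketch (names are ours):

```python
# The coin-toss measure given by biases beta assigns the cylinder of w
#   mu_beta(w) = prod over i of: beta[i] if w[i] == '1' else 1 - beta[i].
# The uniform measure takes beta[i] = 1/2 for all i.

def cylinder_measure(w, beta):
    m = 1.0
    for i, bit in enumerate(w):
        m *= beta[i] if bit == "1" else 1.0 - beta[i]
    return m

uniform = [0.5] * 4
biased = [0.3, 0.3, 0.3, 0.3]  # biases must lie in [delta, 1 - delta]
print(cylinder_measure("1010", uniform))  # 0.0625 = 2**-4
print(cylinder_measure("1010", biased))   # 0.3 * 0.7 * 0.3 * 0.7 = 0.0441
```

The theorem says that, for suitably closed classes C, having measure 0 is invariant under swapping one such measure for another, provided the biases stay bounded away from 0 and 1.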
Competitive online statistics
 International Statistical Review
, 1999
"... A radically new approach to statistical modelling, which combines mathematical techniques of Bayesian statistics with the philosophy of the theory of competitive online algorithms, has arisen over the last decade in computer science (to a large degree, under the influence of Dawid’s prequential sta ..."
Abstract

Cited by 63 (10 self)
 Add to MetaCart
A radically new approach to statistical modelling, which combines mathematical techniques of Bayesian statistics with the philosophy of the theory of competitive online algorithms, has arisen over the last decade in computer science (to a large degree, under the influence of Dawid’s prequential statistics). In this approach, which we call “competitive online statistics”, it is not assumed that data are generated by some stochastic mechanism; the bounds derived for the performance of competitive online statistical procedures are guaranteed to hold (and not just hold with high probability or on the average). This paper reviews some results in this area; the new material in it includes the proofs for the performance of the Aggregating Algorithm in the problem of linear regression with square loss. Keywords: Bayes’s rule, competitive online algorithms, linear regression, prequential statistics, worst-case analysis.
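As a rough illustration of the online protocol (a hedged sketch, not the paper's algorithm): for linear regression with square loss, the Aggregating Algorithm admits a closed-form predictor closely related to online ridge regression. The one-dimensional version below, with names and the regularization parameter a chosen by us, predicts before each label is revealed and only then updates its statistics.

```python
# Hedged sketch of an online ridge-style forecaster in the spirit of the
# Aggregating Algorithm for square loss.  A accumulates a + sum of x**2,
# b accumulates sum of y * x; the prediction for input x is x * b / A.

def online_ridge_predictions(xs, ys, a=1.0):
    A, b = a, 0.0
    preds = []
    for x, y in zip(xs, ys):
        A += x * x                 # include the current input first
        preds.append(x * b / A)    # predict before seeing y
        b += y * x                 # then reveal y and update
    return preds

xs = [0.5, -1.0, 2.0, 1.5, -0.5, 1.0, -2.0, 0.8]
ys = [2 * x for x in xs]          # data actually generated by y = 2x
preds = online_ridge_predictions(xs, ys)
print(abs(preds[-1] - ys[-1]))    # prediction error shrinks as data accrue
```

The worst-case flavor of the theory shows up in the guarantees: bounds on the forecaster's cumulative square loss relative to the best linear predictor hold for every data sequence, not just on average under a stochastic model.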
On Semimeasures Predicting Martin-Löf Random Sequences
, 2006
"... Solomonoff’s central result on induction is that the prediction of a universal semimeasure M converges rapidly and with probability 1 to the true sequence generating predictor µ, if the latter is computable. Hence, M is eligible as a universal sequence predictor in case of unknown µ. Despite some ne ..."
Abstract

Cited by 4 (3 self)
 Add to MetaCart
Solomonoff’s central result on induction is that the prediction of a universal semimeasure M converges rapidly and with probability 1 to the true sequence-generating predictor µ, if the latter is computable. Hence, M is eligible as a universal sequence predictor in case of unknown µ. Despite some nearby results and proofs in the literature, the stronger result of convergence for all (Martin-Löf) random sequences remained open. Such a convergence result would be particularly interesting and natural, since randomness can be defined in terms of M itself. We show that there are universal semimeasures M which do not converge to µ on all µ-random sequences, i.e. we give a partial negative answer to the open problem. We also provide a positive answer for some non-universal semimeasures. We define the incomputable measure D as a mixture over all computable measures and the enumerable semimeasure W as a mixture over all enumerable nearly-measures. We show that W converges to D and D to µ on all random sequences. The Hellinger distance measuring closeness of two distributions plays a central role.
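The Hellinger distance mentioned at the end is a standard closeness measure between distributions; for predictive distributions over the next bit it is computed as below (a generic sketch, our function names):

```python
import math

# Squared Hellinger distance between two finite distributions p and q:
#   h^2(p, q) = sum over outcomes x of (sqrt(p(x)) - sqrt(q(x)))**2.
# It is 0 for identical distributions and 2 for disjoint supports.

def hellinger_sq(p, q):
    return sum((math.sqrt(pi) - math.sqrt(qi)) ** 2 for pi, qi in zip(p, q))

print(hellinger_sq([0.5, 0.5], [0.5, 0.5]))  # 0.0: identical predictors
print(hellinger_sq([1.0, 0.0], [0.0, 1.0]))  # 2.0: maximally different
```

Convergence of one predictor to another along a sequence can then be phrased as the per-step Hellinger distances between their next-bit predictions tending to zero.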
Constructive equivalence relations on computable probability measures
 International Computer Science Symposium in Russia, Lecture Notes in Computer Science
, 2006
"... Abstract. We study the equivalence relations on probability measures corresponding respectively to having the same MartinLöf random reals, having the same KolmogorovLoveland random reals, and having the same computably random reals. In particular, we show that, when restricted to the class of stro ..."
Abstract

Cited by 3 (2 self)
 Add to MetaCart
Abstract. We study the equivalence relations on probability measures corresponding respectively to having the same Martin-Löf random reals, having the same Kolmogorov-Loveland random reals, and having the same computably random reals. In particular, we show that, when restricted to the class of strongly positive generalized Bernoulli measures, they all coincide with the classical equivalence, which requires that two measures have the same null sets.
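A classical fact relevant to the "same null sets" condition is Kakutani's dichotomy (our addition, not a claim from this abstract): two generalized Bernoulli measures whose bias sequences p and q are bounded away from 0 and 1 have the same null sets when Σ_i (p_i − q_i)² converges, and are mutually singular when it diverges. The sketch below, with hypothetical names and example bias sequences of our choosing, just contrasts the two regimes numerically:

```python
# Kakutani-style check (illustration only): partial sums of
# sum over i of (p(i) - q(i))**2 for two pairs of bias sequences.

def divergence_partial_sum(p, q, n):
    return sum((p(i) - q(i)) ** 2 for i in range(n))

# p_i = 1/2 vs q_i = 1/2 + 1/(i + 3): squared gaps ~ 1/i**2, so the sum
# converges -> the measures share their null sets.
close = divergence_partial_sum(lambda i: 0.5, lambda i: 0.5 + 1 / (i + 3), 10_000)

# p_i = 1/2 vs q_i = 1/3: constant squared gap, so the sum diverges
# -> the measures are mutually singular.
far = divergence_partial_sum(lambda i: 0.5, lambda i: 1 / 3, 10_000)

print(close < 1.0)   # partial sums stay bounded
print(far > 100.0)   # partial sums grow without bound
```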
Universal convergence of semimeasures on individual random sequences
 Proc. 15th Int. Conf. Algorithmic Learning Theory (ALT’04), LNAI
, 2004
"... Solomonoff’s central result on induction is that the posterior of a universal semimeasure M converges rapidly and with probability 1 to the true sequence generating posterior µ, if the latter is computable. Hence, M is eligible as a universal sequence predictor in case of unknown µ. Despite some nea ..."
Abstract

Cited by 3 (2 self)
 Add to MetaCart
Solomonoff’s central result on induction is that the posterior of a universal semimeasure M converges rapidly and with probability 1 to the true sequence-generating posterior µ, if the latter is computable. Hence, M is eligible as a universal sequence predictor in case of unknown µ. Despite some nearby results and proofs in the literature, the stronger result of convergence for all (Martin-Löf) random sequences remained open. Such a convergence result would be particularly interesting and natural, since randomness can be defined in terms of M itself. We show that there are universal semimeasures M which do not converge for all random sequences, i.e. we give a partial negative answer to the open problem. We also provide a positive answer for some non-universal semimeasures. We define the incomputable measure D as a mixture over all computable measures and the enumerable semimeasure W as a mixture over all enumerable nearly-measures. We show that W converges to D and D to µ on all random sequences. The Hellinger distance measuring closeness of two distributions plays a central role.
On generalized computable universal priors and their convergence
 Theoretical Computer Science
"... Solomonoff unified Occam’s razor and Epicurus ’ principle of multiple explanations to one elegant, formal, universal theory of inductive inference, which initiated the field of algorithmic information theory. His central result is that the posterior of the universal semimeasure M converges rapidly t ..."
Abstract

Cited by 2 (2 self)
 Add to MetaCart
Solomonoff unified Occam’s razor and Epicurus’ principle of multiple explanations to one elegant, formal, universal theory of inductive inference, which initiated the field of algorithmic information theory. His central result is that the posterior of the universal semimeasure M converges rapidly to the true sequence-generating posterior µ, if the latter is computable. Hence, M is eligible as a universal predictor in case of unknown µ. The first part of the paper investigates the existence and convergence of computable universal (semi)measures for a hierarchy of computability classes: recursive, estimable, enumerable, and approximable. For instance, M is known to be enumerable, but not estimable, and to dominate all enumerable semimeasures. We present proofs for discrete and continuous semimeasures. The second part investigates more closely the types of convergence, possibly implied by universality: in difference and in ratio, with probability 1, in mean sum, and for Martin-Löf random sequences. We introduce a generalized concept of randomness for individual sequences and use it to exhibit difficulties regarding these issues. In particular, we show that convergence fails (holds) on generalized-random sequences in gappy (dense) Bernoulli classes.
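The dominance property can be illustrated with a finite stand-in for the enumerable case (our construction, with made-up components and weights): a mixture M(x) = Σ_i w_i · ν_i(x) with positive weights satisfies M(x) ≥ w_i · ν_i(x) for every component, which is the sense in which the universal semimeasure dominates all enumerable semimeasures.

```python
# Finite-mixture sketch of dominance: M(x) = sum of w_i * nu_i(x), so
# M(x) >= w_i * nu_i(x) for each component nu_i, since all terms are >= 0.

def bernoulli_semimeasure(bias):
    def nu(x):
        m = 1.0
        for bit in x:
            m *= bias if bit == "1" else 1.0 - bias
        return m
    return nu

components = [bernoulli_semimeasure(b) for b in (0.25, 0.5, 0.75)]
weights = [0.5, 0.25, 0.25]

def M(x):
    return sum(w * nu(x) for w, nu in zip(weights, components))

x = "1101"
print(all(M(x) >= w * nu(x) for w, nu in zip(weights, components)))  # True
```

In the paper's hierarchy the question is whether such a dominating mixture can itself sit in a smaller computability class (recursive or estimable) rather than merely being enumerable.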
Kolmogorov's Complexity Conception of Probability
 Probability Theory: Philosophy, Recent History and Relations to Science
, 2000
"... Kolmogorov's goal in proposing his complexity conception of probability was to provide a better foundation for the applications of probability (as opposed to the theory of probability; he believed that his 1933 axioms were sufficient for the theory of probability). The complexity conception was ..."
Abstract

Cited by 1 (1 self)
 Add to MetaCart
Kolmogorov's goal in proposing his complexity conception of probability was to provide a better foundation for the applications of probability (as opposed to the theory of probability; he believed that his 1933 axioms were sufficient for the theory of probability). The complexity conception was a natural development of Kolmogorov's earlier frequentist conception combined with (a) his conviction that only finite data sequences are of any interest in the applications of probability, and (b) Turing's discovery of the universal computing device. Besides the complexity conception itself, its developments by Martin-Löf, Levin et al. will be briefly discussed; I will also list some advantages and limitations of Kolmogorov's complexity conception and the algorithmic theory of randomness in general.
Merging of opinions in gametheoretic probability
, 2008
"... This paper gives gametheoretic versions of several results on “merging of opinions ” obtained in measuretheoretic probability and algorithmic randomness theory. An advantage of the gametheoretic versions over the measuretheoretic results is that they are pointwise, their advantage over the algor ..."
Abstract
 Add to MetaCart
This paper gives game-theoretic versions of several results on “merging of opinions” obtained in measure-theoretic probability and algorithmic randomness theory. An advantage of the game-theoretic versions over the measure-theoretic results is that they are pointwise; their advantage over the algorithmic randomness results is that they are non-asymptotic; but the most important advantage over both is that they are very constructive, giving explicit and efficient strategies for players in a game of prediction.