Results 1–10 of 353
Algorithmic information theory
 IBM Journal of Research and Development, 1977
Abstract

Cited by 337 (18 self)
This paper reviews algorithmic information theory, which is an attempt to apply information-theoretic and probabilistic ideas to recursive function theory. Typical concerns in this approach are, for example, the number of bits of information required to specify an algorithm, or the probability that a program whose bits are chosen by coin flipping produces a given output. During the past few years the definitions of algorithmic information theory have been reformulated. The basic features of the new formalism are presented here and certain results of R. M. Solovay are reported.
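The central quantity in this theory, the length of a shortest program producing a string, is uncomputable, but any real compressor gives a computable upper bound. The following sketch (ordinary zlib, not anything from the paper) illustrates the regular-versus-random contrast the abstract describes:

```python
import random
import zlib

def compressed_length(s: bytes) -> int:
    """Upper bound, in bits, on the algorithmic information content of s.
    K(s) itself is uncomputable; any compressor only gives an upper bound."""
    return 8 * len(zlib.compress(s, level=9))

regular = b"01" * 500                 # highly regular: a short program suffices
random.seed(0)                        # a fixed pseudo-random string stands in for coin flips
noisy = bytes(random.getrandbits(8) for _ in range(1000))

print(compressed_length(regular))     # far below the 8000 raw bits
print(compressed_length(noisy))       # near (or above) the 8000 raw bits
```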
Universal prediction of individual sequences
 IEEE Transactions on Information Theory, 1992
Abstract

Cited by 161 (13 self)
The problem of predicting the next outcome of an individual binary sequence using finite memory is considered. The finite-state predictability of an infinite sequence is defined as the minimum fraction of prediction errors that can be made by any finite-state (FS) predictor. It is proved that this FS predictability can be attained by universal sequential prediction schemes. Specifically, an efficient prediction procedure based on the incremental parsing procedure of the Lempel-Ziv data compression algorithm is shown to achieve asymptotically the FS predictability. Finally, some relations between compressibility and predictability are pointed out, and the predictability is proposed as an additional measure of the complexity of a sequence. Index Terms: Predictability, compressibility, complexity, finite-state machines, Lempel-Ziv algorithm.
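A toy rendering of the incremental-parsing idea (my own sketch, not the paper's exact procedure): grow a Lempel-Ziv parse tree over the sequence and, at each node, predict the branch that was taken more often in the past. On a regular sequence the error fraction drops well below the 1/2 a memoryless guesser achieves:

```python
def lz_predict(bits):
    """Sequential 0/1 prediction via LZ78-style incremental parsing.
    A parse tree is grown as in Lempel-Ziv compression; at the current
    node we predict the child branch visited more often so far."""
    root = {}            # node: maps bit -> [subtree, visit_count]
    node = root
    errors = 0
    for b in bits:
        c0 = node[0][1] if 0 in node else 0
        c1 = node[1][1] if 1 in node else 0
        pred = 1 if c1 > c0 else 0
        if pred != b:
            errors += 1
        if b in node:                  # continue the current phrase
            node[b][1] += 1
            node = node[b][0]
        else:                          # phrase ends: add a leaf, restart
            node[b] = [{}, 1]
            node = root
    return errors / len(bits)

print(lz_predict([0, 1] * 500))   # small error fraction on a periodic sequence
```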
A complexity theoretic approach to randomness
 Proceedings of the 15th Annual ACM Symposium on Theory of Computing, 1983
Abstract

Cited by 145 (1 self)
We study a time-bounded variant of Kolmogorov complexity. This notion, together with universal hashing, can be used to show that problems solvable probabilistically in polynomial time are all within the second level of the polynomial-time hierarchy. We also discuss applications to the theory of probabilistic constructions.
The Dimensions of Individual Strings and Sequences
 Information and Computation, 2003
Abstract

Cited by 87 (11 self)
A constructive version of Hausdorff dimension is developed using constructive supergales, which are betting strategies that generalize the constructive supermartingales used in the theory of individual random sequences. This constructive dimension is used to assign every individual (infinite, binary) sequence S a dimension, which is a real number dim(S) in the interval [0, 1]. Sequences that
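The betting-strategy notion can be made concrete. An s-gale is a function d on strings satisfying d(w) = 2^(-s) * (d(w0) + d(w1)); a martingale is the case s = 1. The sketch below uses a hypothetical fixed-fraction betting strategy (not from the paper) to check the s-gale condition and show the capital growing on a predictable sequence:

```python
def make_s_gale(s, p):
    """An s-gale induced by a strategy that always bets fraction p of its
    capital on the next bit being 1 (a hypothetical simple strategy).
    Condition to verify: d(w) == 2**(-s) * (d(w0) + d(w1))."""
    def d(w):
        val = 1.0
        for b in w:
            val *= 2**s * (p if b == 1 else 1 - p)
        return val
    return d

d = make_s_gale(0.5, 0.9)      # s = 1/2, heavy bet on 1s
w = (1,) * 20

# the s-gale fairness condition holds at every prefix
for i in range(len(w)):
    prefix = w[:i]
    assert abs(d(prefix) - 2**-0.5 * (d(prefix + (0,)) + d(prefix + (1,)))) < 1e-9

print(d(w))   # capital grows on the all-ones sequence, witnessing dimension <= 1/2
```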
Lowness Properties and Randomness
 Advances in Mathematics
Abstract

Cited by 84 (21 self)
The set A is low for Martin-Löf random if each random set is already random relative to A. A is K-trivial if the prefix complexity K of each initial segment of A is minimal, namely at most K(n) + O(1). We show that these classes coincide. This implies answers to questions of Ambos-Spies and Kučera [2], showing that each set that is low for Martin-Löf random is Δ⁰₂. Our class induces a natural intermediate Σ⁰₃ ideal in the r.e. Turing degrees (which generates the whole class under downward closure). Answering ...
Complexity measures of supervised classification problems
 IEEE Transactions on Pattern Analysis and Machine Intelligence, 2002
Abstract

Cited by 74 (7 self)
We studied a number of measures that characterize the difficulty of a classification problem, focusing on the geometrical complexity of the class boundary. We compared a set of real-world problems to random labelings of points and found that real problems contain structures in this measurement space that are significantly different from the random sets. Distributions of problems in this space show that there exist at least two independent factors affecting a problem's difficulty. We suggest using this space to describe a classifier's domain of competence. This can guide static and dynamic selection of classifiers for specific problems as well as subproblems formed by confinement, projection, and transformations of the feature vectors. Index Terms: Classification, clustering, complexity, linear separability, mixture identifiability.
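As a flavor of this kind of measure, the sketch below computes the maximum Fisher's discriminant ratio over features, one of the simplest boundary-complexity measures of this family (the cited paper defines a whole suite of measures; the data here are made up):

```python
def fisher_ratio(class_a, class_b):
    """Maximum Fisher's discriminant ratio over features:
    (mean_a - mean_b)**2 / (var_a + var_b), maximized over features.
    Higher values indicate an easier (more linearly separable) problem."""
    def mean(xs):
        return sum(xs) / len(xs)
    def var(xs):
        m = mean(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    best = 0.0
    for j in range(len(class_a[0])):
        a = [row[j] for row in class_a]
        b = [row[j] for row in class_b]
        denom = var(a) + var(b)
        if denom > 0:
            best = max(best, (mean(a) - mean(b)) ** 2 / denom)
    return best

# well-separated classes (on the first feature) give a large ratio
easy_a = [(0.0, 5.0), (0.1, 5.2), (-0.1, 4.9)]
easy_b = [(3.0, 5.1), (3.2, 4.8), (2.9, 5.0)]
print(fisher_ratio(easy_a, easy_b))
```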
Minimum Description Length Induction, Bayesianism, and Kolmogorov Complexity
 IEEE Transactions on Information Theory, 1998
Abstract

Cited by 70 (7 self)
The relationship between the Bayesian approach and the minimum description length approach is established. We sharpen and clarify the general modeling principles MDL and MML, abstracted as the ideal MDL principle and defined from Bayes's rule by means of Kolmogorov complexity. The basic condition under which the ideal principle should be applied is encapsulated as the Fundamental Inequality, which in broad terms states that the principle is valid when the data are random relative to every contemplated hypothesis, and these hypotheses are in turn random relative to the (universal) prior. Basically, the ideal principle states that the prior probability associated with the hypothesis should be given by the algorithmic universal probability, and the sum of the log universal probability of the model plus the log of the probability of the data given the model should be minimized. If we restrict the model class to the finite sets then application of the ideal principle turns into Kolmogorov's mi...
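The two-part form of the principle can be illustrated on a toy model class (this example is mine, not the paper's): encode a binary string by first naming its number of ones, then its index among strings with that many ones. The total code length L(H) + L(D|H) tracks compressibility under the chosen model class:

```python
import math

def two_part_code_length(bits):
    """Two-part MDL code for a binary string under a toy model class:
    hypothesis H = 'the string has k ones out of n', costing log2(n+1) bits,
    plus log2(C(n, k)) bits to pick the string among those with k ones."""
    n, k = len(bits), sum(bits)
    l_hypothesis = math.log2(n + 1)               # which k was used
    l_data_given_h = math.log2(math.comb(n, k))   # which string with k ones
    return l_hypothesis + l_data_given_h

regular = [0] * 95 + [1] * 5      # few ones: compressible under this model
balanced = [0, 1] * 50            # k = n/2: incompressible under this model class

print(two_part_code_length(regular))    # well under the 100 raw bits
print(two_part_code_length(balanced))   # close to the 100 raw bits
```

Note that the alternating string is highly regular, yet this restricted model class cannot exploit that; a richer hypothesis class (or the universal prior the abstract describes) would compress it further.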
Code Density Optimization for Embedded DSP Processors Using Data Compression Techniques
 IEEE Trans. Computer-Aided Design, 1998
Effective strong dimension in algorithmic information and computational complexity
 SIAM Journal on Computing, 2004
Abstract

Cited by 66 (24 self)
The two most important notions of fractal dimension are Hausdorff dimension, developed by Hausdorff (1919), and packing dimension, developed independently by Tricot (1982) and Sullivan (1984). Both dimensions have the mathematical advantage of being defined from measures, and both have yielded extensive applications in fractal geometry and dynamical systems. Lutz (2000) has recently proven a simple characterization of Hausdorff dimension in terms of gales, which are betting strategies that generalize martingales. Imposing various computability and complexity constraints on these gales produces a spectrum of effective versions of Hausdorff dimension, including constructive, computable, polynomial-space, polynomial-time, and finite-state dimensions. Work by several investigators has already used these effective dimensions to shed significant new light on a variety of topics in theoretical computer science. In this paper we show that packing dimension can also be characterized in terms of gales. Moreover, even though the usual definition of packing dimension is considerably more complex than that of Hausdorff dimension, our gale characterization of packing dimension is an exact dual
Calibrating randomness
 J. Symbolic Logic
Abstract

Cited by 64 (33 self)
2. Sets, measure, and martingales
2.1. Sets and measure
2.2. Martingales