Results 1–10 of 32
Algorithmic information theory
IBM Journal of Research and Development, 1977
Abstract
Cited by 325 (19 self)
This paper reviews algorithmic information theory, which is an attempt to apply information-theoretic and probabilistic ideas to recursive function theory. Typical concerns in this approach are, for example, the number of bits of information required to specify an algorithm, or the probability that a program whose bits are chosen by coin flipping produces a given output. During the past few years the definitions of algorithmic information theory have been reformulated. The basic features of the new formalism are presented here and certain results of R. M. Solovay are reported.
On the Length of Programs for Computing Finite Binary Sequences
Journal of the ACM, 1966
Abstract
Cited by 231 (8 self)
The use of Turing machines for calculating finite binary sequences is studied from the point of view of information theory and the theory of recursive functions. Various results are obtained concerning the number of instructions in programs. A modified form of Turing machine is studied from the same point of view. An application to the problem of defining a patternless sequence is proposed in terms of the concepts here developed.

Introduction

In this paper the Turing machine is regarded as a general-purpose computer and some practical questions are asked about programming it. Given an arbitrary finite binary sequence, what is the length of the shortest program for calculating it? What are the properties of those binary sequences of a given length which require the longest programs? Do most of the binary sequences of a given length require programs of about the same length? The questions posed above are answered in Part 1. In the course of answering them, the logical ...
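The abstract's central question, the length of the shortest program for a given sequence, can be made concrete with a toy decoder. The two-rule machine U below is an invented illustration, not Chaitin's construction; any fixed decoder gives the same flavor of answer.

```python
from itertools import product

def U(program: str) -> str:
    """Toy decoder: '0' + s outputs s literally; '1' + s outputs s twice.
    (An invented two-rule machine standing in for a general-purpose computer.)"""
    if not program:
        return ""
    body = program[1:]
    return body if program[0] == "0" else body + body

def shortest_program_length(target: str) -> int:
    """Brute-force the length of the shortest program p with U(p) == target."""
    for n in range(len(target) + 2):  # '0' + target always works, so n + 1 suffices
        for bits in product("01", repeat=n):
            if U("".join(bits)) == target:
                return n
    raise AssertionError("unreachable: the literal program always exists")
```

Under this decoder "0101" has a 3-bit program ("101", i.e. "01" doubled) while "011" needs its 4-bit literal program, a small instance of the paper's question of which strings of a given length require the longest programs.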
The complexity of finite objects and the development of the concepts of information and randomness by means of the theory of algorithms
Russian Math. Surveys, 1970
Abstract
Cited by 190 (1 self)
In 1964 Kolmogorov introduced the concept of the complexity of a finite object (for instance, the words in a certain alphabet). He defined complexity as the minimum number of binary signs containing all the information about a given object that are sufficient for its recovery (decoding). This definition depends essentially on the method of decoding. However, by means of the general theory of algorithms, Kolmogorov was able to give an invariant (universal) definition of complexity. Related concepts were investigated by Solomonoff (U.S.A.) and Markov. Using the concept of complexity, Kolmogorov gave definitions of the quantity of information in finite objects and of the concept of a random sequence (which was then defined more precisely by Martin-Löf). Afterwards, this circle of questions developed rapidly. In particular, the ideas of Markov on applying the concept of complexity to the study of quantitative questions in the theory of algorithms were developed in an interesting way. The present article is a survey of the fundamental results connected with the brief remarks above.
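The dependence on the method of decoding can be felt directly: any concrete decompressor yields an upper bound on complexity, valid up to the additive constant cost of the fixed decoder. A minimal sketch using zlib (our choice of compressor, purely illustrative):

```python
import random
import zlib

def description_length_bits(data: bytes) -> int:
    """Upper bound on the complexity of `data`: the number of bits in a
    zlib description of it, valid up to an additive constant (the fixed
    decompressor plays the role of the decoding method)."""
    return 8 * len(zlib.compress(data, 9))

patterned = b"01" * 500                               # highly regular, 1000 bytes
random.seed(0)                                        # deterministic sample
noisy = bytes(random.randrange(256) for _ in range(1000))
```

The patterned string admits a short description while the noisy one barely shrinks at all; Kolmogorov's invariance theorem says any other universal decoding method would agree with these figures up to an additive constant.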
The Dimensions of Individual Strings and Sequences
Information and Computation, 2003
Abstract
Cited by 93 (10 self)
A constructive version of Hausdorff dimension is developed using constructive supergales, which are betting strategies that generalize the constructive supermartingales used in the theory of individual random sequences. This constructive dimension is used to assign every individual (infinite, binary) sequence S a dimension, which is a real number dim(S) in the interval [0, 1]. Sequences that ...
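The gale notion behind this abstract can be sketched concretely: an s-gale splits its capital between the two possible next bits, and the winning share is scaled by 2**s (s = 1 gives an ordinary fair martingale). The all-in strategy below, which knows the alternating pattern, is our own toy example, not the paper's construction.

```python
def run_sgale(bet, sequence: str, s: float, capital: float = 1.0) -> float:
    """Play an s-gale along a binary sequence.

    bet(prefix) is the fraction of capital placed on the next bit being '1',
    with the rest on '0'; the winning side is scaled by 2**s, so the gale
    condition d(w0) + d(w1) == 2**s * d(w) holds at every step."""
    prefix = ""
    for b in sequence:
        p = bet(prefix)
        capital = 2 ** s * capital * (p if b == "1" else 1 - p)
        prefix += b
    return capital

def alternation_bet(prefix: str) -> float:
    """All-in on the pattern 0101...: bet on '1' exactly at odd positions."""
    return 1.0 if len(prefix) % 2 == 1 else 0.0
```

On "01" * 4 with s = 0.5 the capital grows to (2**0.5)**8 = 16. Success of an s-gale with small s is what witnesses low dimension: a perfectly predictable sequence like this one has constructive dimension 0.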
Randomness in Computability Theory
, 2000
Abstract
Cited by 28 (0 self)
We discuss some aspects of algorithmic randomness and state some open problems in this area. The first part is devoted to the question "What is a computably random sequence?" Here we survey some of the approaches to algorithmic randomness and address some questions on these concepts. In the second part we look at the Turing degrees of Martin-Löf random sets. Finally, in the third part we deal with relativized randomness. Here we look at oracles which do not change randomness.

1980 Mathematics Subject Classification. Primary 03D80; Secondary 03D28.

1 Introduction

Formalizations of the intuitive notions of computability and randomness are among the major achievements in the foundations of mathematics in the 20th century. It is commonly accepted that various equivalent formal computability notions, like Turing computability or recursiveness, which were introduced in the 1930s and 1940s adequately capture computability in the intuitive sense. This belief is expressed in the w...
Prediction and Dimension
Journal of Computer and System Sciences, 2002
Abstract
Cited by 18 (3 self)
Given a set X of sequences over a finite alphabet, we investigate the following three quantities. (i) The feasible predictability of X is the highest success ratio that a polynomial-time randomized predictor can achieve on all sequences in X.
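The success ratio in (i) is simply the fraction of bits a predictor guesses correctly from the preceding prefix. A minimal sketch, where majority_predictor is a hypothetical deterministic baseline, not the paper's randomized predictor:

```python
def success_ratio(predict, sequence: str) -> float:
    """Fraction of bits of `sequence` that `predict` guesses correctly,
    each guess made from the prefix seen so far."""
    correct = sum(1 for i, b in enumerate(sequence) if predict(sequence[:i]) == b)
    return correct / len(sequence)

def majority_predictor(prefix: str) -> str:
    """Predict the bit seen most often so far (ties default to '0')."""
    return "1" if prefix.count("1") > prefix.count("0") else "0"
```

On a constant sequence this predictor approaches ratio 1, while on the alternating sequence it achieves only 1/2, the same as blind guessing; predictability, like dimension, measures how much structure a resource-bounded strategy can exploit.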
Algorithmic Complexity and Stochastic Properties of Finite Binary Sequences
, 1999
Abstract
Cited by 17 (0 self)
This paper is a survey of concepts and results related to simple Kolmogorov complexity, prefix complexity and resource-bounded complexity. We also consider a new type of complexity, statistical complexity, closely related to mathematical statistics. Unlike the other discoverers of algorithmic complexity, A. N. Kolmogorov was chiefly motivated by the goal of developing, on its basis, a mathematical theory that more adequately substantiates the applications of probability theory, mathematical statistics and information theory. Kolmogorov wanted to deduce properties of a random object from its complexity characteristics without use of the notion of probability. In the first part of this paper we present several results in this direction. Though the subsequent development of algorithmic complexity and randomness was different, algorithmic complexity has successful applications in a traditional probabilistic framework. In the second part of the paper we consider applications to the estimation of parameters and the definition of Bernoulli sequences. All considerations have a finite combinatorial character.
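The finite combinatorial flavor of the Bernoulli-sequence material can be illustrated with a standard counting bound: a length-n binary string with k ones can be described by its index among all C(n, k) such strings, so about log2 C(n, k) bits suffice (ignoring the O(log n) overhead for transmitting n and k). This helper is our illustration, not the paper's construction.

```python
from math import comb, log2

def bernoulli_description_bits(x: str) -> float:
    """Bits needed to pick x out of all length-n strings with the same
    number k of ones: log2 C(n, k). Overhead for n and k is ignored."""
    n, k = len(x), x.count("1")
    return log2(comb(n, k))
```

A constant string costs 0 bits this way, while a balanced string of length 20 costs log2 C(20, 10) ≈ 17.5 bits, close to the full 20: typical Bernoulli(1/2) strings are nearly incompressible, which is the kind of probability-free statement Kolmogorov was after.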
Kolmogorov-Loveland randomness and stochasticity
Annals of Pure and Applied Logic, 2005
Abstract
Cited by 16 (8 self)
An infinite binary sequence X is Kolmogorov-Loveland (or KL) random if there is no computable nonmonotonic betting strategy that succeeds on X in the sense of having an unbounded gain in the limit while betting successively on bits of X. A sequence X is KL-stochastic if there is no computable nonmonotonic selection rule that selects from X an infinite, biased sequence. One of the major open problems in the field of effective randomness is whether Martin-Löf randomness is the same as KL-randomness. Our first main result states that KL-random sequences are close to Martin-Löf random sequences insofar as every KL-random sequence has arbitrarily dense subsequences that are Martin-Löf random. A key lemma in the proof of this result is that for every effective split of a KL-random sequence at least one of the halves is Martin-Löf random. However, this splitting property does not characterize KL-randomness; we construct a sequence that is not even computably random such that every effective split yields two subsequences that are 2-random. Furthermore, we show for any KL-random sequence A that is computable in the halting problem that, first, for any effective split of A both halves are Martin-Löf random and, second, for any computable, nondecreasing, and unbounded function g
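The selection rules in the stochasticity definition can be sketched in their simpler monotone form: before seeing the next bit, the rule decides from the prefix whether that bit joins the selected subsequence. The "select the bit after every 0" rule is our toy example of how a rule extracts a biased subsequence from a non-random sequence; KL-stochasticity strengthens this picture by allowing computable nonmonotonic rules.

```python
def select(rule, sequence: str) -> str:
    """Selected subsequence of `sequence`: rule(prefix) decides, before
    seeing it, whether the next bit is taken into the subsequence."""
    return "".join(b for i, b in enumerate(sequence) if rule(sequence[:i]))

def after_zero(prefix: str) -> bool:
    """Select a bit exactly when the previously seen bit was '0'."""
    return prefix.endswith("0")
```

On the periodic sequence "01010101" this rule selects "1111", an entirely biased subsequence, so the sequence fails even this weak form of stochasticity; on a genuinely random sequence no computable rule can produce such bias.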