Results 1–10 of 24
The complexity of finite objects and the development of the concepts of information and randomness by means of the theory of algorithms
Russian Math. Surveys, 1970
Cited by 240 (1 self)
Abstract:
In 1964 Kolmogorov introduced the concept of the complexity of a finite object (for instance, the words in a certain alphabet). He defined complexity as the minimum number of binary signs containing all the information about a given object that are sufficient for its recovery (decoding). This definition depends essentially on the method of decoding. However, by means of the general theory of algorithms, Kolmogorov was able to give an invariant (universal) definition of complexity. Related concepts were investigated by Solomonoff (U.S.A.) and Markov. Using the concept of complexity, Kolmogorov gave definitions of the quantity of information in finite objects and of the concept of a random sequence (which was then defined more precisely by Martin-Löf). Afterwards, this circle of questions developed rapidly. In particular, an interesting development took place of the ideas of Markov on the application of the concept of complexity to the study of quantitative questions in the theory of algorithms. The present article is a survey of the fundamental results connected with the brief remarks above.
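The decoder-relative definition sketched in this abstract can be made concrete with a toy brute-force search. The decoder below (a run-length scheme reading "k ones, a zero, then a pattern" as "repeat the pattern k+1 times") is invented purely for illustration; it is not a decoder from the survey:

```python
from itertools import product

def decoder(program: str) -> str:
    # Hypothetical decoding method D: a program "1...1" with no 0 decodes
    # to itself; otherwise "k ones, then 0, then pattern" decodes to the
    # pattern repeated k+1 times.
    if "0" not in program:
        return program
    k, _, pattern = program.partition("0")
    return pattern * (len(k) + 1)

def complexity(x: str, max_len: int = 12) -> int:
    # C_D(x): length of a shortest binary program p with D(p) = x,
    # found by exhaustive search in order of increasing length.
    for n in range(max_len + 1):
        for bits in product("01", repeat=n):
            if decoder("".join(bits)) == x:
                return n
    raise ValueError("no program found within the search bound")
```

Highly repetitive strings get short programs under this D (for example, `"01" * 8` is described by 8 bits rather than 16), which is the compressibility intuition behind the definition; the invariant definition replaces the fixed D by a universal decoding algorithm.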
The Dimensions of Individual Strings and Sequences
Information and Computation, 2003
Cited by 101 (11 self)
Abstract:
A constructive version of Hausdorff dimension is developed using constructive supergales, which are betting strategies that generalize the constructive supermartingales used in the theory of individual random sequences. This constructive dimension is used to assign every individual (infinite, binary) sequence S a dimension, which is a real number dim(S) in the interval [0, 1]. Sequences that ...
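The gales behind this construction relax the fairness condition of martingales: an s-gale satisfies d(w0) + d(w1) = 2^s d(w) for a parameter s in [0, 1] (supergales replace "=" by "≤"), and any martingale d yields an s-gale via the scaling d_s(w) = 2^((s-1)|w|) d(w). A minimal sketch checking this identity, with the particular martingale and the choice s = 0.5 invented for illustration:

```python
def d(w: str) -> float:
    # Martingale that bets all capital on the next bit being 0:
    # capital doubles along the all-zero sequence and is 0 elsewhere.
    return float(2 ** len(w)) if set(w) <= {"0"} else 0.0

def sgale_from_martingale(d, s):
    # Standard scaling: d_s(w) = 2^((s-1)|w|) * d(w). The martingale
    # condition d(w0) + d(w1) = 2 d(w) then gives the s-gale condition
    # d_s(w0) + d_s(w1) = 2^s * d_s(w).
    return lambda w: 2 ** ((s - 1) * len(w)) * d(w)

# Verify the s-gale condition on a few strings for s = 0.5.
g = sgale_from_martingale(d, 0.5)
for w in ["", "0", "00", "01"]:
    assert abs(g(w + "0") + g(w + "1") - 2 ** 0.5 * g(w)) < 1e-9
```

Smaller s handicaps the bettor more, which is why the least s for which some constructive s-gale succeeds on S can serve as the dimension dim(S).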
Randomness in Computability Theory
2000
Cited by 30 (0 self)
Abstract:
We discuss some aspects of algorithmic randomness and state some open problems in this area. The first part is devoted to the question "What is a computably random sequence?" Here we survey some of the approaches to algorithmic randomness and address some questions on these concepts. In the second part we look at the Turing degrees of Martin-Löf random sets. Finally, in the third part we deal with relativized randomness. Here we look at oracles which do not change randomness.
1980 Mathematics Subject Classification: Primary 03D80; Secondary 03D28.
1 Introduction
Formalizations of the intuitive notions of computability and randomness are among the major achievements in the foundations of mathematics in the 20th century. It is commonly accepted that various equivalent formal computability notions, like Turing computability or recursiveness, which were introduced in the 1930s and 1940s adequately capture computability in the intuitive sense. This belief is expressed in the w...
Kolmogorov-Loveland randomness and stochasticity
Annals of Pure and Applied Logic, 2005
Cited by 27 (8 self)
Abstract:
An infinite binary sequence X is Kolmogorov-Loveland (or KL) random if there is no computable nonmonotonic betting strategy that succeeds on X in the sense of having an unbounded gain in the limit while betting successively on bits of X. A sequence X is KL-stochastic if there is no computable nonmonotonic selection rule that selects from X an infinite, biased sequence. One of the major open problems in the field of effective randomness is whether Martin-Löf randomness is the same as KL-randomness. Our first main result states that KL-random sequences are close to Martin-Löf random sequences insofar as every KL-random sequence has arbitrarily dense subsequences that are Martin-Löf random. A key lemma in the proof of this result is that for every effective split of a KL-random sequence at least one of the halves is Martin-Löf random. However, this splitting property does not characterize KL-randomness; we construct a sequence that is not even computably random such that every effective split yields two subsequences that are 2-random. Furthermore, we show for any KL-random sequence A that is computable in the halting problem that, first, for any effective split of A both halves are Martin-Löf random and, second, for any computable, nondecreasing, and unbounded function g ...
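The nonmonotonic betting game used in this definition can be sketched in simplified finite form: at each step the strategy names a still-unvisited position of X and a stake (possibly zero, so it can inspect a bit without risking capital). Both the input family (sequences with X[2n] = X[2n+1]) and the strategy below are invented to illustrate the game mechanics, not taken from the paper:

```python
def run_nonmonotonic(strategy, X, rounds):
    # Nonmonotonic betting game: each step the strategy names an unvisited
    # position and a stake on that bit being "1"; the payoff is fair, so
    # the stake is won or lost in full.
    capital = 1.0
    seen = {}
    for _ in range(rounds):
        pos, stake = strategy(seen, capital)
        capital += stake if X[pos] == "1" else -stake
        seen[pos] = X[pos]
    return capital

def pair_strategy(seen, capital):
    # Invented strategy for sequences with X[2n] == X[2n+1]: peek at the
    # odd position of the current pair with a zero stake, then bet all
    # capital on the even position, whose bit is by then already known.
    n = len(seen) // 2
    if len(seen) % 2 == 0:
        return 2 * n + 1, 0.0
    known = seen[2 * n + 1]
    return 2 * n, capital if known == "1" else -capital
```

On X = "00111100" (four equal pairs) the capital doubles once per pair, so it grows without bound on any infinite sequence of this form; "unbounded gain in the limit" in the definition is exactly this kind of success. Note that the order 1, 0, 3, 2, ... is nonmonotonic: the strategy bets on position 0 only after seeing position 1.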
Prediction and Dimension
Journal of Computer and System Sciences, 2002
Cited by 17 (2 self)
Abstract:
Given a set X of sequences over a finite alphabet, we investigate the following three quantities. (i) The feasible predictability of X is the highest success ratio that a polynomial-time randomized predictor can achieve on all sequences in X. ...
The Kolmogorov-Loveland stochastic sequences are not closed under selecting subsequences
Journal of Symbolic Logic, 2002
Cited by 15 (5 self)
Abstract:
It is shown that the class of Kolmogorov-Loveland stochastic sequences is not closed under selecting subsequences by monotonic computable selection rules. This result gives a strong negative answer to the notorious open problem of whether the Kolmogorov-Loveland stochastic sequences are closed under selecting subsequences by Kolmogorov-Loveland selection rules, i.e., by not necessarily monotonic partially computable selection rules. As a corollary, we obtain an easy proof of the previously known result that the Kolmogorov-Loveland stochastic sequences form a proper subclass of the Mises-Wald-Church stochastic sequences.
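A monotonic selection rule decides, from the prefix read so far and before seeing the next bit, whether to select that bit; a sequence is stochastic for a class of rules when no rule in the class selects an infinite subsequence with a biased frequency of 1s. A minimal sketch of the monotonic case, with the rule and input invented for illustration:

```python
def select(rule, X):
    # Monotonic selection: scan left to right; the rule sees only the
    # prefix X[:i] when deciding whether to select bit X[i].
    return [b for i, b in enumerate(X) if rule(X[:i])]

# The alternating sequence is far from stochastic: the rule "select the
# bit right after each 0" picks out only 1s.
X = "01" * 10
picked = select(lambda prefix: prefix.endswith("0"), X)
```

Kolmogorov-Loveland rules generalize this by letting the rule choose which position to read next (not necessarily left to right) and decide selection before reading it; the result above separates the monotonic and nonmonotonic notions for stochasticity.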
The Complexity of Stochastic Sequences
Conference on Computational Complexity, 2003
Cited by 11 (4 self)
Abstract:
We observe that known results on the Kolmogorov complexity of prefixes of effectively stochastic sequences extend to corresponding random sequences. First, there are recursively random sequences such that for any nondecreasing and unbounded computable function f and for almost all n, the uniform complexity of the length-n prefix of the sequence is bounded by f(n). Second, a similar result with bounds of the form f(n) log n holds for partial-recursively random sequences.
Immunity and Pseudorandomness of Context-Free Languages
, 902
Cited by 7 (6 self)
Abstract:
We examine the computational complexity of context-free languages, mainly concentrating on two well-known structural properties: immunity and pseudorandomness. An infinite language is REG-immune (resp., CFL-immune) if it contains no infinite subset that is a regular (resp., context-free) language. We prove that (i) there is a context-free REG-immune language outside REG/n and (ii) there is a REG-bi-immune language that can be computed deterministically using logarithmic space. We also show that (iii) there is a CFL-simple set, where a CFL-simple language is an infinite context-free language whose complement is CFL-immune. Similar to REG-immunity, a REG-primeimmune language has no polynomially dense subsets that are also regular. We further prove that (iv) there is a context-free language that is REG/n-bi-primeimmune but not even REG-immune. Concerning the pseudorandomness of context-free languages, we show that (v) CFL contains REG/n-pseudorandom languages. Finally, we prove that (vi) against REG/n, there exists an almost 1-1 pseudorandom generator computable by nondeterministic pushdown automata equipped with a write-only output tape and (vii) against REG, there is no almost 1-1 weak pseudorandom generator computable deterministically in linear time by a single-tape Turing machine.
An Algorithmic Complexity Interpretation of Lin’s Third Law of Information Theory
2008
Cited by 6 (2 self)
Abstract:
Instead of static entropy, we assert that the Kolmogorov complexity of a static structure such as a solid is the proper measure of disorder (or chaoticity). A static structure in a surrounding perfectly random universe acts as an interfering entity which introduces local disruption in randomness. This is modeled by a selection rule R which selects a subsequence of the random input sequence that hits the structure. Through the inequality that relates stochasticity and chaoticity of random binary sequences, we maintain that Lin’s notion of stability corresponds to the stability of the frequency of 1s in the selected subsequence. This explains why more complex static structures are less stable. Lin’s third law is represented as the inevitable change that static structures undergo towards conforming to the universe’s perfect randomness.
Feasible Reductions to Kolmogorov-Loveland Stochastic Sequences
Theoretical Computer Science, 1999
Cited by 5 (0 self)
Abstract:
For every binary sequence A, there is an infinite binary sequence S such that A is polynomial-time truth-table reducible to S (A ≤p-tt S) and S is stochastic in the sense of Kolmogorov and Loveland.