Results 1–6 of 6
The complexity of finite objects and the development of the concepts of information and randomness by means of the theory of algorithms
Russian Math. Surveys, 1970
Abstract

Cited by 189 (1 self)
In 1964 Kolmogorov introduced the concept of the complexity of a finite object (for instance, a word in a certain alphabet). He defined complexity as the minimum number of binary signs containing all the information about a given object that are sufficient for its recovery (decoding). This definition depends essentially on the method of decoding. However, by means of the general theory of algorithms, Kolmogorov was able to give an invariant (universal) definition of complexity. Related concepts were investigated by Solomonoff (U.S.A.) and Markov. Using the concept of complexity, Kolmogorov gave definitions of the quantity of information in finite objects and of the concept of a random sequence (which was later made more precise by Martin-Löf). Afterwards, this circle of questions developed rapidly; in particular, Markov's ideas on applying the concept of complexity to quantitative questions in the theory of algorithms were developed in an interesting direction. The present article surveys the fundamental results connected with the remarks above.
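The decoder-dependence that Kolmogorov's invariance theorem addresses can be illustrated with a toy sketch: fix one decoding method, then take the complexity of x to be the length of its shortest description. The decoder and its table below are hypothetical, chosen only for illustration, not anything from the survey.

```python
from itertools import product

# Hypothetical toy decoder: each pair of binary signs decodes to a
# short word over the alphabet {a, b}.
TABLE = {'00': 'a', '01': 'b', '10': 'aa', '11': 'bb'}

def decode(p):
    """Decode a binary description p; odd-length descriptions are invalid."""
    if len(p) % 2 != 0:
        return ''
    return ''.join(TABLE[p[i:i + 2]] for i in range(0, len(p), 2))

def complexity(x, max_len=12):
    """Minimum number of binary signs sufficient to recover x under
    `decode`, found by brute-force search over short descriptions."""
    for n in range(max_len + 1):
        for bits in product('01', repeat=n):
            if decode(''.join(bits)) == x:
                return n
    return None  # no description of length <= max_len
```

For example, `complexity('aaaa')` is 4 here (via the description `'1010'`), while a decoder lacking the `'10' -> 'aa'` rule would need 8 signs; the universal definition removes exactly this dependence on the decoding method, up to an additive constant.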
The Power of Vacillation in Language Learning
1992
Abstract

Cited by 44 (11 self)
Some extensions are considered of Gold's influential model of language learning by machine from positive data. Studied are criteria of successful learning featuring convergence in the limit to vacillation between several alternative correct grammars. The main theorem of this paper is that there are classes of languages that can be learned if convergence in the limit to up to (n+1) exactly correct grammars is allowed, but which cannot be learned if convergence in the limit is to no more than n grammars, even when those n grammars may each make finitely many mistakes. This contrasts sharply with results of Barzdin and Podnieks and, later, Case and Smith, for learnability from both positive and negative data. A subset principle from a 1980 paper of Angluin is extended to the vacillatory and other criteria of this paper. This principle provides a necessary condition for circumventing overgeneralization in learning from positive data. It is applied to prove another theorem to the eff...
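Gold's basic criterion, conjecturing a grammar after each positive datum and converging in the limit, can be sketched as a minimal toy (not the paper's construction; the grammar names and the finite language class below are invented for illustration). Enumerating smaller languages first reflects Angluin's subset principle, which guards against overgeneralization from positive data.

```python
def learn_in_limit(hypotheses, text):
    """Gold-style learning from positive data: after each datum, conjecture
    the first grammar in a fixed enumeration whose language contains all
    data seen so far.  Returns the sequence of conjectures."""
    seen = set()
    conjectures = []
    for datum in text:
        seen.add(datum)
        for name, language in hypotheses:
            if seen <= language:       # grammar is consistent with the data
                conjectures.append(name)
                break
        else:
            conjectures.append(None)   # no consistent grammar in the class
    return conjectures

# Hypothetical class: a smaller language enumerated before its superset.
HYPOTHESES = [('G1', {'a'}), ('G2', {'a', 'b'})]
```

On the text a, b, a the learner conjectures G1, then switches to G2 and stays there, converging to a single correct grammar; the paper's vacillatory criteria relax exactly this requirement of settling on one grammar.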
ON THE COMPUTABILITY OF CONDITIONAL PROBABILITY
Abstract

Cited by 3 (3 self)
We study the problem of computing conditional probabilities, a fundamental operation in statistics and machine learning. In the elementary discrete setting, conditional probability is defined axiomatically, and the search for more constructive definitions is the subject of a rich literature in probability theory and statistics. In the discrete or dominated setting, under suitable computability hypotheses, conditional probabilities are computable. However, we show that in general one cannot compute conditional probabilities. We do this by constructing a pair of computable random variables in the unit interval whose conditional distribution encodes the halting problem at almost every point. We show that this result is tight, in the sense that given an oracle for the halting problem, one can compute this conditional distribution. On the other hand, we show that conditioning in abstract settings is computable in the presence of certain additional structure, such as independent absolutely continuous noise.
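In the elementary discrete setting the abstract mentions, conditioning is a finite computation, which a few lines can illustrate (a toy sketch; representing the joint pmf as a dict keyed by outcome pairs is an assumption made only for this example). The paper's point is that no such finite recipe exists in general.

```python
def condition(joint, y):
    """P(X = x | Y = y) from a finite joint pmf given as {(x, y): prob}.
    In the discrete case this is just arithmetic, hence computable."""
    marginal = sum(p for (_, yy), p in joint.items() if yy == y)
    if marginal == 0:
        raise ValueError('conditioning on a probability-zero event')
    return {x: p / marginal for (x, yy), p in joint.items() if yy == y}

# Example joint distribution of two bits (dyadic probabilities, so the
# arithmetic below is exact in floating point).
JOINT = {(0, 0): 0.125, (1, 0): 0.375, (0, 1): 0.25, (1, 1): 0.25}
```

The construction in the paper shows that once the joint distribution is merely a computable measure on the unit interval, the analogous map from (distribution, conditioning value) to conditional distribution can encode the halting problem.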
TURING DEGREES OF REALS OF POSITIVE EFFECTIVE PACKING DIMENSION
Abstract

Cited by 3 (2 self)
A relatively longstanding question in algorithmic randomness, due to Jan Reimann, is whether there is a Turing cone of broken dimension. That is, is there a real A such that {B : B ≤T A} contains no 1-random real, yet contains elements of nonzero effective Hausdorff dimension? We show that the answer is affirmative if Hausdorff dimension is replaced by its inner analogue, packing dimension. We construct a minimal degree of effective packing dimension 1. This leads us to examine the Turing degrees of reals with positive effective packing dimension. Unlike effective Hausdorff dimension, this is a notion of complexity shared by both random and sufficiently generic reals. We provide a characterization of the c.e. array noncomputable degrees in terms of effective packing dimension.
Algorithmic Randomness and Computability
Abstract

Cited by 2 (2 self)
We examine some recent work which has made significant progress in our understanding of algorithmic randomness, relative algorithmic randomness, and their relationship with algorithmic computability and relative algorithmic computability.
NONCUPPING, MEASURE AND COMPUTABLY ENUMERABLE SPLITTINGS
Abstract

Cited by 1 (1 self)
We show that there is a computably enumerable function f (i.e. one computably approximable from below) which dominates almost all functions, yet f ⊕ W is incomplete for all incomplete computably enumerable sets W. Our main methodology is the LR equivalence relation on reals: A ≡LR B iff the notions of A-randomness and B-randomness coincide. We also show that there are c.e. sets which cannot be split into two c.e. sets of the same LR degree. Moreover, a c.e. set is low for random iff it computes no c.e. set with this property.