Results 1–10 of 15
A Guided Tour of Minimal Indices and Shortest Descriptions
 Archive for Mathematical Logic
, 1997
Abstract

Cited by 8 (2 self)
The set of minimal indices of a Gödel numbering φ is defined as MIN_φ = {e : (∀i < e)[φ_i ≠ φ_e]}. It has been known since 1972 that MIN_φ ≡_T ∅″, but beyond this MIN_φ has remained mostly uninvestigated. This thesis collects the scarce results on MIN_φ from the literature and adds some new observations, including that MIN_φ is autoreducible, but neither regressive nor (1, 2)-computable. We also study several variants of MIN_φ that have been defined in the literature, such as size-minimal indices, shortest descriptions, and minimal indices of decision tables. Some challenging open problems are left for the adventurous reader. 1 Introduction How long is the shortest program that solves your problem? There are at least two ways to interpret this question, depending on the type of problem involved. If the program's task is to output one specific object, we are looking for a shortest description of that object. This interpretation is closely related to Kolmogorov complexity. Although we have sev...
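The definition of MIN_φ can be made concrete on a toy example. The real set MIN_φ for a Gödel numbering is undecidable (Turing-equivalent to ∅″), so the sketch below substitutes a finite "numbering" of total functions on a small domain, where program equivalence is decidable; all names and the particular functions are illustrative, not from the thesis.

```python
# Toy sketch of MIN = { e : (for all i < e) program i != program e }.
# Decidable only because this "numbering" is a finite list of total
# functions on a finite domain -- a deliberate simplification.

DOMAIN = range(5)

numbering = [
    lambda x: x,          # program 0
    lambda x: x + 1,      # program 1
    lambda x: x,          # program 2: duplicates program 0
    lambda x: x * 2,      # program 3
    lambda x: x + 1,      # program 4: duplicates program 1
]

def same_function(e, i):
    """Extensional equality, decidable here since the domain is finite."""
    return all(numbering[e](x) == numbering[i](x) for x in DOMAIN)

# An index is minimal iff no smaller index computes the same function.
minimal_indices = [e for e in range(len(numbering))
                   if not any(same_function(e, i) for i in range(e))]

print(minimal_indices)  # → [0, 1, 3]
```

Each function keeps only its least index: the duplicates at 2 and 4 are excluded because programs 0 and 1 already compute them.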
Learning in Friedberg Numberings
 Algorithmic Learning Theory: 18th International Conference, ALT 2007, Sendai, Japan, 2007, Proceedings. Springer, Lecture Notes in Artificial Intelligence
Abstract

Cited by 6 (1 self)
Abstract. In this paper we consider learnability in some special numberings, such as Friedberg numberings, which contain all the recursively enumerable languages but have a simpler grammar equivalence problem compared to acceptable numberings. We show that every explanatorily learnable class can be learnt in some Friedberg numbering. However, such a result does not hold for behaviourally correct learning or finite learning. One can also show that some Friedberg numberings are so restrictive that all classes which can be explanatorily learnt in such Friedberg numberings have only finitely many infinite languages. We also study similar questions for several properties of learners such as consistency, conservativeness, prudence, iterativeness and non-U-shaped learning. Besides Friedberg numberings, we also consider the above problems for programming systems with K-recursive grammar equivalence problem. 1
Control Structures in Hypothesis Spaces: The Influence on Learning
Abstract

Cited by 3 (1 self)
In any learnability setting, hypotheses are conjectured from some hypothesis space. Studied herein are the effects on learnability of the presence or absence of certain control structures in the hypothesis space. First presented are control structure characterizations of some rather specific but illustrative learnability results. Then presented are the main theorems. Each of these characterizes the invariance of a learning class over hypothesis space V (and a little more about V) as: V has suitable instances of all denotational control structures. 1 Introduction In any learnability setting, hypotheses are conjectured from some hypothesis space, for example, in [OSW86] from general purpose programming systems, in [ZL95, Wie78] from subrecursive systems, and in [Qui92] from very simple classes of classificatory decision trees. Much is known theoretically about the restrictions on learning power resulting from restricted hypothesis spaces [ZL95]. In the present paper we begin to...
A Short History of Minimal Indices
, 1996
Abstract

Cited by 2 (2 self)
Abstracting from concrete machine models, the question translates into minimal indices with respect to a numbering of the computable partial functions. The first part of the paper tells the history of this problem, collecting the known results. The second part offers some new observations, and the last part concludes with a list of open problems. We will only consider Gödel numberings. A Gödel numbering is an effective numbering φ of all computable partial functions such that for every effective numbering ψ a φ-index can be computed from a ψ-index. We will also use Kolmogorov numberings. A Gödel numbering is a Kolmogorov numbering if there is a linearly bounded computable function that transforms ψ-indices into φ-indices. It is well known that Kolmogorov numberings exist. Definition 1.1 Let φ be a Gödel numbering. Define MIN_φ := {e : (∀i < e)[φ_i ≠ φ_e]}, the set of minimal indices of φ. What would happen if instead of Gödel numberings arbitrary numberings of the computable, partial fun...
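The index-translation condition in these definitions can be sketched on a toy pair of numberings. Here φ and ψ are finite lists of total functions (illustrative stand-ins, not real Gödel numberings), and the translator t maps each ψ-index to a φ-index for the same function; since t(i) = 2i is linearly bounded, it mimics the defining property of a Kolmogorov numbering quoted above.

```python
# Toy sketch of index translation between two "numberings".
# phi_e(x) = x + e; psi enumerates the same functions, but only those
# sitting at even phi-indices.

phi = [lambda x, k=k: x + k for k in range(10)]   # phi_0 .. phi_9
psi = [phi[2 * i] for i in range(5)]              # psi_i = phi_{2i}

def t(i):
    """Computable translation of psi-indices into phi-indices.
    Linearly bounded: t(i) = 2*i <= 2*i + 0."""
    return 2 * i

# Every psi-index translates to a phi-index of the same function.
assert all(psi[i](7) == phi[t(i)](7) for i in range(5))
print([t(i) for i in range(5)])  # → [0, 2, 4, 6, 8]
```

For a genuine Gödel numbering the translator must exist for *every* effective numbering ψ; the sketch only shows the shape of the condition for one fixed pair.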
 Algorithmic Learning Theory, 18th International Conference, ALT 2007, Springer Lecture Notes in Artificial Intelligence 4754:64–78
, 2002
Abstract

Cited by 1 (1 self)
Abstract. This work extends studies of Angluin, Lange and Zeugmann on the dependence of learning on the hypotheses space chosen for the class. In subsequent investigations, uniformly recursively enumerable hypotheses spaces have been considered. In the present work, the following four types of learning are distinguished: class-comprising (where the learner can choose a uniformly recursively enumerable superclass as hypotheses space), class-preserving (where the learner has to choose a uniformly recursively enumerable hypotheses space of the same class), prescribed (where there must be a learner for every uniformly recursively enumerable hypotheses space of the same class) and uniform (like prescribed, but the learner has to be synthesized effectively from an index of the hypothesis space). While for explanatory learning these four types of learnability coincide, some or all are different for other learning criteria. For example, for conservative learning, all four types are different. Several results are obtained for vacillatory and behaviourally correct learning; three of the four types can be separated, but the relation between prescribed and uniform learning remains open. It is also shown that every (not necessarily uniformly recursively enumerable) behaviourally correct learnable class has a prudent learner, that is, a learner using a hypotheses space such that it learns every set in the hypotheses space. Moreover, the prudent learner can be effectively built from any learner for the class. 1
Numberings optimal for learning
 Algorithmic Learning Theory: 19th International Conference (ALT 2008), volume 5254 of Lecture
Abstract

Cited by 1 (1 self)
Abstract. This paper extends previous studies on learnability in non-acceptable numberings by considering the question: for which criteria which numberings are optimal; that is, for which numberings one can learn every learnable class using the given numbering as hypothesis space. Furthermore, an effective version of optimality is studied as well. It is shown that the effectively optimal numberings for finite learning are just the acceptable numberings. In contrast to this, there are non-acceptable numberings which are optimal for finite learning and effectively optimal for explanatory, vacillatory and behaviourally correct learning. The numberings effectively optimal for explanatory learning are the K-acceptable numberings. A similar characterization is obtained for the numberings which are effectively optimal for vacillatory learning. Furthermore, it is studied which numberings are optimal for one and not for another criterion: among the criteria of finite, explanatory, vacillatory and behaviourally correct learning all separations can be obtained; however, every numbering which is optimal for explanatory learning is also optimal for consistent learning. 1
Enumerations of Π⁰₁ Classes: Acceptability and Decidable Classes
, 2006
Abstract

Cited by 1 (1 self)
A Π⁰₁ class is an effectively closed set of reals. One way to view it is as the set of infinite paths through a computable tree. We consider the notion of acceptably equivalent numberings of Π⁰₁ classes. We show that a permutation exists between any two acceptably equivalent numberings that preserves the computable content. Furthermore, the most commonly used numberings of the Π⁰₁ classes are acceptably equivalent. We also consider decidable Π⁰₁ classes in enumerations. A decidable Π⁰₁ class may be represented by a unique computable tree without dead ends, but we show that this tree may not show up in an enumeration of uniformly computable trees which gives rise to all Π⁰₁ classes. In fact this is guaranteed to occur for some decidable Π⁰₁ class. These results are motivated by structural questions concerning the upper semilattice of enumerations of Π⁰₁ classes, where notions such as acceptable equivalence arise. 1
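The "infinite paths through a computable tree" picture can be made concrete with a small sketch. The tree below (binary strings with no two consecutive 1s) is illustrative only: real Π⁰₁ classes range over all of Cantor space, and membership in an arbitrary computable tree need not be as simple as a substring test.

```python
# Toy sketch: a Pi^0_1 class presented as the infinite paths through a
# computable tree. The tree here is the set of binary strings avoiding
# the block "11"; its infinite paths form an effectively closed set.

from itertools import product

def in_tree(sigma):
    """Computable membership test for finite binary strings."""
    return "11" not in sigma

def level(n):
    """All tree nodes of length n."""
    return ["".join(bits) for bits in product("01", repeat=n)
            if in_tree("".join(bits))]

# The number of nodes at each depth grows like the Fibonacci numbers,
# so the tree is infinite and (by Koenig's lemma) has infinite paths.
print([len(level(n)) for n in range(1, 7)])  # → [2, 3, 5, 8, 13, 21]
```

An enumeration of Π⁰₁ classes, in the sense of the abstract, would be a uniformly computable list of such membership tests, one tree per index.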
Consistent Partial Identification
Abstract

Cited by 1 (1 self)
This study contrasts consistent partial identification with learning in the limit. Here partial identification means that the learner outputs an infinite sequence of conjectures in which one correct hypothesis occurs infinitely often and all other hypotheses occur only finitely often. Consistency means that every conjecture is correct on all the data seen so far. Learning in the limit means that, from some point on, the learner always outputs the same correct hypothesis. As the class of all total recursive functions can be partially identified, the constraint of consistency has to be added to make a meaningful comparison with learning in the limit. For the version of consistency where the learner has to be defined and consistent on all inputs, it is shown that the power of the learning criterion depends on whether the function to be learnt is fed in canonical order or in arbitrary order. In the first case, consistent partial identification is incomparable to learning in the limit; in the second case, it is equivalent to consistent learning in the limit with arbitrarily fed input. Furthermore, the inference degrees of these criteria are investigated. For the case where the function is fed in canonical order, there are just two inference degrees: the trivial one, which contains all oracles of hyperimmune-free Turing degree, and the omniscient one, which contains all oracles of hyperimmune Turing degree. In the case that the function is fed in arbitrary order, the picture is more complicated and the omniscient inference degree contains exactly all oracles of high Turing degree. 1
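The consistency requirement in this abstract is easy to demonstrate on a toy hypothesis space. The sketch below is a learner by enumeration (hypothesis space, target, and canonical-order feeding are all illustrative choices, not the paper's construction); at each step it conjectures the least hypothesis consistent with every data point seen so far. It shows consistency only: the "infinitely often" behaviour that distinguishes partial identification cannot be exhibited in a finite run.

```python
# Toy sketch of a consistent learner fed a function in canonical order.
# Hypothesis space: H_k(x) = x + k for k = 0..9; target function is H_3.

hypotheses = [lambda x, k=k: x + k for k in range(10)]
target = 3

conjectures = []
for n in range(1, 8):
    # Data seen so far: f(0), ..., f(n-1) in canonical order.
    data = [(x, hypotheses[target](x)) for x in range(n)]
    # Conjecture the least index consistent with all data seen so far.
    guess = next(k for k, h in enumerate(hypotheses)
                 if all(h(x) == y for x, y in data))
    conjectures.append(guess)

print(conjectures)  # → [3, 3, 3, 3, 3, 3, 3]
```

On this trivially separable space the first data point already pins down the target, so the learner is consistent and converges immediately; richer spaces are exactly where consistency and limit-learning come apart.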
Effectively Closed Sets and Enumerations
, 2007
Abstract
An effectively closed set, or Π⁰₁ class, may be viewed as the set of infinite paths through a computable tree. A numbering, or enumeration, is a map from ω onto a countable collection of objects. One numbering is reducible to another if equality holds after the second is composed with a computable function. Many commonly used numberings of Π⁰₁ classes are shown to be mutually reducible via a computable permutation. Computable injective numberings are given for the family of Π⁰₁ classes and for the subclasses of decidable and of homogeneous Π⁰₁ classes. However, no computable numberings exist for small or thin classes. No computable numbering of trees exists that includes all computable trees without dead ends. 1