Results 1–9 of 9
Infinitary Self Reference in Learning Theory
, 1994
Abstract

Cited by 18 (6 self)
Kleene's Second Recursion Theorem provides a means for transforming any program p into a program e(p) which first creates a quiescent self-copy and then runs p on that self-copy together with any externally given input. e(p), in effect, has complete (low-level) self-knowledge, and p represents how e(p) uses its self-knowledge (and its knowledge of the external world). Infinite regress is not required, since e(p) creates its self-copy outside of itself. One mechanism to achieve this creation is a self-replication trick isomorphic to that employed by single-celled organisms. Another is for e(p) to look in a mirror to see which program it is. In 1974 the author published an infinitary generalization of Kleene's theorem which he called the Operator Recursion Theorem. It provides a means for obtaining an (algorithmically) growing collection of programs which, in effect, share a common (also growing) mirror from which they can obtain complete low level models of themselves and the other prog...
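The self-copy construction described in the abstract can be illustrated with a quine-style sketch in Python. The helper names (`e`, `run`, the template) are my own illustrative assumptions, not the paper's formal construction: e(p) yields a program that first rebuilds its own source text and then hands that self-copy, together with the external input, to p.

```python
def e(p_source):
    # Given the source of a function p(self_copy, x), build the source of a
    # program that (1) reconstructs its own full source text via the quine
    # trick and (2) runs p on that self-copy plus an external input x.
    # Illustrative sketch only, not the paper's formal e(p).
    template = (
        "P = %r\n"
        "T = %r\n"
        "self_copy = T %% (P, T)\n"  # rebuilds this very program's source
        "exec(P)\n"                  # defines p
        "def run(x):\n"
        "    return p(self_copy, x)\n"
    )
    return template % (p_source, template)

# A sample p that just reports the length of its self-copy and echoes x.
prog = e("def p(self_copy, x):\n    return (len(self_copy), x)")
ns = {}
exec(prog, ns)
```

Here `ns["run"](x)` behaves like e(p) applied to x: the `self_copy` it passes to p is character-for-character identical to `prog` itself, created outside the program with no infinite regress.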
Vacillatory and BC Learning on Noisy Data
 Theoretical Computer Science A
, 1996
Abstract

Cited by 7 (5 self)
this paper considers r.e. subsets L of N. We write ...
Noisy Inference and Oracles
, 1996
Abstract

Cited by 7 (2 self)
A learner noisily infers a function or set if every correct item is presented infinitely often while, in addition, some incorrect data ("noise") is presented a finite number of times. It is shown that learning from a noisy informant is equal to finite learning with a K-oracle from a usual informant. This result has several variants for learning from text and using different oracles. Furthermore, partial identification of all r.e. sets can also cope with noisy input.
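A minimal simulation of this setting (the target set, noise item, and threshold rule are my own assumptions for illustration): correct items recur forever, noise appears only finitely often, and a limiting learner filters the noise with a frequency threshold that grows without bound.

```python
from collections import Counter
from math import isqrt

def noisy_informant(t):
    # A noisy presentation of L = {0, 2, 4, 6, 8}: each correct element
    # recurs infinitely often as t grows, while the noise item 99 occurs
    # only three times in the whole presentation.
    return [99, 99, 99] + [2 * (i % 5) for i in range(t)]

def limit_learner(prefix):
    # Conjecture exactly the items whose frequency reaches a threshold
    # that grows unboundedly but sublinearly; any finite amount of noise
    # eventually falls below it, so the conjectures converge to L.
    counts = Counter(prefix)
    threshold = isqrt(len(prefix))
    return {x for x, c in counts.items() if c >= threshold}
```

For every sufficiently long prefix the conjecture stabilises on {0, 2, 4, 6, 8}: identification in the limit despite the noise.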
Program synthesis in the presence of infinite number of inaccuracies
 in Proceedings of the 5th Workshop on Algorithmic Learning Theory
, 1994
Abstract

Cited by 6 (1 self)
Most studies modeling inaccurate data in Gold-style learning consider cases in which the number of inaccuracies is finite. The present paper argues that this approach is not reasonable for modeling inaccuracies in concepts that are infinite in nature (for example, graphs of computable functions). The effect of an infinite number of inaccuracies in the input data in Gold's model of learning is considered in the context of identification in the limit of computer programs from graphs of computable functions. Three kinds of inaccuracies, namely noisy data, incomplete data, and imperfect data, are considered. The amount of each of these inaccuracies in the input is measured using certain density notions. A number of interesting hierarchy results are shown based on the densities of inaccuracies present in the input data. Several results establishing tradeoffs between the density and type of inaccuracies are also derived.
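As a toy illustration of grading inaccuracies by density (the target function and corruption pattern are invented for this example; the paper's notions are asymptotic, not finite), one can measure the fraction of presented pairs that disagree with the target function's graph:

```python
def inaccuracy_density(stream, f):
    # Fraction of data points (x, y) in a finite prefix that disagree
    # with the graph of the target f -- a finite stand-in for the
    # asymptotic density notions used to grade inaccuracies.
    bad = sum(1 for x, y in stream if f(x) != y)
    return bad / len(stream)

# Target f(x) = x * x, with every tenth pair corrupted (noisy data).
target = lambda x: x * x
stream = [(x, x * x + (1 if x % 10 == 0 else 0)) for x in range(100)]
```

On this prefix the noise density is 10/100 = 0.1; hierarchy results in the paper separate learnability at different such densities.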
Learning from Multiple Sources of Inaccurate Data
 in Proceedings of the International Workshop on Analogical and Inductive Inference, Dagstuhl
, 1992
Abstract

Cited by 5 (2 self)
Most theoretical models of inductive inference make the idealized assumption that the data available to a learner is from a single and accurate source. The subject of inaccuracies in data emanating from a single source has been addressed by several authors. The present paper argues in favor of a more realistic learning model in which data emanates from multiple sources, some or all of which may be inaccurate. Three kinds of inaccuracies are considered: spurious data (modeled as noisy texts), missing data (modeled as incomplete texts), and a mixture of spurious and missing data (modeled as imperfect texts). Motivated by the above argument, the present paper introduces and theoretically analyzes a number of inference criteria in which a learning machine is fed data from multiple sources, some of which may be infected with inaccuracies. The learning situation modeled is the identification in the limit of programs from graphs of computable functions. The main parameters of the investigation are: kind of inaccuracy, total number of data sources, number of faulty data sources which produce data within an acceptable bound, and the bound on the number of errors allowed in the final hypothesis learned by the machine. Sufficient conditions are determined under which, for the same kind of inaccuracy, for the same ...
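One simple aggregation rule in the multiple-source spirit (my own illustration, not one of the paper's inference criteria) is a per-argument majority vote, which tolerates any minority of faulty sources:

```python
from collections import Counter

def majority_merge(sources):
    # Pool (x, y) pairs from several sources and, for each argument x,
    # keep the value reported most often; whenever fewer than half of
    # the sources are faulty on x, the majority value is the correct one.
    by_x = {}
    for src in sources:
        for x, y in src:
            by_x.setdefault(x, []).append(y)
    return {x: Counter(ys).most_common(1)[0][0] for x, ys in by_x.items()}

good = [(x, x + 1) for x in range(5)]   # accurate source for f(x) = x + 1
bad = [(x, 0) for x in range(5)]        # a faulty source
merged = majority_merge([good, good, bad])
```

With two accurate sources and one faulty one, the merged data recovers the graph of f exactly.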
A Survey of Inductive Inference with an Emphasis on Queries
 in Complexity, Logic, and Recursion Theory, number 187 in Lecture Notes in Pure and Applied Mathematics Series
, 1997
Abstract

Cited by 4 (0 self)
this paper M_0, M_1, ... is a standard list of all Turing machines, M ...
Learning from Streams
"... Abstract. Learning from streams is a process in which a group of learners separately obtain information about the target to be learned, but they can communicate with each other in order to learn the target. We are interested in machine models for learning from streams and study its learning power (a ..."
Abstract
Learning from streams is a process in which a group of learners separately obtain information about the target to be learned, but they can communicate with each other in order to learn the target. We are interested in machine models for learning from streams and study their learning power (as measured by the collection of learnable classes). We study how the power of learning from streams depends on the two parameters m and n, where n is the number of learners which each track a single stream of input and m is the number of learners (among the n learners) which have to find, in the limit, the right description of the target. We study for which combinations m, n and m′, n′ the following inclusion holds: every class learnable from streams with parameters m, n is also learnable from streams with parameters m′, n′. For the learning of uniformly recursive classes, we get a full characterization which depends only on the ratio m/n; but for general classes the picture is more complicated. Most of the non-inclusions in team learning carry over to non-inclusions with the same parameters in the case of learning from streams; but only few inclusions are preserved and some additional non-inclusions hold. Besides this, we also relate learning from streams to various other closely related and well-studied forms of learning: iterative learning from text, learning from incomplete text, and learning from noisy text.
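The stream setup can be sketched as follows (the round-robin split and the pooled union are my assumptions for illustration): one presentation of the target is divided among n learners, each tracking a single stream, and because the learners may communicate, the team jointly still sees every datum.

```python
def split_into_streams(text, n):
    # Distribute one presentation of the target over n streams,
    # round-robin; learner i tracks stream i.
    return [text[i::n] for i in range(n)]

def pooled_data(streams):
    # Since the learners can communicate, the team effectively has
    # access to the union of all streams.
    return {x for stream in streams for x in stream}

presentation = list(range(12))   # a text for the target {0, ..., 11}
streams = split_into_streams(presentation, 3)
```

No single stream contains the whole text, which is why the parameters m (required successes) and n (team size) interact non-trivially in the paper's results.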
Inferring Descriptive Generalisations of Formal Languages
Abstract
In the present paper, we introduce a variant of Gold-style learners that is not required to infer precise descriptions of the languages in a class, but that must find descriptive patterns, i.e., optimal generalisations within a class of pattern languages. Our first main result characterises those indexed families of recursive languages that can be inferred by such learners, and we demonstrate that this characterisation shows enlightening connections to Angluin's corresponding result for exact inference. Using a notion of descriptiveness that is restricted to the natural subclass of terminal-free E-pattern languages, we introduce a generic inference strategy, and our second main result characterises those classes of languages that can be generalised by this strategy. This characterisation demonstrates that there are major classes of languages that can be generalised in our model, but not inferred by a normal Gold-style learner. Our corresponding technical considerations lead to deep insights of intrinsic interest into combinatorial and algorithmic properties of pattern languages.
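To make the setting concrete, here is a membership test for one of the simplest terminal-free E-pattern languages (the function name is my own, and the example is not taken from the paper): under erasing substitutions x → u, the pattern xx generates exactly the squares uu, including the empty word.

```python
def in_E_language_xx(w):
    # Membership in the E-pattern language of the terminal-free pattern
    # "x x": an (erasing) substitution x -> u generates exactly the
    # words of the form u + u, so w belongs to the language iff it has
    # even length and its two halves coincide.
    half, rem = divmod(len(w), 2)
    return rem == 0 and w[:half] == w[half:]
```

A descriptive pattern for a sample is then an optimal generaliser in this sense: its E-pattern language covers the sample and no other pattern in the class covers it more tightly.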