Results 1 - 5 of 5
Infinitary Self-Reference in Learning Theory
, 1994
Abstract

Cited by 18 (6 self)
Kleene's Second Recursion Theorem provides a means for transforming any program p into a program e(p) which first creates a quiescent self-copy and then runs p on that self-copy together with any externally given input. e(p), in effect, has complete (low-level) self-knowledge, and p represents how e(p) uses its self-knowledge (and its knowledge of the external world). Infinite regress is not required since e(p) creates its self-copy outside of itself. One mechanism to achieve this creation is a self-replication trick isomorphic to that employed by single-celled organisms. Another is for e(p) to look in a mirror to see which program it is. In 1974 the author published an infinitary generalization of Kleene's theorem which he called the Operator Recursion Theorem. It provides a means for obtaining an (algorithmically) growing collection of programs which, in effect, share a common (also growing) mirror from which they can obtain complete low level models of themselves and the other prog...
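The self-copy trick the abstract describes can be sketched with the classic quine technique. This is our illustration, not the paper's construction: `make_e` builds a program text that binds `SELF` to an exact copy of its own source and then hands `(SELF, INPUT)` to a payload `p`; the names `p` and `INPUT` are assumptions made for the demo.

```python
# Sketch of e(p): a generated program that first constructs a quiescent
# self-copy (quine-style), then runs the payload p on that copy plus input.

def make_e():
    # Template: %r re-inserts the template's own repr, %% is a literal %.
    s = 's = %r\nSELF = s %% s\nresult = p(SELF, INPUT)'
    return s % s  # the finished program text

def p(self_copy, x):
    # Example payload: inspect the self-copy and use the external input.
    return (len(self_copy), x)

program = make_e()
env = {"p": p, "INPUT": 42}
exec(program, env)
assert env["SELF"] == program  # the program built an exact copy of itself
print(env["result"][1])        # → 42
```

No infinite regress occurs: the program reconstructs its source from the template `s` rather than containing a literal copy of itself inside itself.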
A Guided Tour of Minimal Indices and Shortest Descriptions
 Archive for Mathematical Logic
, 1997
Abstract

Cited by 8 (2 self)
The set of minimal indices of a Gödel numbering φ is defined as MIN_φ = {e : (∀i < e)[φ_i ≠ φ_e]}. It has been known since 1972 that MIN_φ ≡_T ∅′′, but beyond this MIN_φ has remained mostly uninvestigated. This thesis collects the scarce results on MIN_φ from the literature and adds some new observations, including that MIN_φ is autoreducible, but neither regressive nor (1, 2)-computable. We also study several variants of MIN_φ that have been defined in the literature, like size-minimal indices, shortest descriptions, and minimal indices of decision tables. Some challenging open problems are left for the adventurous reader.

1 Introduction

How long is the shortest program that solves your problem? There are at least two ways to interpret this question depending on the type of problem involved. If the program's task is to output one specific object, we are looking for a shortest description of that object. This interpretation is closely related to Kolmogorov complexity. Although we have sev...
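The definition of MIN_φ can be unpacked on a toy scale. This sketch is our illustration, not the thesis's: a finite list of function tables stands in for a numbering, and an index e is minimal when no earlier index i < e yields the same function. For a genuine Gödel numbering this equality test is undecidable, which is exactly why MIN_φ is hard to compute.

```python
# Toy version of MIN = {e : (∀i < e)[table_i ≠ table_e]} over finite tables.

def minimal_indices(tables):
    # e is minimal iff no i < e has an identical function table
    return {e for e, t in enumerate(tables)
            if all(tables[i] != t for i in range(e))}

tables = [
    {0: 1, 1: 1},  # index 0: constant 1
    {0: 0, 1: 1},  # index 1: identity on {0, 1}
    {0: 1, 1: 1},  # index 2: duplicates index 0, so not minimal
]
print(minimal_indices(tables))  # → {0, 1}
```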
Machine induction without revolutionary changes in hypothesis size
 Information and Computation
, 1996
Abstract

Cited by 3 (2 self)
This paper provides a beginning study of the effects on inductive inference of paradigm shifts whose absence is approximately modeled by various formal approaches to forbidding large changes in the size of programs conjectured. One approach, called severely parsimonious, requires all the programs conjectured on the way to success to be nearly (i.e., within a recursive function of) minimal size. It is shown that this very conservative constraint allows learning infinite classes of functions, but not infinite r.e. classes of functions. Another approach, called nonrevolutionary, requires all conjectures to be nearly the same size as one another. This quite conservative constraint is, nonetheless, shown to permit learning some infinite r.e. classes of functions. Allowing up to one extra bounded size mind change towards a final program learned certainly doesn't appear revolutionary. However, somewhat surprisingly for scientific (inductive) inference, it is shown that there are classes learnable with the nonrevolutionary constraint (respectively, with severe parsimony), up to (i + 1) mind changes, and no anomalies, which classes cannot be learned with no size constraint, an unbounded, finite number of anomalies in the final program, but with no more than i mind changes. Hence, in some cases, the possibility of one extra mind change is considerably more liberating than removal of very conservative size shift constraints. The proofs of these results are also combinatorially interesting.
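The two size constraints can be sketched as predicates over a sequence of conjecture sizes. The names and signatures below are ours, not the paper's formalism: `slack` plays the role of the recursive function bounding the excess over minimal size, and `tolerance` bounds how much any two conjectures may differ.

```python
# Hedged sketches of the two constraints described in the abstract above.

def severely_parsimonious(sizes, minimal_size, slack):
    # every conjecture stays within a recursive slack of the minimal size
    return all(s <= slack(minimal_size) for s in sizes)

def nonrevolutionary(sizes, tolerance):
    # all conjectures are nearly the same size as one another
    return all(abs(a - b) <= tolerance for a in sizes for b in sizes)

sizes = [10, 12, 11]
print(severely_parsimonious(sizes, 10, lambda m: m + 3))  # → True
print(nonrevolutionary(sizes, 2))                         # → True
print(nonrevolutionary([10, 40], 2))                      # → False
```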
Effectivizing Inseparability
, 1991
Abstract

Cited by 2 (0 self)
Smullyan's notion of effectively inseparable pairs of sets is not the best effective/constructive analog of Kleene's notion of pairs of sets inseparable by a recursive set. We present a corrected notion of effectively inseparable pairs of sets, prove a characterization of our notion, and show that the pairs of index sets effectively inseparable in Smullyan's sense are the same as those effectively inseparable in ours. In fact we characterize the pairs of index sets effectively inseparable in either sense, thereby generalizing Rice's Theorem. For subrecursive index sets we have sufficient conditions for various inseparabilities to hold. For inseparability by sets in the same subrecursive class we have a characterization. The latter essentially generalizes Kozen's (and Royer's later) Subrecursive Rice Theorem, and the proof of each result about subrecursive index sets is presented "Rogers style" with care to observe subrecursive restrictions. There are pairs of sets effectively inseparab...
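The underlying separation notion can be sketched on a toy scale; this is our illustration, not the paper's. A set S separates a disjoint pair (A, B) when A ⊆ S and B ∩ S = ∅, and Kleene-style inseparability says no separator exists in a given class. There the class is the recursive sets; here a finite list stands in for it.

```python
# Toy sketch of separability and inseparability-within-a-class.

def separates(S, A, B):
    # S separates (A, B): A is inside S, B is entirely outside S
    return A <= S and not (B & S)

def inseparable_in(candidates, A, B):
    # (A, B) is inseparable by the given class of candidate sets
    return not any(separates(S, A, B) for S in candidates)

A, B = {0, 2}, {1, 3}
print(separates({0, 2, 4}, A, B))              # → True
print(inseparable_in([{0, 1}, {2, 3}], A, B))  # → True
```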
Relations Between Two Types of Memory in Inductive Inference
Abstract

Cited by 1 (0 self)
We consider inductive inference with limited memory [2]. We show that there exists a set U of total recursive functions such that:
- U can be learned with linear long-term memory (and no short-term memory);
- U can be learned with logarithmic long-term memory (and some amount of short-term memory);
- if U is learned with sublinear long-term memory, then the short-term memory exceeds any recursive function.

Thus an open problem posed by Freivalds, Kinber and Smith [2] is solved. To prove our result, we use Kolmogorov complexity.

1 Introduction

There are two kinds of complexity in inductive inference (and learning in general):
- the complexity of computations necessary for learning;
- the complexity of learning itself.

There are some complexity measures that better reflect the complexity of computations and some measures that better reflect the complexity of learning. Several attempts to separate these two kinds of complexity have been made. For space (memory) complexi...
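The learner model with bounded long-term memory can be sketched as a simple loop. The names below are ours, not the formalism of [2]: after each value of the target function, the learner updates a bounded long-term memory string and emits a conjecture; any short-term memory exists only inside a single update call.

```python
# Hedged sketch of a memory-limited learner loop.

def run_learner(update, values, ltm_bound):
    ltm = ""            # long-term memory, carried between examples
    conjecture = None
    for n, v in enumerate(values):
        ltm, conjecture = update(ltm, n, v)  # short-term work happens here
        assert len(ltm) <= ltm_bound, "long-term memory bound exceeded"
    return conjecture

def learn_constant(ltm, n, v):
    # toy learner for constant functions: O(1) long-term memory suffices
    return str(v), ("const", v)

print(run_learner(learn_constant, [5, 5, 5], ltm_bound=4))  # → ('const', 5)
```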