Results 1 - 4 of 4
Inductive Inference with Procrastination: Back to Definitions
Fundamenta Informaticae, 1999
Cited by 8 (2 self)
Abstract:
In this paper, we reconsider the definition of procrastinating learning machines. In the original definition of Freivalds and Smith [FS93], constructive ordinals are used to bound mindchanges. We investigate the possibility of using arbitrary linearly ordered sets to bound mindchanges in a similar way. It turns out that using certain ordered sets it is possible to define inductive inference types different from the previously known ones. We investigate properties of the new inductive inference types and compare them to other types. This research was supported by Latvian Science Council Grant No. 93.599 and NSF Grant 9421640. Some of the results from this paper were presented earlier [AFS96]. † The third author was supported in part by NSF Grant 9301339.
1 Introduction
We study inductive inference using the model developed by Gold [Gol67]. There is a well-known hierarchy of larger and larger classes of learnable sets of phenomena based on the number of times a learning machine is all...
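As an illustration of the ordinal mind-change bounds this abstract refers to (a sketch of the general idea, not code from the paper), the following represents constructive ordinals below ω² as pairs (a, b) standing for ω·a + b. The learner must strictly decrease this counter at every mind change; dropping the leading coefficient lets it pick a fresh finite budget only after seeing data, which is the "procrastination" in the title. The class name and values here are hypothetical.

```python
# Sketch (assumed names, not from the paper): a mind-change counter
# bounded by an ordinal below omega^2, encoded as a pair (a, b) that
# stands for omega*a + b. Lexicographic order on the pairs matches the
# ordinal order, and every mind change must strictly decrease it.

class OrdinalCounter:
    def __init__(self, a, b):
        self.value = (a, b)  # represents omega*a + b

    def decrease(self, new_value):
        # Strict decrease is required; lexicographic comparison on
        # tuples of naturals is well-founded, so this cannot go on forever.
        if not new_value < self.value:
            raise ValueError("ordinal counter must strictly decrease")
        self.value = new_value

counter = OrdinalCounter(1, 0)   # start at omega
counter.decrease((0, 41))        # drop below omega, choosing a budget of 41
for k in range(40, -1, -1):      # now only finitely many decreases remain
    counter.decrease((0, k))
```

The point of the example is that starting at ω already allows an unbounded, but always finite, number of mind changes, chosen after the fact; larger ordinals iterate this freedom.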
On a generalized notion of mistake bounds
 Information and Computation
Cited by 2 (2 self)
Abstract:
This paper proposes the use of constructive ordinals as mistake bounds in the online learning model. This approach elegantly generalizes the applicability of the online mistake bound model to learnability analysis of very expressive concept classes like pattern languages, unions of pattern languages, elementary formal systems, and minimal models of logic programs. The main result in the paper shows that the topological property of effective finite bounded thickness is a sufficient condition for online learnability with a certain ordinal mistake bound. An interesting characterization of the online learning model is shown in terms of the identification in the limit framework. It is established that the classes of languages learnable in the online model with a mistake bound of α are exactly the same as the classes of languages learnable in the limit from both positive and negative data by a Popperian, consistent learner with a mind change bound of α. This result nicely builds a bridge between the two models.
Mind Change Complexity of Learning Logic Programs
Abstract:
The present paper motivates the study of mind change complexity for learning minimal models of length-bounded logic programs. It establishes ordinal mind change complexity bounds for learnability of these classes both from positive facts and from positive and negative facts. Building on Angluin’s notion of finite thickness and Wright’s work on finite elasticity, Shinohara defined the property of bounded finite thickness to give a sufficient condition for learnability of indexed families of computable languages from positive data. This paper shows that an effective version of Shinohara’s notion of bounded finite thickness gives sufficient conditions for learnability with an ordinal mind change bound, both in the context of learnability from positive data and for learnability from complete (both positive and negative) data. Let ω be a notation for the first limit ordinal. Then, it is shown that if a language defining framework yields a uniformly decidable family of languages and has effective bounded finite thickness, then for each natural number m > 0, the class of languages defined by formal systems of length ≤ m:
• is identifiable in the limit from positive data with a mind change bound of ω^m;
• is identifiable in the limit from both positive and negative data with an ordinal mind change bound of ω × m.
The above sufficient conditions are employed to give an ordinal mind change bound for learnability of minimal models of various classes of length-bounded Prolog programs, including Shapiro’s linear programs, Arimura and Shinohara’s depth-bounded linearly-covering programs, and Krishna Rao’s depth-bounded linearly-moded programs. It is also noted that the bound for learning from positive data is tight for the example classes considered.
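To make the bound ω^m concrete (an illustration under the standard Cantor-normal-form encoding, not code from the paper): an ordinal below ω^m can be written as an m-tuple of natural numbers ordered lexicographically, so a mind-change bound of ω^m means the learner carries such a tuple and strictly decreases it at each mind change. The helper name and the sample trace below are hypothetical.

```python
# Sketch (assumed encoding): ordinals below omega**m as m-tuples of
# naturals under lexicographic order. Lexicographic order on such tuples
# is well-founded, so any strictly decreasing trace of mind changes is
# finite, even though its length need not be fixed in advance.

def strictly_decreasing(seq):
    """Check that a sequence of equal-length tuples is strictly
    lexicographically decreasing, i.e. a valid mind-change trace."""
    return all(a > b for a, b in zip(seq, seq[1:]))

# A trace below omega**2 (m = 2): each drop of the leading coefficient
# lets the learner choose a fresh finite budget for the remaining changes.
trace = [(2, 0), (1, 5), (1, 0), (0, 7), (0, 0)]
print(strictly_decreasing(trace))  # True
```

The same encoding with m components models the ω^m bound for positive data, while ω × m corresponds to pairs whose first component never exceeds m.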
Category, Measure, Inductive Inference: A Triality Theorem and its Applications
1997
Abstract:
The famous Sierpiński-Erdős Duality Theorem [Sie34b, Erd43] states, informally, that any theorem about effective measure 0 and/or first category sets is also true when all occurrences of "effective measure 0" are replaced by "first category" and vice versa. This powerful and elegant result shows that "measure" and "category" are equally useful notions, neither of which can be preferred to the other when making formal the intuitive notion of "almost all sets." Effective versions of measure and category are used in recursive function theory and related areas, and resource-bounded versions of the same notions are used in the Theory of Computation. Again they are dual in the same sense. We show that in the world of recursive functions there is a third equipotent notion dual to both measure and category. This new notion is related to learnability (also known as inductive inference or identifiability). We use the term "triality" to describe this three-party duality.
1 Introduction
Mathematicians h...