Results 1 – 8 of 8
Ordinal Mind Change Complexity of Language Identification
Abstract

Cited by 18 (6 self)
The approach of ordinal mind change complexity, introduced by Freivalds and Smith, uses (notations for) constructive ordinals to bound the number of mind changes made by a learning machine. This approach provides a measure of the extent to which a learning machine has to keep revising its estimate of the number of mind changes it will make before converging to a correct hypothesis for languages in the class being learned. Recently, this notion, which also yields a measure for the difficulty of learning a class of languages, has been used to analyze the learnability of rich concept classes. The present paper further investigates the utility of ordinal mind change complexity. It is shown that for identification from both positive and negative data and n ≥ 1, the ordinal mind change complexity of the class of languages formed by unions of up to n + 1 pattern languages is only ω ×O notn(n) (where notn(n) is a notation for n, ω is a notation for the least limit ordinal, and ×O denotes ordinal multiplication). This result nicely extends an observation of Lange and Zeugmann ...
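For readers unfamiliar with the class the bound applies to: a pattern language (in Angluin's sense) is the set of strings obtained from a pattern of constants and variables by substituting a nonempty terminal string for each variable, with repeated variables substituted consistently. A minimal, illustrative membership test, with a hypothetical pattern representation that is not from the paper:

```python
def matches(pattern, s, subst=None):
    """Return True if string s is in the pattern language of `pattern`.

    A pattern is a list of ('c', char) items (terminal symbols) and
    ('v', name) items (variables).  Every occurrence of a variable must
    be replaced by the same nonempty terminal string.  Simple
    backtracking search; illustrative only, not efficient.
    """
    if subst is None:
        subst = {}
    if not pattern:
        return s == ""
    kind, val = pattern[0]
    if kind == 'c':                      # terminal: must match literally
        return s.startswith(val) and matches(pattern[1:], s[len(val):], subst)
    if val in subst:                     # variable already bound
        w = subst[val]
        return s.startswith(w) and matches(pattern[1:], s[len(w):], subst)
    for i in range(1, len(s) + 1):       # try every nonempty prefix
        subst[val] = s[:i]
        if matches(pattern[1:], s[i:], subst):
            return True
        del subst[val]                   # undo binding and backtrack
    return False

# The pattern x a x: the language of all strings of the form w a w.
p = [('v', 'x'), ('c', 'a'), ('v', 'x')]
```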
On Learning Unions of Pattern Languages and Tree Patterns
, 1999
Abstract

Cited by 10 (0 self)
We present efficient online algorithms for learning unions of a constant number of tree patterns, unions of a constant number of one-variable pattern languages, and unions of a constant number of pattern languages with fixed length substitutions. By fixed length substitutions we mean that each occurrence of a variable x_i must be substituted by terminal strings of fixed length l(x_i). We prove that if arbitrary unions of pattern languages with fixed length substitutions can be learned efficiently, then DNFs are efficiently learnable in the mistake bound model. Since we use a reduction to Winnow, our algorithms are robust against attribute noise. Furthermore, they can be modified to handle concept drift. Also, our approach is quite general and may be applicable to learning other pattern-related classes. For example, we could learn a more general pattern language class in which a penalty (i.e. weight) is assigned to each violation of the rule that a terminal symbol cannot be changed ...
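The mistake-bound guarantees in the abstract come via a reduction to Winnow. For reference, here is a minimal sketch of the classical Winnow update rule for monotone disjunctions over Boolean attributes; the parameter names and defaults are our own, not the paper's:

```python
def winnow(examples, n, threshold=None, alpha=2.0):
    """Winnow for learning a monotone disjunction over n Boolean attributes.

    Multiplicative weight updates give a mistake bound logarithmic in n.
    `examples` is a stream of (x, label) pairs with x a 0/1 list and
    label in {0, 1}.  Returns the final weights and the mistake count.
    """
    if threshold is None:
        threshold = n / 2
    w = [1.0] * n
    mistakes = 0
    for x, label in examples:
        pred = 1 if sum(w[i] * x[i] for i in range(n)) >= threshold else 0
        if pred != label:
            mistakes += 1
            if label == 1:               # false negative: promote active weights
                for i in range(n):
                    if x[i]:
                        w[i] *= alpha
            else:                        # false positive: demote active weights
                for i in range(n):
                    if x[i]:
                        w[i] /= alpha
    return w, mistakes
```

Because the updates are multiplicative and only touch attributes that are active in the misclassified example, irrelevant attributes cost little, which is the source of the attribute-noise robustness the abstract mentions.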
Inductive Inference with Procrastination: Back to Definitions
 Fundamenta Informaticae
, 1999
Abstract

Cited by 8 (2 self)
In this paper, we reconsider the definition of procrastinating learning machines. In the original definition of Freivalds and Smith [FS93], constructive ordinals are used to bound mind changes. We investigate the possibility of using arbitrary linearly ordered sets to bound mind changes in a similar way. It turns out that using certain ordered sets it is possible to define inductive inference types different from the previously known ones. We investigate properties of the new inductive inference types and compare them to other types. This research was supported by Latvian Science Council Grant No. 93.599 and NSF Grant 9421640. Some of the results from this paper were presented earlier [AFS96]. † The third author was supported in part by NSF Grant 9301339. 1 Introduction We study inductive inference using the model developed by Gold [Gol67]. There is a well-known hierarchy of larger and larger classes of learnable sets of phenomena based on the number of times a learning machine is all...
Learning Elementary Formal Systems with Queries
, 2000
Abstract

Cited by 4 (3 self)
An elementary formal system (EFS, for short) is a kind of logic program which directly manipulates character strings. A number of researchers have investigated the ability of EFS as a uniform framework for language learning in various learning models including model inference, inductive inference, and PAC-learning. In this paper, we investigate the polynomial time learnability of EFS from the viewpoint of active learning allowing membership queries. Positive results include the polynomial time learnability of the class of terminating HEFS of variable-occurrence k and arity r from equivalence queries and entailment membership queries with information on termination. We also present a lower bound result showing that the algorithm is near optimal in query complexity. Negative results include a series of representation-independent hardness results, which, to our knowledge, fill the gap between the learnable and the non-learnable subclasses of EFS. Particularly, we showed th...
Parsimony Hierarchies for Inductive Inference
 Journal of Symbolic Logic
Abstract

Cited by 2 (1 self)
Freivalds defined an acceptable-programming-system-independent criterion for learning programs for functions, in which the final programs are required to be both correct and "nearly" minimal size, i.e., within a computable function of being purely minimal size. Kinber showed that this parsimony requirement on final programs limits learning power. However, in scientific inference, parsimony is considered highly desirable. A lim-computable function is (by definition) one calculable by a total procedure allowed to change its mind finitely many times about its output. Investigated is the possibility of assuaging somewhat the limitation on learning power resulting from requiring parsimonious final programs by use of criteria which require the final, correct programs to be "not-so-nearly" minimal size, e.g., to be within a lim-computable function of actual minimal size. It is shown that some parsimony in the final program is thereby retained, yet learning power strictly increases. Considered, then, are lim-computable functions as above, but for which notations for constructive ordinals are used to bound the number of mind changes allowed regarding the output. This is a variant of an idea introduced by Freivalds and Smith. For this ordinal-notation-complexity-bounded version of lim-computability, the powers of the resultant learning criteria form finely graded, infinitely ramifying, infinite hierarchies intermediate between the computable and the lim-computable cases. Some of these hierarchies, for the natural notations determining them, are shown to be optimally tight.
On a generalized notion of mistake bounds
 Information and Computation
Abstract

Cited by 2 (2 self)
This paper proposes the use of constructive ordinals as mistake bounds in the online learning model. This approach elegantly generalizes the applicability of the online mistake bound model to learnability analysis of very expressive concept classes like pattern languages, unions of pattern languages, elementary formal systems, and minimal models of logic programs. The main result in the paper shows that the topological property of effective finite bounded thickness is a sufficient condition for online learnability with a certain ordinal mistake bound. An interesting characterization of the online learning model is shown in terms of the identification in the limit framework. It is established that the classes of languages learnable in the online model with a mistake bound of α are exactly the same as the classes of languages learnable in the limit from both positive and negative data by a Popperian, consistent learner with a mind change bound of α. This result nicely builds a bridge between the two models.
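The bridge result relates online mistake bounds to the mind change bounds of a consistent learner working from both positive and negative data. A toy sketch of such a consistent, identify-in-the-limit learner with a mind change counter; it uses a finite hypothesis class and a natural-number counter for illustration, whereas the paper's bounds are ordinal-valued:

```python
def limit_learner(informant, hypotheses):
    """Consistent learner identifying in the limit from an informant.

    `informant` is a sequence of (string, is_member) pairs giving both
    positive and negative data.  `hypotheses` is a list of candidate
    languages, each a frozenset of strings (a hypothetical finite-class
    representation, not from the paper).  After each datum, the learner
    outputs the index of the first hypothesis consistent with all data
    seen so far, counting each revision as one mind change.
    """
    data = []
    current = None
    mind_changes = 0
    for string, member in informant:
        data.append((string, member))
        for idx, lang in enumerate(hypotheses):
            if all((s in lang) == m for s, m in data):
                break                    # first consistent hypothesis
        else:
            idx = None                   # no hypothesis fits the data
        if idx != current:
            if current is not None:      # the initial guess is free
                mind_changes += 1
            current = idx
    return current, mind_changes
```

Because the learner only abandons a hypothesis when the data refute it, its mind change count plays the same role as the mistake count of an online learner, which is the correspondence the abstract's characterization makes precise.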
On the Learnability of Recursively Enumerable Languages from Good Examples
 Theoret. Comput. Sci
, 1997
Abstract

Cited by 1 (1 self)
The present paper investigates identification of indexed families of recursively enumerable languages from good examples. In the context of class preserving learning from good text examples, it is shown that the notions of finite and limit identification coincide. On the other hand, these two criteria are different in the context of class comprising learning from good text examples. In the context of learning from good informant examples, finite and limit identification criteria differ for both class preserving and class comprising cases. The above results resolve an open question posed by Lange, Nessel and Wiehagen in a similar study about indexed families of recursive languages. 1 Introduction Consider the identification of formal languages from positive data. A machine is fed all the strings and no non-strings of a language L, in any order, one string at a time. The machine, as it receives strings of L, outputs a sequence of grammars. The machine is said to identify L just ...
The Intrinsic Complexity of Learning: A Survey
, 2007
Abstract
The theory of learning in the limit has been a focus of study by several researchers over the last three decades. There have been several suggestions on how to measure the complexity or hardness of learning. In this paper we survey the work done in one specific such measure, called intrinsic complexity of learning. We will be mostly concentrating on learning languages, with only a brief look at function learning.