Results 1 – 8 of 8
A Guided Tour Across the Boundaries of Learning Recursive Languages
 Lecture Notes in Artificial Intelligence
, 1994
Abstract

Cited by 56 (29 self)
The present paper deals with the learnability of indexed families of uniformly recursive languages from positive data as well as from both positive and negative data. We consider the influence of various monotonicity constraints on the learning process, and provide a thorough study concerning the influence of several parameters. In particular, we present examples pointing to typical problems and solutions in the field. Then we provide a unifying framework for learning. Furthermore, we survey results concerning learnability in dependence on the hypothesis space, and concerning order independence. Moreover, new results dealing with the efficiency of learning are provided. First, we investigate the power of iterative learning algorithms. The second measure of efficiency studied is the number of mind changes a learning algorithm is allowed to perform. In this setting we consider the question of whether or not the monotonicity constraints introduced influence the efficiency of learning algo...
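The "mind change" measure counts how often a learner revises its hypothesis before converging. As a minimal illustrative sketch (not taken from the paper), consider the toy indexed family L_i = {0, 1, ..., i}: a conservative learner fed positive examples one at a time only changes its mind when a datum contradicts its current guess.

```python
# Toy illustration of learning in the limit from positive data with
# mind-change counting. The indexed family L_i = {0, 1, ..., i} is a
# hypothetical example chosen for simplicity.

def learn(text):
    """Process a text (sequence of positive examples) one datum at a
    time; record the index of the current hypothesis after each datum.
    The learner is conservative: it changes its mind only when the new
    datum lies outside the currently conjectured language."""
    hypothesis = None
    mind_changes = 0
    guesses = []
    for datum in text:
        if hypothesis is None or datum > hypothesis:
            if hypothesis is not None:
                mind_changes += 1
            hypothesis = datum          # smallest consistent index
        guesses.append(hypothesis)
    return guesses, mind_changes

# On a text for L_3 = {0, 1, 2, 3} the guesses converge to index 3.
guesses, changes = learn([1, 0, 3, 2, 3, 3])
print(guesses, changes)
```

Because the learner outputs the least index consistent with the data, it never overgeneralizes, and on any text for L_i it makes at most i mind changes.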
Learning via Queries in ...
, 1992
Abstract

Cited by 35 (11 self)
We prove that the set of all recursive functions cannot be inferred using first-order queries in the query language containing the extra symbols [+, <]. The proof of this theorem involves a new decidability result about Presburger arithmetic which is of independent interest. Using our machinery, we show that the set of all primitive recursive functions cannot be inferred with a bounded number of mind changes, again using queries in [+, <]. Additionally, we resolve an open question in [7] about passive versus active learning. 1. Introduction. This paper presents new results in the area of query inductive inference (introduced in [7]); in addition, there are results of interest in mathematical logic. Inductive inference is the study of inductive machine learning in a theoretical framework. In query inductive inference, we study the ability of a Query Inference Machine ...
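Query inference replaces passive observation with questions the learner chooses to ask. As a toy analogue only (the paper's queries are first-order formulas over [+, <], not membership questions), a learner can pin down the index of a threshold language L_i = {0, ..., i} by binary-searching with membership queries:

```python
# Hypothetical illustration of active (query) learning: identify the
# index of L_i = {0, ..., i} via membership queries to an oracle.
# This is a simplified stand-in, not the first-order query model
# studied in the paper.

def identify(oracle, upper_bound):
    """Binary-search for the largest element of the target language,
    asking the oracle 'is x in L?' for chosen values of x."""
    lo, hi = 0, upper_bound
    queries = 0
    while lo < hi:
        mid = (lo + hi + 1) // 2
        queries += 1
        if oracle(mid):        # membership query
            lo = mid
        else:
            hi = mid - 1
    return lo, queries

target = 37                     # unknown index the learner must find
index, q = identify(lambda x: x <= target, 1000)
print(index, q)                 # finds 37 in about log2(1000) queries
```

The contrast with the passive setting is that the learner needs only logarithmically many well-chosen questions, whereas a passive learner must wait for the data to arrive.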
Learning Recursive Languages with Bounded Mind Changes
, 1993
Abstract

Cited by 13 (12 self)
In the present paper we study the learnability of enumerable families L of uniformly recursive languages in dependence on the number of allowed mind changes, i.e., with respect to a well-studied measure of efficiency. We distinguish between exact learnability (L has to be inferred w.r.t. L) and class preserving learning (L has to be inferred w.r.t. some suitably chosen enumeration of all the languages from L) as well as between learning from positive data and from both positive and negative data. The measure of efficiency is applied to prove the superiority of class preserving learning algorithms over exact learning. In particular, we considerably improve results obtained previously and establish two infinite hierarchies. Furthermore, we separate exact and class preserving learning from positive data under the requirement that overgeneralization be avoided. Finally, language learning with a bounded number of mind changes is completely characterized in terms of recursively generable finite sets. These characterizat...
Language Learning with a Bounded Number of Mind Changes
, 1993
Abstract

Cited by 7 (1 self)
We study the learnability of enumerable families L of uniformly recursive languages in dependence on the number of allowed mind changes, i.e., with respect to a well-studied measure of efficiency. We distinguish between exact learnability (L has to be inferred w.r.t. L) and class preserving learning (L has to be inferred w.r.t. some suitably chosen enumeration of all the languages from L) as well as between learning from positive data and from both positive and negative data. The measure of efficiency is applied to prove the superiority of class preserving learning algorithms over exact learning. We considerably improve results obtained previously and establish two infinite hierarchies. Furthermore, we separate exact and class preserving learning from positive data under the requirement that overgeneralization be avoided. Finally, language learning with a bounded number of mind changes is completely characterized in terms of recursively generable finite sets. These characterizations offer a new method to handle ...
A Survey of Inductive Inference with an Emphasis on Queries
 Complexity, Logic, and Recursion Theory, number 187 in Lecture notes in Pure and Applied Mathematics Series
, 1997
Abstract

Cited by 4 (0 self)
this paper M_0, M_1, ... is a standard list of all Turing machines, M ...
Trading Monotonicity Demands versus Efficiency
 Bull. Inf. Cybern
, 1995
Abstract

Cited by 3 (1 self)
The present paper deals with the learnability of indexed families L of uniformly recursive languages from positive data. We consider the influence of three monotonicity demands and their dual counterparts on the efficiency of the learning process. The efficiency of learning is measured in dependence on the number of mind changes a learning algorithm is allowed to perform. The three notions of (dual) monotonicity reflect different formalizations of the requirement that the learner has to produce better and better generalizations (specializations, respectively) when fed more and more data on the target concept.
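The monotonicity requirement can be made concrete with a small sketch (a hypothetical example, not from the paper): a learner whose hypotheses only ever grow produces a chain of better and better generalizations, each new conjecture a superset of the last.

```python
# Toy illustration of monotonic learning from positive data: every new
# hypothesis generalizes its predecessor, so the sequence of guesses
# forms an ascending chain. The target class here -- finite languages
# over the naturals -- is a hypothetical example for illustration.

def monotonic_learn(text):
    """After each datum, conjecture the set of all data seen so far.
    Since the conjectured set only ever grows, each hypothesis is a
    superset of the previous one (a chain of generalizations)."""
    hypothesis = set()
    guesses = []
    for datum in text:
        hypothesis = hypothesis | {datum}   # only ever grows
        guesses.append(frozenset(hypothesis))
    return guesses

guesses = monotonic_learn([2, 5, 2, 7])
# every guess contains the previous one
assert all(a <= b for a, b in zip(guesses, guesses[1:]))
```

The dual demands reverse the inclusion: there the learner must produce better and better specializations, shrinking toward the target from above.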
A Guided Tour Across the Boundaries of . . .
Abstract
The present paper deals with the learnability of indexed families of uniformly recursive languages from positive data as well as from both positive and negative data. We consider the influence of various monotonicity constraints on the learning process, and provide a thorough study concerning the influence of several parameters. In particular, we present examples pointing to typical problems and solutions in the field. Then we provide a unifying framework for learning. Furthermore, we survey results concerning learnability in dependence on the hypothesis space, and concerning order independence. Moreover, new results dealing with the efficiency of learning are provided. First, we investigate the power of iterative learning algorithms. The second measure of efficiency studied is the number of mind changes a learning algorithm is allowed to perform. In this setting we consider the question of whether or not the monotonicity constraints introduced influence the efficiency of learning algorithms. The paper mainly aims to provide a comprehensive summary of results recently obtained, and of proof techniques developed. Finally, throughout our guided tour we discuss the question of what a natural language learning algorithm might look like.