Results 1–10 of 18
The Power of Vacillation in Language Learning
, 1992
Abstract

Cited by 44 (11 self)
Some extensions are considered of Gold's influential model of language learning by machine from positive data. Studied are criteria of successful learning featuring convergence in the limit to vacillation between several alternative correct grammars. The main theorem of this paper is that there are classes of languages that can be learned if convergence in the limit to up to (n+1) exactly correct grammars is allowed, but which cannot be learned if convergence in the limit is to no more than n grammars, where each of the no more than n grammars may make finitely many mistakes. This contrasts sharply with results of Barzdin and Podnieks and, later, Case and Smith, for learnability from both positive and negative data. A subset principle from a 1980 paper of Angluin is extended to the vacillatory and other criteria of this paper. This principle provides a necessary condition for circumventing overgeneralization in learning from positive data. It is applied to prove another theorem to the eff...
On the Intrinsic Complexity of Learning
 Information and Computation
, 1995
Abstract

Cited by 25 (6 self)
A new view of learning is presented. The basis of this view is a natural notion of reduction. We prove completeness and relative difficulty results. An infinite hierarchy of intrinsically more and more difficult to learn concepts is presented. Our results indicate that the complexity notion captured by our new notion of reduction differs dramatically from the traditional studies of the complexity of the algorithms performing learning tasks.

1 Introduction

Traditional studies of inductive inference have focused on illuminating various strata of learnability based on varying the definition of learnability. The research following Valiant's PAC model [Val84] and Angluin's teacher/learner model [Ang88] paid very careful attention to calculating the complexity of the learning algorithm. We present a new view of learning, based on the notion of reduction, that captures a different perspective on learning complexity than all prior studies. Based on our preliminary reports, Jain...
Large Limits to Software Estimation
, 2001
Abstract

Cited by 7 (0 self)
Algorithmic (KCS) complexity results can be interpreted as indicating some limits to software estimation. While these limits are abstract, they nevertheless contradict enthusiastic claims occasionally made by commercial software estimation advocates. Specifically, if it is accepted that algorithmic complexity is an appropriate definition of the complexity of a programming project, then claims of purely objective estimation of project complexity, development time, and programmer productivity are necessarily incorrect.
On approximating real-world halting problems
 Reischuk (Eds.), Proc. FCT 2005, in: Lecture Notes Comput. Sci
, 2005
Abstract

Cited by 6 (0 self)
Abstract. No algorithm can, of course, solve the Halting Problem, that is, decide within finite time, always correctly, whether a given program halts on a certain given input. It might, however, be able to give correct answers for 'most' instances and thus solve it at least approximately. Whether and how well such approximations are feasible depends highly on the underlying encodings and in particular on the Gödelization (programming system), which in practice usually arises from some programming language. We consider BrainF*ck (BF), a simple yet Turing-complete real-world programming language over an eight-letter alphabet, and prove that the natural enumeration of its syntactically correct source codes induces a Gödelization that is both efficient and dense in the sense of [Jakoby&Schindelhauer'99]. It follows that any algorithm M approximating the Halting Problem for BF errs on at least a constant fraction ε_M > 0 of all instances of size n for infinitely many n. Next we improve this result by showing that, in every dense Gödelization, this constant lower bound ε can be chosen independently of M; while, on the other hand, the Halting Problem does admit approximation up to an arbitrary fraction δ > 0 by an appropriate algorithm M_δ handling instances of size n for infinitely many n. The last two results complement work by [Lynch'74].
On the intrinsic complexity of learning recursive functions
 In Proceedings of the Twelfth Annual Conference on Computational Learning Theory
, 1999
Abstract

Cited by 5 (1 self)
The intrinsic complexity of learning compares the difficulty of learning classes of objects by using some reducibility notion. For several types of learning recursive functions, both natural complete classes are exhibited and necessary and sufficient conditions for completeness are derived. Informally, a class is complete iff its topological structure is highly complex while its algorithmic structure is easy. Some self-describing classes turn out to be complete. Furthermore, the structure of the intrinsic complexity is shown to be much richer than the structure of the mind change complexity, though in general, intrinsic complexity and mind change complexity can behave “orthogonally”.
On the Role of Search for Learning from Examples
 Journal of Experimental and Theoretical Artificial Intelligence
Abstract

Cited by 4 (0 self)
Gold [Gol67] discovered a fundamental enumeration technique, the so-called identification-by-enumeration, a simple but powerful class of algorithms for learning from examples (inductive inference). We introduce a variety of more sophisticated (and more powerful) enumeration techniques and characterize their power. We conclude with the thesis that enumeration techniques are even universal in that each solvable learning problem in inductive inference can be solved by an adequate enumeration technique. This thesis is technically motivated and discussed. Keywords: Learning from examples, learning by search, identification by enumeration, enumeration techniques.

1 Introduction

The role of search, for learning from examples, is examined in a theoretical setting. Gold's seminal paper [Gol67] on inductive inference introduced a simple but powerful learning technique which became known as identification-by-enumeration. Identification-by-enumeration begins with an infi...
Classifying Predicates and Languages
, 1997
Abstract

Cited by 3 (1 self)
The present paper studies a particular collection of classification problems, namely the classification of recursive predicates and languages, in order to arrive at a deeper understanding of what classification really is. In particular, the classification of predicates and languages is compared with the classification of arbitrary recursive functions and with their learnability. The investigation is refined by introducing classification within a resource bound, resulting in a new hierarchy. Furthermore, a formalization of multi-classification is presented and completely characterized in terms of standard classification. Additionally, consistent classification is introduced and compared with both resource-bounded classification and standard classification. Finally, the classification of families of languages that have attracted attention in learning theory is studied as well.
On Duality in Learning and the Selection of Learning Teams
 Information and Computation
, 1996
Abstract

Cited by 3 (2 self)
Previous work in inductive inference dealt mostly with finding one or several machines (IIMs) that successfully learn a collection of functions. Herein we start with a class of functions and consider the learner set of all IIMs that are successful at learning the given class. Applying this perspective to the case of team inference leads to the notion of diversification for a class of functions. This enables us to distinguish between several flavors of IIMs, all of which must be represented in a team learning the given class.

1 Introduction

All current theoretical approaches to machine learning tend to focus on a particular machine or a collection of machines and then find the class of concepts which can be learned by these machines under certain constraints defining a criterion of successful learning [AS83, OSW86]. In this paper we investigate the dual problem: given some set of concepts, which algorithms can learn all those concepts? From [AGS89] we know that in the theory ...
Measure, Category and Learning Theory
Abstract

Cited by 2 (1 self)
Measure and category (or rather, their recursion-theoretic counterparts) have been used in Theoretical Computer Science to make precise the intuitive notion "for most of the recursive sets." We use the notions of effective measure and category to discuss the relative sizes of inferrible sets, and their complements. We find that inferrible sets become large rather quickly in the standard hierarchies of learnability. On the other hand, the complements of the learnable sets are all large.

1 Introduction

Determining the relative size of denumerable sets, and those with cardinality ℵ₁, led mathematicians to develop the notions of measure and category [Oxt71]. Described in t...
The computational power of compiling C++
 BULLETIN OF THE EUROPEAN ASSOCIATION FOR THEORETICAL COMPUTER SCIENCE
, 2003
Abstract

Cited by 2 (0 self)
Using a C++ compiler, any partial recursive function can be computed at compile time. We show this by using the C++ template mechanism to define functions via primitive recursion, composition, and µ-recursion.