### The Hardness Results of Actively Predicting Simple Subclasses of Context-Free Grammars

Abstract

In this paper, we present hardness results for actively predicting context-free grammars in which the number of nonterminals is exactly one, that are sequential, that are properly sequential, and in which the number of nonterminals appearing in the right-hand side of each production is bounded by some constant.

Keywords: computational learning theory, prediction, grammatical inference, formal languages

1 Introduction

The task of predicting the classification of a new example is frequently discussed from the viewpoints of both passive and active settings. In the passive setting, the examples are all drawn independently according to a fixed but unknown probability distribution, and the learner has no control over the selection of examples [5, 6]. In the active setting, on the other hand, the learner is allowed to ask about particular examples, that is, to make membership queries, before the new example to be predicted is given to the learner [1, 3]. Concerned with language learning, we can design...
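To make the active setting concrete, here is a minimal Python sketch of a learner that makes membership queries before predicting. Everything in it is illustrative, not from the paper: the target language (strings over {a, b} with equal numbers of a's and b's, generated by the hypothetical one-nonterminal grammar S -> a S b S | b S a S | epsilon), the choice of queries, and the toy count-difference hypothesis are all assumptions chosen for brevity.

```python
from itertools import product


def oracle(w: str) -> bool:
    """Membership oracle for the illustrative target language:
    strings with equal numbers of a's and b's (a one-nonterminal CFG)."""
    return w.count("a") == w.count("b")


def active_learner(oracle, max_len: int = 4):
    """Query the oracle on all strings up to max_len (the learner's
    chosen membership queries), then return a predictor for new examples.

    The hypothesis class here is a deliberate toy: membership is assumed
    to depend only on the feature #a(w) - #b(w), and the learner keeps
    the set of feature values the oracle labelled positive.
    """
    sample = {}
    for n in range(max_len + 1):
        for tup in product("ab", repeat=n):
            w = "".join(tup)
            sample[w] = oracle(w)  # one membership query per string

    positive_diffs = {w.count("a") - w.count("b")
                      for w, label in sample.items() if label}

    def predict(w: str) -> bool:
        # Predict the new example without any further queries.
        return (w.count("a") - w.count("b")) in positive_diffs

    return predict
```

The point of the sketch is the protocol, not the hypothesis: the learner actively selects which strings to query, and only afterwards receives the new example it must classify.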