Results 1-4 of 4
Prediction-Preserving Reducibility with Membership Queries on Formal Languages
Abstract

Cited by 1 (0 self)
Abstract. This paper presents the prediction-preserving reducibility with membership queries (pwm-reducibility) on formal languages, in particular simple CFGs and finite unions of regular pattern languages. For the former, we mainly show that DNF formulas are pwm-reducible to CFGs that are sequential or that contain at most one nonterminal. For the latter, we show that both bounded finite unions of regular pattern languages and unbounded finite unions of substring pattern languages are pwm-reducible to DFAs.
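The reduction of finite unions of substring pattern languages to DFAs rests on the fact that "contains one of the substrings w1, ..., wk" is a regular property. A minimal sketch of that observation (my own illustration with hypothetical helper names, not the paper's construction):

```python
def substring_dfa(patterns):
    """Build a DFA accepting exactly the strings that contain at least one
    of the given patterns as a substring, i.e. a finite union of substring
    pattern languages.  A state records, for each pattern, the length of
    the longest prefix of that pattern matched by a suffix of the input."""
    def step(state, ch):
        new = []
        for plen, pat in zip(state, patterns):
            if plen == len(pat):          # pattern already matched: absorbing
                new.append(plen)
                continue
            s = pat[:plen] + ch
            # KMP-style fallback: longest suffix of s that is a prefix of pat
            while s and not pat.startswith(s):
                s = s[1:]
            new.append(len(s))
        return tuple(new)

    start = tuple(0 for _ in patterns)
    accepting = lambda st: any(l == len(p) for l, p in zip(st, patterns))
    return start, step, accepting

def accepts(dfa, word):
    """Run the DFA over the word and report acceptance."""
    start, step, accepting = dfa
    st = start
    for ch in word:
        st = step(st, ch)
    return accepting(st)
```

Since the state set is a finite product of prefix lengths, this really is a DFA; the union is handled by running the per-pattern automata in parallel.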
A Catalog for Prediction-Preserving Reducibility with Membership Queries on Formal Languages
Abstract
This paper presents several results on prediction-preserving reducibility with membership queries (pwm-reducibility) on formal languages. We mainly deal with two kinds of concept classes: simple CFGs and finite unions of regular pattern languages. For the former, we show that DNF formulas are pwm-reducible to CFGs that are sequential or that contain at most one nonterminal. For the latter, on the other hand, we show that both bounded finite unions of regular pattern languages and unbounded finite unions of substring pattern languages are pwm-reducible to DFAs, while DNF formulas are pwm-reducible to unbounded finite unions of regular pattern languages.
Learning Languages from Positive Data and a Finite Number of Queries
Abstract
A computational model for learning languages in the limit from full positive data and a bounded number of queries to the teacher (oracle) is introduced and explored. Equivalence, superset, and subset queries are considered (for the latter we also consider a variant in which the learner tests every conjecture, but the number of negative answers is uniformly bounded). If the answer is negative, the teacher may provide a counterexample. We consider several types of counterexamples: arbitrary, least counterexamples, counterexamples whose size is bounded by the size of the positive data seen so far, and no counterexamples. A number of hierarchies based on the number of queries (answers) and types of answers/counterexamples is established. Capabilities of learning with different types of queries are compared. In most cases, one or two queries of one type can sometimes do more than any bounded number of queries of another type. Still, surprisingly, a finite number of subset queries is sufficient to simulate the same number of equivalence queries when behaviourally correct learners do not receive counterexamples and may have an unbounded number of errors in almost all conjectures.
The Hardness Results of Actively Predicting Simple Subclasses of Context-Free Grammars
Abstract
In this paper, we present hardness results for actively predicting context-free grammars whose number of nonterminals is exactly one, that are sequential, that are properly sequential, and in which the number of nonterminals appearing in the right-hand side of each production is bounded by some constant.

Keywords: computational learning theory, prediction, grammatical inference, formal language

1 Introduction

The task of predicting the classification of a new example is frequently discussed from the viewpoints of both passive and active settings. In a passive setting, the examples are all chosen independently according to a fixed but unknown probability distribution, and the learner has no control over the selection of examples [5, 6]. In an active setting, on the other hand, the learner is allowed to ask about particular examples; that is, the learner makes membership queries before the new example to predict is given to the learner [1, 3]. Concerned with language learning, we can design...
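The active setting described above can be made concrete with a toy example (my own illustration, not the paper's construction): suppose the target is a substring pattern language L(w) = { u : w is a substring of u } for some unknown w of known length k. An active learner can identify w with membership queries, then predict the classification of the new example x.

```python
from itertools import product

def active_predict(query, alphabet, k, x):
    """Toy active learner.  For a candidate u with |u| = k, the membership
    query answers True exactly when u == w, so enumerating all length-k
    strings identifies the hidden pattern; the learner then predicts
    whether the new example x belongs to L(w)."""
    for cand in map("".join, product(alphabet, repeat=k)):
        if query(cand):               # membership query to the oracle
            w = cand                  # hidden pattern identified
            return w in x            # predict: does x contain w?
    raise ValueError("no length-%d candidate answered positively" % k)
```

A passive learner, by contrast, would have to wait for informative examples to arrive by chance; the hardness results cited above concern concept classes where no such cheap query strategy exists.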