Results 1–10 of 20
Proving the correctness of reactive systems using sized types
, 1996
Abstract

Cited by 148 (3 self)
We have designed and implemented a type-based analysis for proving some basic properties of reactive systems. The analysis manipulates rich type expressions that contain information about the sizes of recursively defined data structures. Sized types are useful for detecting deadlocks, nontermination, and other errors in embedded programs. To establish the soundness of the analysis we have developed an appropriate semantic model of sized types.

1 Embedded Functional Programs. In a reactive system, the control software must continuously react to inputs from the environment. We distinguish a class of systems where the embedded programs can be naturally expressed as functional programs manipulating streams. This class of programs appears to be large enough for many purposes [2] and is the core of more expressive formalisms that accommodate asynchronous events, nondeterminism, etc. The fundamental criterion for the correctness of programs embedded in reactive systems is liveness. Indeed, before considering the properties of the output, we must ensure that there is some output in the first place: the program must continuously react to the input streams by producing elements on the output streams. This latter property may fail in various ways:
- the computation of a stream element may depend on itself, creating a "black hole," or
- the computation of one of the output streams may demand elements from some input stream at different rates, which requires unbounded buffering, or
- the computation of a stream element may exhaust the physical resources of the machine or even diverge.
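To make the liveness failures above concrete, here is a minimal sketch in Python (an illustration only; the paper's setting is a stream-processing functional language, not Python). It contrasts a productive stream, whose every element is guarded by a constructor, with a "black hole," whose first element depends on itself.

```python
# Streams modeled as (head, thunk-of-tail) pairs; laziness via zero-argument callables.

def take(n, stream):
    """Force the first n elements of a lazy stream."""
    out = []
    for _ in range(n):
        head, tail = stream
        out.append(head)
        stream = tail()
    return out

# Productive: a constructor is produced before each recursive step.
def ones():
    return (1, ones)

# "Black hole": computing the first element recurses on the stream itself,
# with no constructor in between, so no element is ever produced.
def black_hole():
    head, _ = black_hole()  # self-dependent: infinite regress
    return (head + 1, black_hole)

print(take(3, ones()))  # [1, 1, 1]
# take(1, black_hole()) raises RecursionError: the stream is not live.
```

A sized-type analysis in the paper's spirit would accept `ones` (output size grows with each step) and reject `black_hole` statically.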
The Power of Vacillation in Language Learning
, 1992
Abstract

Cited by 46 (13 self)
Some extensions are considered of Gold's influential model of language learning by machine from positive data. Studied are criteria of successful learning featuring convergence in the limit to vacillation between several alternative correct grammars. The main theorem of this paper is that there are classes of languages that can be learned if convergence in the limit to up to (n+1) exactly correct grammars is allowed, but which cannot be learned if convergence in the limit is to no more than n grammars, where the no more than n grammars can each make finitely many mistakes. This contrasts sharply with results of Barzdin and Podnieks and, later, Case and Smith, for learnability from both positive and negative data. A subset principle from a 1980 paper of Angluin is extended to the vacillatory and other criteria of this paper. This principle provides a necessary condition for circumventing overgeneralization in learning from positive data. It is applied to prove another theorem to the eff...
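For readers unfamiliar with Gold's model, the baseline notion of "convergence in the limit" (which the vacillatory criteria above relax) can be sketched with a toy example. The class and learner below are illustrative assumptions, not from the paper: the class is {L_n} with L_n = {0, ..., n}, and the learner conjectures the bound n after each datum.

```python
# Gold-style learning from positive data: the learner emits a conjecture
# (here, just the bound n, standing in for a grammar) after each datum,
# and must eventually stabilize on a single correct conjecture.

def learner(text):
    """Yield a conjecture after each element of the positive presentation."""
    bound = 0
    for datum in text:
        bound = max(bound, datum)
        yield bound

# A text (positive presentation) for L_3 = {0, 1, 2, 3}:
text = [1, 0, 3, 2, 3, 3, 3]
print(list(learner(text)))  # [1, 1, 3, 3, 3, 3, 3] — converges to 3
```

Vacillatory criteria weaken this: instead of settling on one grammar, the learner may forever vacillate among up to (n+1) correct grammars, and the paper shows this strictly increases learning power.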
Infinitary Self Reference in Learning Theory
, 1994
Abstract

Cited by 19 (6 self)
Kleene's Second Recursion Theorem provides a means for transforming any program p into a program e(p) which first creates a quiescent self copy and then runs p on that self copy together with any externally given input. e(p), in effect, has complete (low-level) self knowledge, and p represents how e(p) uses its self knowledge (and its knowledge of the external world). Infinite regress is not required, since e(p) creates its self copy outside of itself. One mechanism to achieve this creation is a self-replication trick isomorphic to that employed by single-celled organisms. Another is for e(p) to look in a mirror to see which program it is. In 1974 the author published an infinitary generalization of Kleene's theorem which he called the Operator Recursion Theorem. It provides a means for obtaining an (algorithmically) growing collection of programs which, in effect, share a common (also growing) mirror from which they can obtain complete low-level models of themselves and the other prog...
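The "self-replication trick" mentioned above can be sketched concretely (Python stands in for an acceptable programming system; the framing as e(p) is an illustrative assumption). A template applied to its own quotation reconstructs the complete program text without infinite regress; p then computes over that self copy plus external input.

```python
# p inspects the self copy handed to it; here it just reports the length of
# the program text together with the external input.
def p(self_copy, x):
    return (len(self_copy), x)

# Self-replication trick: a template applied to its own quotation yields the
# full program text. No infinite regress: the copy is built outside itself.
template = "template = {!r}\nprogram = template.format(template)"
program = template.format(template)

# Executing 'program' rebuilds exactly its own text, confirming the self copy:
scope = {}
exec(program, scope)
print(scope["program"] == program)  # True
print(p(program, 42))
```

The equality check shows the constructed text is a fixed point of its own execution, which is the essence of Kleene's construction.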
Synthesizing Enumeration Techniques For Language Learning
 In Proceedings of the Ninth Annual Conference on Computational Learning Theory
, 1996
Abstract

Cited by 16 (7 self)
In this paper we assume, without loss of generality, that for all σ ⊆ τ, [M(σ) ≠ ?] ⇒ [M(τ) ≠ ?].
Synthesizing Learners Tolerating Computable Noisy Data
 In Proc. 9th International Workshop on Algorithmic Learning Theory, Lecture
, 1998
Abstract

Cited by 6 (0 self)
An index for an r.e. class of languages (by definition) generates a sequence of grammars defining the class. An index for an indexed family of languages (by definition) generates a sequence of decision procedures defining the family. F. Stephan's model of noisy data is employed, in which, roughly, correct data crops up infinitely often, and incorrect data only finitely often. In a completely computable universe, all data sequences, even noisy ones, are computable. New to the present paper is the restriction that noisy data sequences be, nonetheless, computable! Studied, then, is the synthesis from indices for r.e. classes and for indexed families of languages of various kinds of noise-tolerant language-learners for the corresponding classes or families indexed, where the noisy input data sequences are restricted to being computable. Many positive results, as well as some negative results, are presented regarding the existence of such synthesizers. The main positive result is surpris...
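Stephan's noise model described above (correct data infinitely often, incorrect data only finitely often) suggests a simple limiting strategy, sketched below as an illustration (the threshold filter is an assumption for this toy, not the paper's construction): in the limit, only the occurrence counts of true members keep growing, so noise can be filtered by frequency.

```python
from collections import Counter

def noise_filtering_learner(noisy_text, threshold=2):
    """After each datum, conjecture the set of items seen more than
    'threshold' times; in the limit, noise (finitely repeated) drops out."""
    counts = Counter()
    for datum in noisy_text:
        counts[datum] += 1
        yield {x for x, c in counts.items() if c > threshold}

# L = {0, 1}; the element 9 is noise (appears only once).
text = [0, 9, 1, 0, 1, 0, 1, 0, 1]
print(list(noise_filtering_learner(text))[-1])  # {0, 1}
```

On any infinite noisy text, any fixed threshold is eventually exceeded by every true member and never by the finitely-occurring noise, so the conjectures stabilize on the target set.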
String extension learning using lattices
 In Henning Fernau, Adrian-Horia Dediu, and Carlos Martín-Vide, editors, Proceedings of the 4th International Conference on Language and Automata Theory and Applications (LATA 2010), volume 6031 of Lecture Notes in Computer Science
, 2010
Abstract

Cited by 3 (0 self)
Abstract. The class of regular languages is not identifiable from positive data in Gold's language learning model. Many attempts have been made to define interesting classes that are learnable in this model, preferably with the associated learner having certain advantageous properties. Heinz '09 presents a set of language classes called String Extension (Learning) Classes, and shows it to have several desirable properties. In the present paper, we extend the notion of String Extension Classes by basing it on lattices and formally establish further useful properties resulting from this extension. Using lattices enables us to cover a larger range of language classes, including the pattern languages, as well as to give various ways of characterizing String Extension Classes and their learners. We believe this paper shows that String Extension Classes are learnable in a very natural way, and thus worthy of further study.
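The general string extension scheme the abstract builds on can be sketched briefly (the bigram instance below is an illustrative assumption): a function f maps each string to a finite set of "factors," a grammar is a finite set of factors, and the learner simply unions f over the observed data.

```python
def f(word):
    """Factor function: the set of bigrams of the word, with boundaries '#'."""
    padded = "#" + word + "#"
    return {padded[i:i + 2] for i in range(len(padded) - 1)}

def string_extension_learner(text):
    grammar = set()
    for word in text:
        grammar |= f(word)  # monotone union: each datum can only add factors
        yield grammar

def generates(grammar, word):
    """A word is in the language iff all of its factors are in the grammar."""
    return f(word) <= grammar

final = list(string_extension_learner(["ab", "abb"]))[-1]
print(generates(final, "abbb"))  # True: every bigram of 'abbb' was observed
print(generates(final, "ba"))    # False: the bigram 'ba' never occurred
```

Grammars here are finite subsets of a fixed set ordered by inclusion — a lattice — which is exactly the structure the paper generalizes (joins replace unions), extending coverage to classes such as the pattern languages.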
Machine induction without revolutionary changes in hypothesis size
 Information and Computation
, 1996
Abstract

Cited by 3 (2 self)
This paper provides a beginning study of the effects on inductive inference of paradigm shifts whose absence is approximately modeled by various formal approaches to forbidding large changes in the size of programs conjectured. One approach, called severely parsimonious, requires all the programs conjectured on the way to success to be nearly (i.e., within a recursive function of) minimal size. It is shown that this very conservative constraint allows learning infinite classes of functions, but not infinite r.e. classes of functions. Another approach, called nonrevolutionary, requires all conjectures to be nearly the same size as one another. This quite conservative constraint is, nonetheless, shown to permit learning some infinite r.e. classes of functions. Allowing up to one extra bounded size mind change towards a final program learned certainly doesn’t appear revolutionary. However, somewhat surprisingly for scientific (inductive) inference, it is shown that there are classes learnable with the nonrevolutionary constraint (respectively, with severe parsimony), up to (i + 1) mind changes, and no anomalies, which classes cannot be learned with no size constraint, an unbounded, finite number of anomalies in the final program, but with no more than i mind changes. Hence, in some cases, the possibility of one extra mind change is considerably more liberating than removal of very conservative size shift constraints. The proofs of these results are also combinatorially interesting.
Learning via Finitely Many Queries
Abstract
This work introduces a new query inference model that can access data and communicate with a teacher by asking finitely many boolean queries in a language L. In this model the parameters of interest are the number of queries used and the expressive power of L. We study how the learning power varies with these parameters. Preliminary results suggest that this model can help in studying query inference in a resource-bounded environment.
On a Quantitative Notion of Uniformity (Extended Abstract)
 In Mathematical Foundations of Computer Science, volume 969 of LNCS
, 1995
Abstract
Appeared in: MFCS'95, LNCS 969, pp. 169–178, Springer-Verlag, 1995. Susanne Kaufmann, Martin Kummer, Institut für Logik, Komplexität und Deduktionssysteme, Universität Karlsruhe, D-76128 Karlsruhe, Germany. {kaufmann, kummer}@ira.uka.de

1 Introduction. Recent work on "Bounded Query Classes" in complexity theory and computability theory (see [5] for a survey) has sparked renewed interest in quantitative aspects of computability theory. A central result in this field is the Nonspeedup Theorem [1, Theorem 9], which states that if any 2^n parallel queries to a set A can be computed with n sequential queries to some oracle, then A must be recursive. More formally, for A ⊆ ω (ω is the set of all natural numbers) and fixed n ≥ 1, let C^A_n(x_1, ..., x_n) = (A(x_1), ..., A(x_n)) denote the n-fold characteristic function of A. If there is an oracle B ⊆ ω such that C^A_{2^n} can be computed with n sequential queries to B, then A is recursive. (The bound on the number of...
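The objects in the Nonspeedup Theorem are easy to illustrate (the finite sample set below is an assumption for illustration only): C^A_n answers n parallel membership queries to A at once, while n adaptive yes/no queries can distinguish at most 2^n outcomes — which is why computing C^A_{2^n} from only n sequential queries is such a strong hypothesis.

```python
def c_n(A, xs):
    """C^A_n(x_1, ..., x_n) = (A(x_1), ..., A(x_n)), treating A as its
    characteristic function, for a (here decidable, finite) sample set A."""
    return tuple(int(x in A) for x in xs)

evens = {0, 2, 4, 6, 8}
print(c_n(evens, (1, 2, 3, 4)))  # (0, 1, 0, 1)

# n sequential yes/no queries can yield at most 2**n distinct answer patterns,
# yet C^A_{2^n} has 2**(2**n) possible values in general:
n = 3
print(2 ** n)  # 8 answer patterns from 3 sequential queries
```

The counting gap between 2^n query patterns and the values of C^A_{2^n} is the combinatorial heart of the theorem: only a recursive A can close it.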