Results 1–10 of 27
The minimum consistent DFA problem cannot be approximated within any polynomial
Journal of the Association for Computing Machinery, 1993
Cited by 99 (4 self)
Abstract. The minimum consistent DFA problem is that of finding a DFA with as few states as possible that is consistent with a given sample (a finite collection of words, each labeled as to whether the DFA found should accept or reject). Assuming that P ≠ NP, it is shown that for any constant k, no polynomial-time algorithm can be guaranteed to find a consistent DFA with fewer than opt^k states, where opt is the number of states in the minimum-state DFA consistent with the sample. This result holds even if the alphabet is of constant size two, and if the algorithm is allowed to produce an NFA, a regular expression, or a regular grammar that is consistent with the sample. A similar non-approximability result is presented for the problem of finding small consistent linear grammars. For the case of finding minimum consistent DFAs when the alphabet is not of constant size but instead is allowed to vary with the problem specification, the slightly …
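The notion of "consistent with a sample" used in the abstract above can be made concrete with a short sketch. All names here are illustrative, not taken from the paper: a DFA is consistent with a labeled sample exactly when it accepts every positively labeled word and rejects every negatively labeled one.

```python
# Hedged sketch of sample consistency for a DFA; names are illustrative.

def dfa_accepts(delta, start, accepting, word):
    """Run the DFA (transition dict, start state, accepting set) on a word."""
    state = start
    for symbol in word:
        state = delta[(state, symbol)]
    return state in accepting

def is_consistent(delta, start, accepting, sample):
    """sample: iterable of (word, label) pairs with label True/False."""
    return all(dfa_accepts(delta, start, accepting, w) == lbl
               for w, lbl in sample)

# Two-state DFA over {0, 1} accepting words with an odd number of 1s.
delta = {(0, "0"): 0, (0, "1"): 1, (1, "0"): 1, (1, "1"): 0}
sample = [("1", True), ("11", False), ("101", False), ("0110", False)]
print(is_consistent(delta, 0, {1}, sample))  # True
```

The hardness result above concerns the converse direction: given only the sample, finding a small automaton for which `is_consistent` returns True.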
On Fault Coverage of Tests for Finite State Specifications
Computer Networks and ISDN Systems, 1996
Cited by 26 (9 self)
Testing is a trade-off between increased confidence in the correctness of the implementation under test and constraints on the amount of time and effort that can be spent in testing. Therefore, the coverage, or adequacy, of the test suite becomes a very important issue. In this paper, we analyze basic ideas underlying the techniques for fault coverage analysis and assurance, mainly developed in the context of protocol conformance testing based on finite state models. Special attention is paid to parameters which determine the testability of a given specification and influence the length of a test suite which guarantees complete fault coverage. We also point out certain issues which need further study.
Minimal Cover-Automata for Finite Languages
Proceedings of the Third International Workshop on Implementing Automata (WIA'98), 1999
Cited by 22 (5 self)
A cover-automaton A of a finite language L ⊆ Σ* is a finite deterministic automaton (DFA) that accepts all words in L and possibly other words that are longer than any word in L. A minimal deterministic finite cover automaton (DFCA) of a finite language L usually has a smaller size than a minimal DFA that accepts L. Thus, cover automata can be used to reduce the size of the representations of finite languages in practice. In this paper, we describe an efficient algorithm that, for a given DFA accepting a finite language, constructs a minimal deterministic finite cover-automaton of the language. We also give algorithms for the boolean operations on deterministic cover automata, i.e., on the finite languages they represent. Key words: Finite languages, Deterministic Finite Automata, Cover Language, Deterministic Cover Automata.
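The cover-automaton condition stated above can be checked directly for small examples: restricted to words no longer than the longest word of L, the automaton must accept exactly L, while longer words may be accepted freely. The following is a brute-force sketch under that definition; all names are illustrative.

```python
# Hedged sketch: brute-force check of the cover-automaton condition.
from itertools import product

def is_cover_automaton(delta, start, accepting, language, alphabet):
    l = max(map(len, language))  # length of the longest word in L
    for n in range(l + 1):
        for tup in product(alphabet, repeat=n):
            w = "".join(tup)
            state = start
            for c in w:
                state = delta[(state, c)]
            # Among words of length <= l, acceptance must match membership.
            if (state in accepting) != (w in language):
                return False
    return True

# A single accepting state looping on "a" covers L = {"", "a", "aa"},
# because every word of length <= 2 over {a} is already in L.
delta = {(0, "a"): 0}
print(is_cover_automaton(delta, 0, {0}, {"", "a", "aa"}, "a"))  # True
```

This illustrates the size gain mentioned in the abstract: a minimal DFA for this L needs extra counting and dead states, while a one-state cover-automaton suffices.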
Learning deterministic finite automata with a smart state labelling evolutionary algorithm
IEEE Transactions on Pattern Analysis and Machine Intelligence, 2005
Cited by 19 (1 self)
Abstract—Learning a Deterministic Finite Automaton (DFA) from a training set of labeled strings is a hard task that has been much studied within the machine learning community. It is equivalent to learning a regular language by example and has applications in language modeling. In this paper, we describe a novel evolutionary method for learning DFA that evolves only the transition matrix and uses a simple deterministic procedure to optimally assign state labels. We compare its performance with the Evidence Driven State Merging (EDSM) algorithm, one of the most powerful known DFA learning algorithms. We present results on random DFA induction problems of varying target size and training set density. We also study the effects of noisy training data on the evolutionary approach and on EDSM. On noise-free data, we find that our evolutionary method outperforms EDSM on small sparse data sets. In the case of noisy training data, we find that our evolutionary method consistently outperforms EDSM, as well as other significant methods submitted to two recent competitions. Index Terms—Grammatical inference, finite state automata, random hill climber, evolutionary algorithm.
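The "optimally assign state labels" step described above can be sketched as a majority vote: once the transition matrix is fixed, each training string ends in some state, and labeling each state by the majority label of the strings ending there maximizes training accuracy. This is my reading of the abstract, not the paper's exact procedure; names are illustrative.

```python
# Hedged sketch of optimal state labelling given a fixed transition matrix.
from collections import defaultdict

def optimal_labels(delta, start, sample):
    """Return the accepting-state set maximizing training accuracy."""
    votes = defaultdict(int)  # state -> (#positive - #negative) endings
    for word, label in sample:
        state = start
        for symbol in word:
            state = delta[(state, symbol)]
        votes[state] += 1 if label else -1
    # Accept in states where positive endings outnumber negative ones.
    return {s for s, v in votes.items() if v > 0}

delta = {(0, "0"): 0, (0, "1"): 1, (1, "0"): 1, (1, "1"): 0}
sample = [("1", True), ("10", True), ("0", False), ("11", False)]
print(optimal_labels(delta, 0, sample))  # {1}
```

Decoupling labels from transitions in this way shrinks the search space the evolutionary algorithm has to explore, which is the design point the abstract emphasizes.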
Automata over Continuous Time
Theoretical Computer Science, 1998
Cited by 19 (1 self)
The principal objective of this paper is to lift basic concepts of classical automata theory from discrete to continuous (real) time. It is argued that the set of finite memory retrospective functions is the set of functions realized by finite state devices. We show that the finite memory retrospective functions are speed-independent, i.e., they are invariant under 'stretchings' of the time axis. Therefore, such functions cannot deal with metrical aspects of the reals.
Efficient Search Techniques for the Inference of Minimum Size Finite Automata
In Proceedings of the 1998 South American Symposium on String Processing and Information Retrieval, Santa Cruz de La Sierra, 1998
Cited by 12 (2 self)
We propose a new algorithm for the inference of the minimum-size deterministic automaton consistent with a pre-specified set of input/output strings. Our approach improves a well-known search algorithm proposed by Biermann by incorporating a set of techniques known as dependency-directed backtracking. These techniques have already been used in other applications, but we are the first to apply them to this problem. The results show that the application of these techniques yields an algorithm that is, for the problems studied, orders of magnitude faster than existing approaches.
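The Biermann-style search that this abstract builds on can be sketched as a constraint problem: assign each prefix of the sample a state in {0..k-1} so that prefixes with conflicting labels get different states, and equal states followed by the same symbol lead to equal states (determinism). The sketch below uses plain chronological backtracking; the paper's contribution, dependency-directed backtracking, is not shown, and all names are illustrative.

```python
# Hedged sketch of Biermann-style state assignment by backtracking search.

def consistent_assignment(sample, k):
    """Assign states in {0..k-1} to all prefixes of the sample, or None."""
    prefixes = sorted({w[:i] for w, _ in sample for i in range(len(w) + 1)},
                      key=len)
    label = {w: l for w, l in sample}  # labels of complete sample words

    def ok(assign):
        for p in assign:
            for q in assign:
                if assign[p] == assign[q]:
                    # (a) no accept/reject conflict on a shared state
                    if p in label and q in label and label[p] != label[q]:
                        return False
                    # (b) determinism: same state, same symbol, same successor
                    for a in "01":
                        if p + a in assign and q + a in assign \
                                and assign[p + a] != assign[q + a]:
                            return False
        return True

    def search(i, assign):
        if i == len(prefixes):
            return dict(assign)
        for s in range(k):
            assign[prefixes[i]] = s
            if ok(assign):
                result = search(i + 1, assign)
                if result:
                    return result
            del assign[prefixes[i]]
        return None

    return search(0, {})

sample = [("", False), ("1", True), ("11", False)]
print(consistent_assignment(sample, 2))
```

Trying k = 1, 2, 3, ... in order yields the minimum-size automaton; dependency-directed backtracking speeds up each search by jumping back to the assignment that actually caused a conflict.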
From Church and Prior to PSL
In O. Grumberg & H. Veith (Eds.), 25 Years of Model Checking, 2008
Cited by 11 (0 self)
Abstract. One of the surprising developments in the area of program verification is how ideas introduced originally by logicians in the 1950s ended up yielding by 2003 an industrial-standard property-specification language called PSL. This development was enabled by the equally unlikely transformation of the mathematical machinery of automata on infinite words, introduced in the early 1960s for second-order arithmetic, into effective algorithms for model-checking tools. This paper attempts to trace the tangled threads of this development.
Pattern discovery in time series, part I: Theory, algorithm, analysis, and convergence
2002
Cited by 10 (1 self)
We present a new algorithm for discovering patterns in time series and other sequential data. We exhibit a reliable procedure for building the minimal set of hidden, Markovian states that is statistically capable of producing the behavior exhibited in the data — the underlying process’s causal states. Unlike conventional methods for fitting hidden Markov models (HMMs) to data, our algorithm makes no assumptions about the process’s causal architecture (the number of hidden states and their transition structure), but rather infers it from the data. It starts with assumptions of minimal structure and introduces complexity only when the data demand it. Moreover, the causal states it infers have important predictive optimality properties that conventional HMM states lack. Here, in Part I, we introduce the algorithm, review the theory behind it, prove its asymptotic reliability, and use large deviation theory to estimate its rate of convergence. In the sequel, Part II, we outline the algorithm’s implementation, illustrate its ability to discover even “difficult” patterns, and compare it to various alternative schemes.
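The causal-state idea summarized above can be illustrated concretely: two histories belong to the same causal state when they predict the same distribution over the next symbol. The sketch below compares empirical next-symbol distributions exactly, whereas the paper does this with statistical tests; all names are illustrative.

```python
# Hedged sketch: group histories by their empirical next-symbol distribution.
from collections import Counter, defaultdict
from fractions import Fraction

def causal_states(series, history_len):
    """Partition length-history_len histories by predictive distribution."""
    nexts = defaultdict(Counter)  # history -> counts of the next symbol
    for i in range(history_len, len(series)):
        nexts[series[i - history_len:i]][series[i]] += 1
    groups = defaultdict(list)    # predictive distribution -> histories
    for h, counts in nexts.items():
        total = sum(counts.values())
        dist = tuple(sorted((s, Fraction(n, total))
                            for s, n in counts.items()))
        groups[dist].append(h)
    return [sorted(g) for g in groups.values()]

# In a strictly alternating series, "0" and "1" predict differently,
# so they fall into two distinct causal states.
print(causal_states("01010101", 1))  # [['0'], ['1']]
```

The paper's algorithm goes further: it grows the history length only when the data statistically demand it, recovering both the number of hidden states and their transition structure.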
From Philosophical to Industrial Logics
Cited by 9 (0 self)
One of the surprising developments in the area of program verification is how ideas introduced by logicians in the early part of the 20th Century ended up yielding, by the 21st Century, industrial-standard property-specification languages. This development was enabled by the equally unlikely transformation of the mathematical machinery of automata on infinite words, introduced in the early 1960s for second-order logic, into effective algorithms for model-checking tools. This paper attempts to trace the tangled threads of this development.
On the Decidability of Continuous Time Specification Formalisms
Journal of Logic and Computation, 1998
Cited by 9 (5 self)
We consider an interpretation of monadic second-order logic of order in the continuous time structure of finitely variable signals and show the decidability of monadic logic in this structure. The expressive power of monadic logic is illustrated by providing a straightforward meaning-preserving translation into monadic logic of three typical continuous time specification formalisms: the Temporal Logic of Reals [2], Restricted Duration Calculus [4], and the propositional fragment of Mean Value Calculus [6]. As a by-product of the decidability of monadic logic, we obtain that the above formalisms are decidable even when extended by quantifiers.