Results 1 - 10 of 49
An automata-theoretic approach to linear temporal logic
 Logics for Concurrency: Structure versus Automata, volume 1043 of Lecture Notes in Computer Science
, 1996
Abstract

Cited by 217 (23 self)
The automata-theoretic approach to linear temporal logic uses the theory of automata as a unifying paradigm for program specification, verification, and synthesis. Both programs and specifications are in essence descriptions of computations. These computations can be viewed as words over some alphabet. Thus, programs and specifications can be viewed as descriptions of languages over some alphabet. The automata-theoretic perspective considers the relationships between programs and their specifications as relationships between languages. By translating programs and specifications to automata, questions about programs and their specifications can be reduced to questions about automata. More specifically, questions such as satisfiability of specifications and correctness of programs with respect to their specifications can be reduced to questions such as nonemptiness and containment of automata. Unlike classical automata theory, which focused on automata on finite words, the applications to program specification, verification, and synthesis use automata on infinite words, since the computations in which we are interested are typically infinite. This paper provides an introduction to the theory of automata on infinite words and demonstrates its applications to program specification, verification, and synthesis.
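The reduction from satisfiability to automata nonemptiness can be illustrated with a minimal sketch for automata on finite words (the paper's setting is infinite words, where nonemptiness additionally requires reaching an accepting state that lies on a cycle). All names and the toy automaton below are illustrative, not taken from the paper.

```python
from collections import deque

def nfa_nonempty(init, accepting, delta):
    """Nonemptiness test for an NFA on finite words: the language is
    nonempty iff some accepting state is reachable from an initial
    state.  `delta` maps (state, symbol) -> set of successor states."""
    seen = set(init)
    queue = deque(init)
    while queue:
        q = queue.popleft()
        if q in accepting:
            return True
        for (p, _sym), targets in delta.items():
            if p != q:
                continue
            for t in targets:
                if t not in seen:
                    seen.add(t)
                    queue.append(t)
    return False

# Toy NFA accepting words that contain the factor 'ab'
delta = {
    (0, 'a'): {0, 1},
    (0, 'b'): {0},
    (1, 'b'): {2},
}
print(nfa_nonempty({0}, {2}, delta))  # True: 'ab' is accepted
```

For automata on infinite words (e.g. Büchi automata), the analogous check searches for a reachable accepting state that can reach itself, which is still a pure graph-reachability problem.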
Model Checking of Safety Properties
, 1999
Abstract

Cited by 101 (16 self)
Of special interest in formal verification are safety properties, which assert that the system always stays within some allowed region. Proof rules for the verification of safety properties have been developed in the proof-based approach to verification, making verification of safety properties simpler than verification of general properties. In this paper we consider model checking of safety properties. A computation that violates a general linear property reaches a bad cycle, which witnesses the violation of the property. Accordingly, current methods and tools for model checking of linear properties are based on a search for bad cycles. A symbolic implementation of such a search involves the calculation of a nested fixed-point expression over the system's state space, and is often impossible. Every computation that violates a safety property has a finite prefix along which the property is violated. We use this fact in order to base model checking of safety properties on a search for ...
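A minimal sketch of the idea the abstract ends on: because every violation of a safety property has a finite bad prefix, safety checking can be phrased as plain reachability of a "bad" state, with no cycle detection. The system and predicate below are made-up toys, not from the paper.

```python
from collections import deque

def find_violation(init, successors, is_bad):
    """Search for a reachable state violating a safety property and
    return the finite bad prefix (a path from an initial state), or
    None if the property holds on all reachable states."""
    parent = {s: None for s in init}
    queue = deque(init)
    while queue:
        s = queue.popleft()
        if is_bad(s):
            path = []            # reconstruct the finite bad prefix
            while s is not None:
                path.append(s)
                s = parent[s]
            return path[::-1]
        for t in successors(s):
            if t not in parent:
                parent[t] = s
                queue.append(t)
    return None

# Toy system: a counter that is supposed to stay below 5
succ = lambda n: [n + 1] if n < 10 else []
print(find_violation([0], succ, lambda n: n >= 5))  # [0, 1, 2, 3, 4, 5]
```

A bad-cycle search, as needed for general linear properties, requires a nested fixed-point; the prefix search above is a single least fixed-point, which is the source of the simplification the paper exploits.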
State Complexity of Regular Languages
 Journal of Automata, Languages and Combinatorics
, 2000
Abstract

Cited by 33 (5 self)
State complexity is a descriptive complexity measure for regular languages. We investigate the problems related to the state complexity of regular languages and their operations. In particular, we compare the state complexity results on regular languages with those on finite languages.
Complexity Results for Two-Way and Multi-Pebble Automata and their Logics
 Theoretical Computer Science
, 1996
Abstract

Cited by 32 (0 self)
Two-way and multi-pebble automata are considered (the latter appropriately restricted to accept only regular languages), and enriched with additional features, such as nondeterminism and concurrency. We investigate the succinctness of such machines, and the extent to which this succinctness carries over to make the reasoning problem in propositional dynamic logic (PDL) more difficult. The two main results establish that each additional pebble provides inherent exponential power on both fronts.
1 Introduction
1.1 Background
This paper continues our work in [H], [DH], [HRV], seeking exponential (or higher) discrepancies in the succinctness of finite automata when augmented with various additional mechanisms. It is well-known, for example, that NFAs are exponentially more succinct than DFAs, in the following upper and lower bound senses (see [RS], [MF]): (i) any NFA can be simulated by a DFA with at most an exponential growth in size; (ii) there is a family of regular sets, L_n, for ...
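The exponential NFA-to-DFA gap mentioned here can be reproduced with the classical subset construction. The witness family below (words whose n-th symbol from the end is 'a') is the standard one; the code is an illustrative sketch, not from the paper.

```python
def subset_construction(alphabet, init, delta):
    """Determinize an NFA by the subset construction and return the
    set of reachable DFA states (each a frozenset of NFA states)."""
    start = frozenset(init)
    dfa_states, frontier = {start}, [start]
    while frontier:
        S = frontier.pop()
        for a in alphabet:
            T = frozenset(t for q in S for t in delta.get((q, a), ()))
            if T not in dfa_states:
                dfa_states.add(T)
                frontier.append(T)
    return dfa_states

def nth_from_end_nfa(n):
    """(n+1)-state NFA for: the n-th symbol from the end is 'a'.
    State 0 guesses the position; states 1..n count it down."""
    delta = {(0, 'a'): {0, 1}, (0, 'b'): {0}}
    for i in range(1, n):
        delta[(i, 'a')] = {i + 1}
        delta[(i, 'b')] = {i + 1}
    return delta

n = 6
print(len(subset_construction('ab', {0}, nth_from_end_nfa(n))))  # 2**6 == 64
```

The matching lower bound is the classical one: any DFA for this language must remember the last n symbols read, so 2^n states are also necessary.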
On the Power of Bounded Concurrency I: Finite Automata
 Journal of the ACM
, 1994
Abstract

Cited by 30 (6 self)
We investigate the descriptive succinctness of three fundamental notions for modeling concurrency: nondeterminism and pure parallelism, the two facets of alternation, and bounded cooperative concurrency, whereby a system configuration consists of a bounded number of cooperating states. Our results are couched in the general framework of finite-state automata, but hold for appropriate versions of most concurrent models of computation, such as Petri nets, statecharts, or finite-state versions of concurrent programming languages. We exhibit exhaustive sets of upper and lower bounds on the relative succinctness of these features over Σ* and Σω, establishing that: (1) each of the three features represents an exponential saving in succinctness of the representation, in a manner that is independent of the other two and additive with respect to them.
State Complexity of Basic Operations on Nondeterministic Finite Automata
 In Implementation and Application of Automata (CIAA ’02), LNCS 2608
, 2001
Abstract

Cited by 28 (3 self)
The state complexities of basic operations on nondeterministic finite automata (NFA) are investigated. In particular, we consider Boolean operations, catenation operations (concatenation, iteration, free iteration) and the reversal on NFAs that accept finite and infinite languages over arbitrary alphabets. Most of the shown bounds are tight in the exact number of states, i.e., the number is both sufficient and necessary in the worst case. For the intersection of finite languages and for complementation, tight bounds in the order of magnitude are proved.
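As one example of the constructions behind such bounds, here is a sketch of the standard epsilon-free construction for NFA concatenation, which uses at most m + n states. Function and state names are illustrative, not the paper's.

```python
def accepts(init, delta, accepting, word):
    """Run an NFA on a word by tracking the set of occupied states."""
    cur = set(init)
    for a in word:
        cur = {t for q in cur for t in delta.get((q, a), ())}
    return bool(cur & set(accepting))

def nfa_concat(init1, delta1, acc1, init2, delta2, acc2):
    """Concatenate two NFAs with disjoint state sets, without epsilon
    transitions: every accepting state of the first NFA additionally
    gets the outgoing moves of the second NFA's initial states."""
    delta = {}
    for d in (delta1, delta2):
        for k, v in d.items():
            delta.setdefault(k, set()).update(v)
    for (q, a), targets in delta2.items():
        if q in init2:
            for f in acc1:
                delta.setdefault((f, a), set()).update(targets)
    # handle the empty word on either side
    init = set(init1) | (set(init2) if set(init1) & set(acc1) else set())
    acc = set(acc2) | (set(acc1) if set(init2) & set(acc2) else set())
    return init, delta, acc

# Toy NFAs: {a} followed by {b} should accept exactly 'ab'
init, delta, acc = nfa_concat({1}, {(1, 'a'): {2}}, {2},
                              {3}, {(3, 'b'): {4}}, {4})
print(accepts(init, delta, acc, 'ab'), accepts(init, delta, acc, 'a'))  # True False
```

The construction adds no new states, which is why concatenation is cheap for NFAs, in contrast to its exponential cost on DFAs.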
Experimental Evaluation of Classical Automata Constructions
 In LPAR 2005, LNCS 3835
, 2005
Abstract

Cited by 17 (3 self)
There are several algorithms for producing the canonical DFA from a given NFA. While the theoretical complexities of these algorithms are known, there has not been a systematic empirical comparison between them. In this work we propose a probabilistic framework for testing the performance of automata-theoretic algorithms. We conduct a direct experimental comparison between Hopcroft's and Brzozowski's algorithms. We show that while Hopcroft's algorithm has better overall performance, Brzozowski's algorithm performs better for "high-density" NFAs. We also consider the universality problem, which is traditionally solved explicitly via the subset construction. We propose an encoding that allows this problem to be solved symbolically via a model checker. We compare the performance of this approach to that of the standard explicit algorithm, and show that the explicit approach performs significantly better.
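For concreteness, Brzozowski's algorithm (one of the two compared here) can be sketched in a few lines: reverse the automaton, determinize, reverse again, determinize again; the result is the minimal DFA. This is an illustrative toy implementation, not the paper's instrumented one.

```python
def determinize(alphabet, init, delta, accepting):
    """Subset construction.  Returns (states, start, delta, accepting)
    of the resulting DFA; DFA states are frozensets of NFA states."""
    start = frozenset(init)
    dstates, ddelta, frontier = {start}, {}, [start]
    while frontier:
        S = frontier.pop()
        for a in alphabet:
            T = frozenset(t for q in S for t in delta.get((q, a), ()))
            ddelta[(S, a)] = T
            if T not in dstates:
                dstates.add(T)
                frontier.append(T)
    return dstates, start, ddelta, {S for S in dstates if S & accepting}

def reverse(delta, init, accepting):
    """Reverse an NFA: flip every transition, swap initial/accepting."""
    rdelta = {}
    for (p, a), targets in delta.items():
        for t in targets:
            rdelta.setdefault((t, a), set()).add(p)
    return rdelta, set(accepting), set(init)

def brzozowski(alphabet, init, delta, accepting):
    """Minimize: reverse, determinize, reverse, determinize."""
    d1, i1, a1 = reverse(delta, init, accepting)
    _, s1, dd1, da1 = determinize(alphabet, i1, d1, a1)
    # treat the intermediate DFA as an NFA (singleton target sets)
    d2, i2, a2 = reverse({k: {v} for k, v in dd1.items()}, {s1}, da1)
    return determinize(alphabet, i2, d2, a2)

# Toy NFA for "words over {a,b} ending in 'a'"; the minimal DFA has 2 states
states, start, trans, acc = brzozowski(
    'ab', {0}, {(0, 'a'): {0, 1}, (0, 'b'): {0}}, {1})
print(len(states))  # 2
```

Hopcroft's algorithm instead refines a partition of a DFA's states in O(n log n) time; the double-reversal approach can blow up exponentially in the worst case but, as the paper reports, fares well on dense instances.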
Nondeterministic Descriptional Complexity of Regular Languages
 International Journal of Foundations of Computer Science
, 2002
Abstract

Cited by 12 (2 self)
We investigate the descriptional complexity of operations on finite and infinite regular languages over unary and arbitrary alphabets. The languages are represented by nondeterministic finite automata (NFA). In particular, we consider Boolean operations, catenation operations (concatenation, iteration, free iteration) and the reversal. Most of the shown bounds are tight in the exact number of states, i.e., the number is sufficient and necessary in the worst case. Otherwise tight bounds in the order of magnitude are shown.
Optimal Lower Bounds on Regular Expression Size Using Communication Complexity
 In: Proceedings of FoSSaCS: 273–286, LNCS 4962
, 2008
Abstract

Cited by 10 (7 self)
The problem of converting deterministic finite automata into (short) regular expressions is considered. It is known that the required expression size is 2^Θ(n) in the worst case for infinite languages, and for finite languages it is n^Ω(log log n) and n^O(log n), if the alphabet size grows with the number of states n of the given automaton. A new lower bound method based on communication complexity for regular expression size is developed to show that the required size is indeed n^Θ(log n). For constant alphabet size the best lower bound known to date is Ω(n^2), even when allowing infinite languages and nondeterministic finite automata. As the technique developed here works equally well for deterministic finite automata over binary alphabets, the lower bound is improved to n^Ω(log n).