Results 1–10 of 78
One-Unambiguous Regular Languages
Information and Computation, 1997
Abstract

Cited by 101 (9 self)
The ISO standard for the Standard Generalized Markup Language (SGML) provides a syntactic metalanguage for the definition of textual markup systems. In the standard, the right-hand sides of productions are based on regular expressions, although only regular expressions that denote words unambiguously, in the sense of the ISO standard, are allowed. In general, a word that is denoted by a regular expression is witnessed by a sequence of occurrences of symbols in the regular expression that match the word. In an unambiguous regular expression as defined by Book, Even, Greibach, and Ott, each word has at most one witness. But the SGML standard also requires that a witness be computed incrementally from the word with a one-symbol lookahead; we call such regular expressions 1-unambiguous. A regular language is a 1-unambiguous language if it is denoted by some 1-unambiguous regular expression. We give a Kleene theorem for 1-unambiguous languages and characterize 1-unambiguous regu...
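A practical consequence of this line of work (Brüggemann-Klein and Wood): an expression is 1-unambiguous exactly when its Glushkov position automaton is deterministic. A minimal Python sketch of that test follows; the tuple-based AST encoding and explicit position numbers are illustrative choices, not notation from the paper.

```python
def analyze(node):
    """Compute (nullable, first, last, follow) for a regex AST.
    Nodes: ('sym', symbol, position), ('alt', l, r), ('cat', l, r), ('star', e).
    first/last hold (symbol, position) pairs; follow maps position -> set of pairs."""
    kind = node[0]
    if kind == 'sym':
        _, a, p = node
        return False, {(a, p)}, {(a, p)}, {}
    if kind == 'alt':
        n1, f1, l1, fo1 = analyze(node[1])
        n2, f2, l2, fo2 = analyze(node[2])
        return n1 or n2, f1 | f2, l1 | l2, merged(fo1, fo2)
    if kind == 'cat':
        n1, f1, l1, fo1 = analyze(node[1])
        n2, f2, l2, fo2 = analyze(node[2])
        fo = merged(fo1, fo2)
        for _, p in l1:                      # last of left chains to first of right
            fo.setdefault(p, set()).update(f2)
        return n1 and n2, f1 | f2 if n1 else f1, l1 | l2 if n2 else l2, fo
    # 'star': last positions loop back to first
    n, f, l, fo = analyze(node[1])
    for _, p in l:
        fo.setdefault(p, set()).update(f)
    return True, f, l, fo

def merged(a, b):
    out = {p: set(s) for p, s in a.items()}
    for p, s in b.items():
        out.setdefault(p, set()).update(s)
    return out

def one_unambiguous(ast):
    """1-unambiguous iff the Glushkov automaton is deterministic:
    no transition set offers the same symbol at two different positions."""
    _, first, _, follow = analyze(ast)
    for succ in [first] + list(follow.values()):
        seen = {}
        for sym, pos in succ:
            if seen.setdefault(sym, pos) != pos:
                return False
    return True

# (a|b)*a is not 1-unambiguous: after reading 'a' the lookahead cannot
# tell position 1 from position 3.  a*b is 1-unambiguous.
not_det = ('cat', ('star', ('alt', ('sym', 'a', 1), ('sym', 'b', 2))), ('sym', 'a', 3))
det = ('cat', ('star', ('sym', 'a', 1)), ('sym', 'b', 2))
```

Note that `not_det` denotes a 1-unambiguous *language* (it is also denoted by a deterministic expression), which is exactly the distinction between expressions and languages the abstract draws.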
Synthesizing Fast Intrusion Prevention/Detection Systems from High-Level Specifications
In USENIX Security Symposium, 1999
Abstract

Cited by 72 (16 self)
To build survivable information systems (i.e., systems that continue to provide their services in spite of coordinated attacks), it is necessary to detect and isolate intrusions before they impact system performance or functionality. Previous research in this area has focused primarily on detecting intrusions after the fact, rather than preventing them in the first place. We have developed a new approach based on specifying intended program behaviors using patterns over sequences of system calls. The patterns can also capture conditions on the values of system-call arguments. At runtime, we intercept the system calls made by processes, compare them against specifications, and disallow (or otherwise modify) those calls that deviate from specifications. Since our approach is capable of modifying a system call before it is delivered to the operating system kernel, it is capable of reacting before any damage-causing system call is executed by a process under attack. We present our specification language and illustrate its use by developing a specification for the ftp server. Observe that in our approach, every system call is intercepted and subject to potentially expensive operations for matching against many patterns that specify normal/abnormal behavior. Thus, minimizing the overheads incurred for pattern-matching is critical for the viability of our approach. We solve this problem by developing a new, low-overhead algorithm for matching runtime behaviors against specifications. A salient feature of our algorithm is that its runtime is almost independent of the number of patterns. In most cases, it uses a constant amount of time per system call intercepted, and uses a constant amount of storage, both independent of either the size or number of patterns. These benefits m...
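The claim that per-call matching cost can be made nearly independent of the number of patterns can be illustrated with a standard multi-pattern automaton (Aho-Corasick). This is a simplified stand-in for the paper's algorithm, which also handles argument conditions; the syscall names and pattern shapes below are hypothetical.

```python
from collections import deque

def build_matcher(patterns):
    """Compile many syscall-sequence patterns into one automaton
    (Aho-Corasick style), so that matching each intercepted call takes
    amortized O(1) time regardless of how many patterns were compiled."""
    goto, fail, out = [{}], [0], [set()]
    for pat in patterns:                      # build the shared trie
        s = 0
        for sym in pat:
            if sym not in goto[s]:
                goto.append({}); fail.append(0); out.append(set())
                goto[s][sym] = len(goto) - 1
            s = goto[s][sym]
        out[s].add(tuple(pat))
    q = deque(goto[0].values())               # BFS to fill failure links
    while q:
        s = q.popleft()
        for sym, t in goto[s].items():
            q.append(t)
            f = fail[s]
            while f and sym not in goto[f]:
                f = fail[f]
            cand = goto[f].get(sym, 0)
            fail[t] = cand if cand != t else 0
            out[t] |= out[fail[t]]
    return goto, fail, out

def feed(state, syscall, goto, fail, out):
    """Advance the automaton on one intercepted call; return new state
    and the set of patterns completed at this point."""
    while state and syscall not in goto[state]:
        state = fail[state]
    state = goto[state].get(syscall, 0)
    return state, out[state]

# Hypothetical patterns over a hypothetical syscall stream:
goto, fail, out = build_matcher([["open", "write"], ["execve"]])
state, hits = 0, []
for call in ["read", "open", "write", "execve"]:
    state, found = feed(state, call, goto, fail, out)
    hits.extend(sorted(found))
```

The per-event work depends only on the automaton's branching, not on how many patterns were merged into it, which is the property the abstract emphasizes.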
Partial Derivatives of Regular Expressions and Finite Automata Constructions
Theoretical Computer Science, 1995
Abstract

Cited by 59 (0 self)
We introduce a notion of a partial derivative of a regular expression. It is a generalization to the nondeterministic case of the known notion of a derivative invented by Brzozowski. We give a constructive definition of partial derivatives, study their properties, and employ them to develop a new algorithm for turning regular expressions into relatively small NFAs and to provide certain improvements to Brzozowski's algorithm constructing DFAs. We report on a prototype implementation of our algorithm constructing NFAs and present some examples.

Introduction

In 1964 Janusz Brzozowski introduced word derivatives of regular expressions and suggested an elegant algorithm turning a regular expression r into a deterministic finite automaton (DFA); the main point of the algorithm is that the word derivatives of r serve as states of the resulting DFA [5]. In the following years derivatives were recognized as a quite useful and productive tool. Conway [8] uses derivatives to present various comp...
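The construction sketches easily: where a Brzozowski derivative maps an expression and a symbol to one expression, a partial derivative maps them to a *set* of expressions whose languages union to the left quotient, and iterating from the start expression yields the states of a small NFA. A minimal Python sketch (the tuple-based AST encoding is an illustrative choice):

```python
def nullable(e):
    """Does L(e) contain the empty word?"""
    k = e[0]
    if k in ('eps', 'star'):
        return True
    if k in ('sym', 'empty'):
        return False
    if k == 'alt':
        return nullable(e[1]) or nullable(e[2])
    return nullable(e[1]) and nullable(e[2])            # 'cat'

def pderiv(e, a):
    """Antimirov partial derivative: a set of expressions whose
    languages union to the left quotient of L(e) by symbol a."""
    k = e[0]
    if k in ('empty', 'eps'):
        return set()
    if k == 'sym':
        return {('eps',)} if e[1] == a else set()
    if k == 'alt':
        return pderiv(e[1], a) | pderiv(e[2], a)
    if k == 'cat':
        d = {('cat', f, e[2]) for f in pderiv(e[1], a)}
        return d | pderiv(e[2], a) if nullable(e[1]) else d
    return {('cat', f, e) for f in pderiv(e[1], a)}     # 'star'

def matches(e, word):
    """Simulate the partial-derivative NFA on a word."""
    states = {e}
    for a in word:
        states = set().union(*(pderiv(s, a) for s in states)) if states else states
    return any(nullable(s) for s in states)

# Example: a b*  (one 'a' followed by any number of 'b')
e_ab = ('cat', ('sym', 'a'), ('star', ('sym', 'b')))
```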
Models of Computation: Exploring the Power of Computing
Abstract

Cited by 57 (7 self)
Theoretical computer science treats any computational subject for which a good model can be created. Research on formal models of computation was initiated in the 1930s and 1940s by Turing, Post, Kleene, Church, and others. In the 1950s and 1960s programming languages, language translators, and operating systems were under development and therefore became both the subject and basis for a great deal of theoretical work. The power of computers of this period was limited by slow processors and small amounts of memory, and thus theories (models, algorithms, and analysis) were developed to explore the efficient use of computers as well as the inherent complexity of problems. The former subject is known today as algorithms and data structures, the latter as computational complexity. The focus of theoretical computer scientists in the 1960s on languages is reflected in the first textbook on the subject, Formal Languages and Their Relation to Automata by John Hopcroft and Jeffrey Ullman. This influential book led to the creation of many language-centered theoretical computer science courses; many introductory theory courses today continue to reflect the content of this book and the interests of theoreticians of the 1960s and early 1970s. Although...
A Kleene Theorem for Timed Automata
1997
Abstract

Cited by 55 (2 self)
In this paper we define timed regular expressions, an extension of regular expressions for specifying sets of dense-time discrete-valued signals. We show that this formalism is equivalent in expressive power to the timed automata of Alur and Dill by providing a translation procedure from expressions to automata and vice versa. The result is extended to ω-regular expressions (Büchi's theorem).

1. Introduction

Timed automata, i.e. automata equipped with clocks [AD94], have been studied extensively in recent years as they provide a rigorous model for reasoning about the quantitative temporal aspects of systems. Together with real-time logics and process algebras they constitute the underlying theoretical basis for the specification and verification of real-time systems. Kleene's theorem [K56], stating that the regular (or rational) subsets of Σ* are exactly the recognizable ones (those accepted by finite automata), is one of the cornerstones of automata theory. No such theorem has ...
Timed Regular Expressions
Journal of the ACM, 2001
Abstract

Cited by 48 (15 self)
In this paper we define timed regular expressions, a formalism for specifying discrete behaviors augmented with timing information, and prove that its expressive power is equivalent to the timed automata of Alur and Dill. This result is the timed analogue of Kleene's theorem and, similarly to that result, the hard part in the proof is the translation from automata to expressions. This result is extended from finite to infinite (in the sense of Büchi) behaviors. In addition to these fundamental results, we give a clean algebraic framework for two commonly accepted formalisms for timed behaviors: time-event sequences and piecewise-constant signals.
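As recalled from this formalism (the exact operator set should be verified against the paper), the expressions extend the classical regular operations with a duration-restriction operator, and intersection and renaming are what make the exact equivalence with timed automata go through:

```latex
% Timed regular expressions (sketch from memory; verify against the paper)
\varphi ::= a                            % an atomic event/signal value
       \mid \varphi \cdot \varphi        % concatenation
       \mid \varphi \vee \varphi         % union
       \mid \varphi \wedge \varphi       % intersection
       \mid \varphi^{*}                  % Kleene star
       \mid \langle \varphi \rangle_{I}  % restrict total duration to interval I
```

The duration-restriction operator is what injects quantitative timing into an otherwise classical regular syntax.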
Interprocedural Transformations for Parallel Code Generation
In Proceedings of Supercomputing '91, 1991
Abstract

Cited by 41 (14 self)
We present a new approach that enables compiler optimization of procedure calls and loop nests containing procedure calls. We introduce two interprocedural transformations that move loops across procedure boundaries, exposing them to traditional optimizations on loop nests. These transformations are incorporated into a code generation algorithm for a shared-memory multiprocessor. The code generator relies on a machine model to estimate the expected benefits of loop parallelization and parallelism-enhancing transformations. Several transformation strategies are explored and one that minimizes total execution time is selected. Efficient support of this strategy is provided by an existing interprocedural compilation system. We demonstrate the potential of these techniques by applying this code generation strategy to two scientific application programs.
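The idea of moving a loop across a procedure boundary can be illustrated abstractly. The Python functions below are purely illustrative stand-ins (the paper's transformations operate on Fortran loop nests inside a compiler): before the transformation, the loop in the caller invokes the procedure once per iteration, hiding the loop nest; afterward, the loop lives in the callee as a single nest a parallelizer can analyze.

```python
# Before: the loop calls the procedure per iteration, so no single
# scope contains the whole loop nest.
def update(a, i):
    a[i] = a[i] * 2 + 1

def caller_before(a):
    for i in range(len(a)):
        update(a, i)

# After moving the loop into the callee: the full loop is visible in
# one scope, exposing it to loop-level optimization (e.g. running
# independent iterations in parallel).
def update_all(a):
    for i in range(len(a)):
        a[i] = a[i] * 2 + 1

def caller_after(a):
    update_all(a)

# Both versions compute the same result:
before = [1, 2, 3]
after = [1, 2, 3]
caller_before(before)
caller_after(after)
```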
From Regular Expressions to DFA's Using Compressed NFA's
Theoretical Computer Science, 1992
"... To my parents and uncle Frank ..."
Standard Generalized Markup Language: Mathematical and Philosophical Issues
Computer Science Today: Recent Trends and Developments, 1995
Abstract

Cited by 22 (2 self)
The Standard Generalized Markup Language (SGML), an ISO standard, has become the accepted method of defining markup conventions for text files. SGML is a metalanguage for defining grammars for textual markup in much the same way that Backus-Naur Form is a metalanguage for defining programming-language grammars. Indeed, HTML, the method of marking up hypertext documents for the World Wide Web, is an SGML grammar. The underlying assumptions of the SGML initiative are that a logical structure of a document can be identified and that it can be indicated by the insertion of labeled matching brackets (start and end tags). Moreover, it is assumed that the nesting relationships of these tags can be described with an extended context-free grammar (the right-hand sides of productions are regular expressions). In this survey of some of the issues raised by the SGML initiative, I reexamine the underlying assumptions and address some of the theoretical questions that SGML raises...
Canonical derivatives, partial derivatives and finite automaton constructions
Theoretical Computer Science
Abstract

Cited by 22 (3 self)
Let E be a regular expression. Our aim is to establish a theoretical relation between two well-known automata recognizing the language of E, namely the position automaton P_E constructed by Glushkov or McNaughton and Yamada, and the equation automaton E_E constructed by Mirkin or Antimirov. We define the notion of c-derivative (for canonical derivative) of a regular expression E and show that if E is linear then two Brzozowski derivatives of E are aci-similar if and only if the corresponding c-derivatives are identical. This allows us to represent the Berry-Sethi set of continuations of a position by a unique c-derivative, called the c-continuation of the position. Hence the definition of C_E, the c-continuation automaton of E, whose states are pairs made of a position of E and the associated c-continuation. If states are viewed as positions, C_E is isomorphic to P_E. On the other hand, a partial derivative, as defined by Antimirov, is a class of c-derivatives for some equivalence relation, so C_E reduces to E_E. Finally, C_E makes it possible to go from P_E to E_E, while this cannot be achieved directly (from the state graphs). These theoretical results lead to an O(|E|^2) space and time algorithm to compute the equation automaton, where |E| is the size of the expression. This is the complexity of the most efficient constructions yielding the position automaton, while the size of the equation automaton is not greater and generally much smaller than the size of the position automaton.
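The final size claim can be observed concretely: the position automaton always has one state per symbol occurrence plus an initial state, while the equation automaton's states are the reachable partial derivatives, of which there may be fewer. A self-contained Python sketch (illustrative AST encoding, using Antimirov's partial derivatives for the equation automaton's states):

```python
def nullable(e):
    k = e[0]
    if k in ('eps', 'star'):
        return True
    if k == 'sym':
        return False
    if k == 'alt':
        return nullable(e[1]) or nullable(e[2])
    return nullable(e[1]) and nullable(e[2])            # 'cat'

def pderiv(e, a):
    """Antimirov partial derivative of e by symbol a (a set of expressions)."""
    k = e[0]
    if k == 'eps':
        return set()
    if k == 'sym':
        return {('eps',)} if e[1] == a else set()
    if k == 'alt':
        return pderiv(e[1], a) | pderiv(e[2], a)
    if k == 'cat':
        d = {('cat', f, e[2]) for f in pderiv(e[1], a)}
        return d | pderiv(e[2], a) if nullable(e[1]) else d
    return {('cat', f, e) for f in pderiv(e[1], a)}     # 'star'

def equation_states(e, alphabet):
    """Reachable partial derivatives = states of the equation automaton."""
    seen, todo = {e}, [e]
    while todo:
        s = todo.pop()
        for a in alphabet:
            for t in pderiv(s, a):
                if t not in seen:
                    seen.add(t)
                    todo.append(t)
    return seen

def n_positions(e):
    """Symbol occurrences; the position automaton has this many states + 1."""
    return (e[0] == 'sym') + sum(n_positions(c) for c in e[1:] if isinstance(c, tuple))

# For (a|b)*a the equation automaton has 3 states, the position automaton 4.
E = ('cat', ('star', ('alt', ('sym', 'a'), ('sym', 'b'))), ('sym', 'a'))
```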