Results 1–10 of 110
Transition Network Grammars for Natural Language Analysis
 Computational Linguistics, D.G. Bobrow, Editor
, 1970
Abstract

Cited by 296 (3 self)
The use of augmented transition network grammars for the analysis of natural language sentences is described. Structure-building actions associated with the arcs of the grammar network allow for the reordering, restructuring, and copying of constituents necessary to produce deep-structure representations of the type normally obtained from a transformational analysis, and conditions on the arcs allow for a powerful selectivity which can rule out meaningless analyses and take advantage of semantic information to guide the parsing. The advantages of this model for natural language analysis are discussed in detail and illustrated by examples. An implementation of an experimental parsing system for transition network grammars is briefly described.
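The flavor of the technique can be conveyed with a toy sketch: networks of states whose arcs consume word categories, push to subnetworks, and run register-setting actions. The mini-notation, grammar, and lexicon below are invented for illustration and are far simpler than Woods's actual formalism.

```python
# A toy ATN sketch. Each network maps a state to a list of arcs:
#   ('cat', word_class, register, next_state)  -- consume a word of that class
#   ('push', subnetwork, register, next_state) -- recurse into a subnetwork
#   ('pop', build_fn)                          -- leave, returning a structure
# All names here are hypothetical, not from Woods's paper.

LEXICON = {'the': 'det', 'dog': 'noun', 'cat': 'noun', 'bit': 'verb'}

NP = {
    'np0': [('cat', 'det', 'det', 'np1')],
    'np1': [('cat', 'noun', 'head', 'np2')],
    'np2': [('pop', lambda regs: ('NP', regs['det'], regs['head']))],
}

S = {
    's0': [('push', NP, 'subj', 's1')],
    's1': [('cat', 'verb', 'verb', 's2')],
    's2': [('push', NP, 'obj', 's3')],
    's3': [('pop', lambda regs: ('S', regs['subj'], regs['verb'], regs['obj']))],
}

def parse(net, state, words, i, regs):
    """Depth-first traversal with backtracking; returns (structure, next_i) or None."""
    for arc in net[state]:
        if arc[0] == 'cat':
            _, wclass, reg, nxt = arc
            if i < len(words) and LEXICON.get(words[i]) == wclass:
                r = parse(net, nxt, words, i + 1, {**regs, reg: words[i]})
                if r:
                    return r
        elif arc[0] == 'push':
            _, sub, reg, nxt = arc
            r = parse(sub, next(iter(sub)), words, i, {})
            if r:
                struct, j = r
                r2 = parse(net, nxt, words, j, {**regs, reg: struct})
                if r2:
                    return r2
        else:  # pop: build and return the register contents as a constituent
            return arc[1](regs), i
    return None
```

The registers are what make this "augmented": the pop actions can reorder or copy constituents when building the returned structure, which plain recursive transition networks cannot do.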
One-Unambiguous Regular Languages
 Information and Computation
, 1997
Abstract

Cited by 125 (9 self)
The ISO standard for the Standard Generalized Markup Language (SGML) provides a syntactic metalanguage for the definition of textual markup systems. In the standard, the right-hand sides of productions are based on regular expressions, although only regular expressions that denote words unambiguously, in the sense of the ISO standard, are allowed. In general, a word that is denoted by a regular expression is witnessed by a sequence of occurrences of symbols in the regular expression that match the word. In an unambiguous regular expression as defined by Book, Even, Greibach, and Ott, each word has at most one witness. But the SGML standard also requires that a witness be computed incrementally from the word with a one-symbol lookahead; we call such regular expressions 1-unambiguous. A regular language is a 1-unambiguous language if it is denoted by some 1-unambiguous regular expression. We give a Kleene theorem for 1-unambiguous languages and characterize 1-unambiguous regu...
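The one-symbol-lookahead condition can be tested mechanically: an expression is 1-unambiguous exactly when no two positions carrying the same symbol compete in its Glushkov first/follow sets. The tuple encoding below is my own sketch of that check, not code from the paper.

```python
import itertools

# Regexes as nested tuples: ('sym', a), ('alt', r, s), ('cat', r, s), ('star', r)

def mark(r, ctr):
    """Give every symbol occurrence a unique position: ('pos', index, symbol)."""
    if r[0] == 'sym':
        return ('pos', next(ctr), r[1])
    if r[0] == 'star':
        return ('star', mark(r[1], ctr))
    return (r[0], mark(r[1], ctr), mark(r[2], ctr))

def nullable(r):
    t = r[0]
    if t == 'pos': return False
    if t == 'star': return True
    if t == 'alt': return nullable(r[1]) or nullable(r[2])
    return nullable(r[1]) and nullable(r[2])          # 'cat'

def first(r):
    t = r[0]
    if t == 'pos': return {(r[1], r[2])}
    if t == 'star': return first(r[1])
    if t == 'alt': return first(r[1]) | first(r[2])
    return first(r[1]) | (first(r[2]) if nullable(r[1]) else set())

def last(r):
    t = r[0]
    if t == 'pos': return {(r[1], r[2])}
    if t == 'star': return last(r[1])
    if t == 'alt': return last(r[1]) | last(r[2])
    return last(r[2]) | (last(r[1]) if nullable(r[2]) else set())

def follow(r, pairs):
    """Accumulate follow sets (position -> positions that may come next)."""
    t = r[0]
    if t == 'pos':
        return
    if t == 'star':
        follow(r[1], pairs)
        links = [(p, q) for p in last(r[1]) for q in first(r[1])]
    elif t == 'cat':
        follow(r[1], pairs); follow(r[2], pairs)
        links = [(p, q) for p in last(r[1]) for q in first(r[2])]
    else:  # alt
        follow(r[1], pairs); follow(r[2], pairs)
        links = []
    for p, q in links:
        pairs.setdefault(p, set()).add(q)

def is_one_unambiguous(r):
    """True iff no first/follow set contains two positions with the same symbol."""
    m = mark(r, itertools.count())
    pairs = {}
    follow(m, pairs)
    for s in [first(m)] + list(pairs.values()):
        syms = [sym for (_, sym) in s]
        if len(syms) != len(set(syms)):
            return False
    return True
```

For example, (a|b)*a fails the test (an initial 'a' could match either occurrence), while the equivalent b*a(b*a)* passes, so the language itself is 1-unambiguous.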
Partial Derivatives of Regular Expressions and Finite Automata Constructions
 Theoretical Computer Science
, 1995
Abstract

Cited by 82 (0 self)
We introduce a notion of a partial derivative of a regular expression. It is a generalization to the nondeterministic case of the known notion of a derivative invented by Brzozowski. We give a constructive definition of partial derivatives, study their properties, and employ them to develop a new algorithm for turning regular expressions into relatively small NFA and to provide certain improvements to Brzozowski's algorithm constructing DFA. We report on a prototype implementation of our algorithm constructing NFA and present some examples.

Introduction. In 1964 Janusz Brzozowski introduced word derivatives of regular expressions and suggested an elegant algorithm turning a regular expression r into a deterministic finite automaton (DFA); the main point of the algorithm is that the word derivatives of r serve as states of the resulting DFA [5]. In the following years derivatives were recognized as a quite useful and productive tool. Conway [8] uses derivatives to present various comp...
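The construction is easy to prototype. In the sketch below (my own encoding, with expressions as nested tuples), the set of partial derivatives of the current expressions serves directly as the state of an NFA-style matcher, mirroring how partial derivatives become NFA states in the paper's algorithm.

```python
# Expressions as tuples:
# ('empty',) = empty set, ('eps',) = empty word, ('sym', a),
# ('alt', r, s), ('cat', r, s), ('star', r)

def nullable(r):
    """Does r accept the empty word?"""
    tag = r[0]
    if tag in ('eps', 'star'):
        return True
    if tag in ('empty', 'sym'):
        return False
    if tag == 'alt':
        return nullable(r[1]) or nullable(r[2])
    return nullable(r[1]) and nullable(r[2])          # 'cat'

def pderiv(r, a):
    """Antimirov partial derivative: a *set* of residual expressions,
    one per nondeterministic choice, instead of Brzozowski's single one."""
    tag = r[0]
    if tag in ('empty', 'eps'):
        return set()
    if tag == 'sym':
        return {('eps',)} if r[1] == a else set()
    if tag == 'alt':
        return pderiv(r[1], a) | pderiv(r[2], a)
    if tag == 'cat':
        out = {r[2] if d == ('eps',) else ('cat', d, r[2])
               for d in pderiv(r[1], a)}
        if nullable(r[1]):
            out |= pderiv(r[2], a)
        return out
    # 'star': derive the body, then loop back to the star
    return {r if d == ('eps',) else ('cat', d, r) for d in pderiv(r[1], a)}

def matches(r, word):
    """NFA-style simulation: track the set of partial derivatives."""
    states = {r}
    for a in word:
        states = set().union(*(pderiv(s, a) for s in states)) if states else set()
    return any(nullable(s) for s in states)
```

Because each partial derivative is a plain subterm-like expression, the set of all of them reachable from r is finite and small, which is what keeps the resulting NFA compact.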
Synthesizing Fast Intrusion Prevention/Detection Systems from High-Level Specifications
 In USENIX Security Symposium
, 1999
Abstract

Cited by 81 (17 self)
To build survivable information systems (i.e., systems that continue to provide their services in spite of coordinated attacks), it is necessary to detect and isolate intrusions before they impact system performance or functionality. Previous research in this area has focused primarily on detecting intrusions after the fact, rather than preventing them in the first place. We have developed a new approach based on specifying intended program behaviors using patterns over sequences of system calls. The patterns can also capture conditions on the values of system-call arguments. At runtime, we intercept the system calls made by processes, compare them against specifications, and disallow (or otherwise modify) those calls that deviate from specifications. Since our approach is capable of modifying a system call before it is delivered to the operating system kernel, it is capable of reacting before any damage-causing system call is executed by a process under attack. We present our specification language and illustrate its use by developing a specification for the ftp server. Observe that in our approach, every system call is intercepted and subject to potentially expensive operations for matching against many patterns that specify normal/abnormal behavior. Thus, minimizing the overheads incurred for pattern-matching is critical for the viability of our approach. We solve this problem by developing a new, low-overhead algorithm for matching runtime behaviors against specifications. A salient feature of our algorithm is that its runtime is almost independent of the number of patterns. In most cases, it uses a constant amount of time per system call intercepted, and uses a constant amount of storage, both independent of either the size or number of patterns. These benefits m...
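The paper's own matching algorithm is more general, but the key property claimed above (per-call cost independent of the number of patterns) can be illustrated with a classic multi-pattern matcher over system-call names: an Aho-Corasick automaton. The pattern set and call trace below are invented for the sketch.

```python
from collections import deque

def build_matcher(patterns):
    """Compile abnormal system-call sequences (tuples of call names) into one
    Aho-Corasick automaton; per-call matching cost is then amortized constant,
    no matter how many patterns were compiled in."""
    goto, fail, out = [{}], [0], [set()]
    for pat in patterns:                      # build the trie of patterns
        s = 0
        for call in pat:
            if call not in goto[s]:
                goto.append({}); fail.append(0); out.append(set())
                goto[s][call] = len(goto) - 1
            s = goto[s][call]
        out[s].add(pat)
    q = deque(goto[0].values())               # BFS to compute failure links
    while q:
        s = q.popleft()
        for call, t in goto[s].items():
            q.append(t)
            f = fail[s]
            while f and call not in goto[f]:
                f = fail[f]
            fail[t] = goto[f].get(call, 0)
            out[t] |= out[fail[t]]            # inherit matches via suffixes
    return goto, fail, out

def scan(matcher, calls):
    """Feed an intercepted call trace through the automaton; report
    (index, pattern) for every abnormal sequence that completes."""
    goto, fail, out = matcher
    s, hits = 0, []
    for i, call in enumerate(calls):
        while s and call not in goto[s]:
            s = fail[s]
        s = goto[s].get(call, 0)
        for pat in out[s]:
            hits.append((i, pat))
    return hits
```

A real interposition layer would run `scan` incrementally, one intercepted call at a time, and veto the call when a hit fires rather than collecting hits after the fact.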
Models of Computation: Exploring the Power of Computing
Abstract

Cited by 76 (6 self)
Theoretical computer science treats any computational subject for which a good model can be created. Research on formal models of computation was initiated in the 1930s and 1940s by Turing, Post, Kleene, Church, and others. In the 1950s and 1960s programming languages, language translators, and operating systems were under development and therefore became both the subject and basis for a great deal of theoretical work. The power of computers of this period was limited by slow processors and small amounts of memory, and thus theories (models, algorithms, and analysis) were developed to explore the efficient use of computers as well as the inherent complexity of problems. The former subject is known today as algorithms and data structures, the latter computational complexity. The focus of theoretical computer scientists in the 1960s on languages is reflected in the first textbook on the subject, Formal Languages and Their Relation to Automata by John Hopcroft and Jeffrey Ullman. This influential book led to the creation of many language-centered theoretical computer science courses; many introductory theory courses today continue to reflect the content of this book and the interests of theoreticians of the 1960s and early 1970s. Although
Timed Regular Expressions
 Journal of the ACM
, 2001
Abstract

Cited by 62 (20 self)
In this paper we define timed regular expressions, a formalism for specifying discrete behaviors augmented with timing information, and prove that its expressive power is equivalent to the timed automata of Alur and Dill. This result is the timed analogue of Kleene's theorem and, similarly to that result, the hard part in the proof is the translation from automata to expressions. This result is extended from finite to infinite (in the sense of Büchi) behaviors. In addition to these fundamental results, we give a clean algebraic framework for two commonly accepted formalisms for timed behaviors: time-event sequences and piecewise-constant signals.
A Kleene Theorem for Timed Automata
, 1997
Abstract

Cited by 62 (4 self)
In this paper we define timed regular expressions, an extension of regular expressions for specifying sets of dense-time discrete-valued signals. We show that this formalism is equivalent in expressive power to the timed automata of Alur and Dill by providing a translation procedure from expressions to automata and vice versa. The result is extended to ω-regular expressions (Büchi's theorem).

1. Introduction. Timed automata, i.e. automata equipped with clocks [AD94], have been studied extensively in recent years as they provide a rigorous model for reasoning about the quantitative temporal aspects of systems. Together with real-time logics and process algebras they constitute the underlying theoretical basis for the specification and verification of real-time systems. Kleene's theorem [K56], stating that the regular (or rational) subsets of Σ* are exactly the recognizable ones (those accepted by finite automata), is one of the cornerstones of automata theory. No such theorem has ...
Interprocedural Transformations for Parallel Code Generation
 IN PROCEEDINGS OF SUPERCOMPUTING '91
, 1991
Abstract

Cited by 41 (14 self)
We present a new approach that enables compiler optimization of procedure calls and loop nests containing procedure calls. We introduce two interprocedural transformations that move loops across procedure boundaries, exposing them to traditional optimizations on loop nests. These transformations are incorporated into a code generation algorithm for a shared-memory multiprocessor. The code generator relies on a machine model to estimate the expected benefits of loop parallelization and parallelism-enhancing transformations. Several transformation strategies are explored and one that minimizes total execution time is selected. Efficient support of this strategy is provided by an existing interprocedural compilation system. We demonstrate the potential of these techniques by applying this code generation strategy to two scientific application programs.
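The general shape of such a transformation, moving a loop from a caller into its callee so the whole nest is visible at one site, can be pictured in source form. The before/after below is an invented illustration of the idea, not code or terminology taken from the paper.

```python
# Before: the caller's loop contains a call, so the compiler sees only
# a one-dimensional loop nest on either side of the procedure boundary.

def scale(row, factor):               # callee processes one row at a time
    for j in range(len(row)):
        row[j] *= factor

def caller_before(matrix, factor):
    for i in range(len(matrix)):      # loop with an opaque call inside
        scale(matrix[i], factor)

# After moving the i-loop across the boundary: the callee now holds the
# full two-deep nest, a candidate for interchange or parallel execution,
# and the per-iteration call overhead is gone.

def scale_all(matrix, factor):
    for i in range(len(matrix)):
        for j in range(len(matrix[i])):
            matrix[i][j] *= factor

def caller_after(matrix, factor):
    scale_all(matrix, factor)
```

Both versions compute the same result; the payoff is purely in what the optimizer can now see and restructure.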
Efficient Regular Expression Evaluation: Theory to Practice
Abstract

Cited by 30 (2 self)
Several algorithms and techniques have been proposed recently to accelerate regular expression matching and enable deep packet inspection at line rate. This work aims to provide a comprehensive practical evaluation of existing techniques, extending them and analyzing their compatibility. The study focuses on two hardware architectures: memory-based ASICs and FPGAs.
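The trade-off behind memory-based designs can be sketched with a plain subset construction: once the pattern set is compiled into a DFA table, the inner matching loop costs exactly one table lookup per input symbol, regardless of pattern complexity. The tiny hand-built NFA below (roughly "contains ab" over the alphabet {a, b}) is an invented example; real engines compile the table from the rule set.

```python
# Hand-built NFA: state -> {symbol: set of next states}
NFA = {
    0: {'a': {0, 1}, 'b': {0}},   # state 0 loops on anything, guesses 'a'
    1: {'b': {2}},                # then requires 'b'
}
ACCEPT = {2}
ALPHABET = ['a', 'b']

def determinize(nfa, start, accept, alphabet):
    """Subset construction: each DFA state is a frozenset of NFA states."""
    start_set = frozenset([start])
    ids = {start_set: 0}          # subset -> dense DFA state id
    table, final = {}, set()
    work = [start_set]
    while work:
        s = work.pop()
        sid = ids[s]
        if s & accept:
            final.add(sid)
        row = {}
        for sym in alphabet:
            t = frozenset(q for st in s for q in nfa.get(st, {}).get(sym, ()))
            if t not in ids:
                ids[t] = len(ids)
                work.append(t)
            row[sym] = ids[t]
        table[sid] = row
    return table, final

def match(table, final, text):
    """Report whether any pattern completes while scanning text."""
    s = 0
    for ch in text:
        s = table[s][ch]          # the whole inner loop: one lookup per symbol
        if s in final:
            return True
    return False
```

The cost of this speed is the memory footprint of the table, which is exactly the dimension the surveyed ASIC- and FPGA-oriented techniques attack.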