Results 1–10 of 13
Efficient Type Inference for Higher-Order Binding-Time Analysis
In Functional Programming and Computer Architecture, 1991
Abstract

Cited by 91 (4 self)
Binding-time analysis determines when variables and expressions in a program can be bound to their values, distinguishing between early (compile-time) and late (run-time) binding. Binding-time information can be used by compilers to produce more efficient target programs by partially evaluating programs at compile time. Binding-time analysis has been formulated in abstract interpretation contexts and more recently in a type-theoretic setting. In a type-theoretic setting binding-time analysis is a type inference problem: the problem of inferring a completion of a λ-term e with binding-time annotations such that e satisfies the typing rules. Nielson and Nielson and Schmidt have shown that every simply typed λ-term has a unique completion ê that minimizes late binding in TML, a monomorphic type system with explicit binding-time annotations, and they present exponential-time algorithms for computing such minimal completions. Gomard proves the same results for a variant of his two-level λ-calculus without a so-called “lifting” rule. He presents another algorithm for inferring completions in this somewhat restricted type system and states that it can be implemented in time O(n³). He conjectures that the completions computed are minimal.
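The early/late distinction described above can be made concrete with a small sketch (not taken from the paper; the `power` example and all names here are hypothetical illustrations): when one input is early (compile-time) bound, a partial evaluator can execute the computation that depends only on it, leaving a residual program over the late-bound input.

```python
def power(base, exp):
    """Both arguments late (run-time) bound: nothing can be precomputed."""
    result = 1
    for _ in range(exp):
        result *= base
    return result

def specialize_power(exp):
    """Here exp is early (compile-time) bound: the loop over it is static,
    so a partial evaluator can unroll it, leaving a residual program in
    which only the multiplications by the late-bound base remain."""
    body = " * ".join(["base"] * exp) if exp > 0 else "1"
    return eval("lambda base: " + body)  # the residual program

cube = specialize_power(3)  # residual: lambda base: base * base * base
assert cube(5) == power(5, 3) == 125
```

A binding-time analysis in the type-theoretic style annotates each subexpression as static or dynamic so that a partial evaluator knows which parts it may execute in this way.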
Type inference and semi-unification
In Proceedings of the ACM Conference on LISP and Functional Programming (LFP), Snowbird, 1988
Abstract

Cited by 25 (6 self)
In the last ten years declaration-free programming languages with a polymorphic typing discipline (ML, B) have been developed to approximate the flexibility and conciseness of dynamically typed languages (LISP, SETL) while retaining the safety and execution efficiency of conventional statically typed languages (Algol 68, Pascal). These polymorphic languages can be type checked at compile time, yet allow functions whose arguments range over a variety of types. We investigate several polymorphic type systems, the most powerful of which, termed the Milner-Mycroft Calculus, extends the so-called let-polymorphism found in, e.g., ML with a polymorphic typing rule for recursive definitions. We show that semi-unification, the problem of solving inequalities over first-order terms, characterizes type checking in the Milner-Mycroft Calculus up to polynomial time, even in the restricted case where nested definitions are disallowed. This permits us to extend some infeasibility results for related combinatorial problems to type inference and to correct several claims and statements in the literature. We prove the existence of unique most general solutions of term inequalities, called most general semi-unifiers, and present an algorithm for computing them that terminates for all known inputs due to a novel “extended occurs check”. We conjecture this algorithm to be
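Semi-unification generalises ordinary first-order unification from equations to inequalities between terms. As background for the occurs-check discussion in the abstract, here is a minimal sketch of plain unification with the classic occurs check (an illustration only, not the paper's semi-unification algorithm or its extended occurs check; the term encoding is my own).

```python
def walk(t, subst):
    """Follow variable bindings to a representative term."""
    while isinstance(t, str) and t in subst:
        t = subst[t]
    return t

def occurs(v, t, subst):
    """Does variable v occur in term t under subst?"""
    t = walk(t, subst)
    if t == v:
        return True
    return not isinstance(t, str) and any(occurs(v, a, subst) for a in t[1:])

def bind(v, t, subst):
    if occurs(v, t, subst):   # the occurs check rejects cyclic
        return None           # constraints such as x = f(x)
    return {**subst, v: t}

def unify(t1, t2, subst=None):
    """First-order unification. Terms: str = variable, tuple = (functor, *args).
    Returns a most general unifier as a dict, or None on failure."""
    if subst is None:
        subst = {}
    t1, t2 = walk(t1, subst), walk(t2, subst)
    if t1 == t2:
        return subst
    if isinstance(t1, str):
        return bind(t1, t2, subst)
    if isinstance(t2, str):
        return bind(t2, t1, subst)
    if t1[0] != t2[0] or len(t1) != len(t2):
        return None  # functor clash
    for a, b in zip(t1[1:], t2[1:]):
        subst = unify(a, b, subst)
        if subst is None:
            return None
    return subst
```

Semi-unification asks instead for a substitution under which each left-hand side can be further instantiated into its right-hand side, which is why a plain occurs check no longer suffices for termination.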
A Study of Semantics, Types, and Languages for Databases and Object-Oriented Programming, 1989
Abstract

Cited by 8 (0 self)
The purpose of this thesis is to investigate a type system for databases and object-oriented programming and to design a statically typed programming language for these applications. Such a language should ideally have a static type system that supports:
• polymorphism and static type inference,
• rich data structures and operations to represent various data models for databases, including the relational model and more recent complex object models,
• central features of object-oriented programming, including user-definable class hierarchies, multiple inheritance, and data abstraction,
• the notion of extents and object identities for object-oriented databases.
Without a proper formalism, it is not obvious that the construction of such a type system is possible. This thesis attempts to construct one such formalism and proposes a programming language that uniformly integrates all of the above features. The specific contributions of this thesis include:
• a simple semantics for ML polymorphism and an axiomatization of the equational theory of ML,
• a uniform generalization of the relational model to arbitrary complex database objects that
Antimirov and Mosses’s Rewrite System Revisited, 2008
Abstract

Cited by 5 (2 self)
Antimirov and Mosses proposed a rewrite system for deciding the equivalence of two (extended) regular expressions. In this paper we present a functional approach to that method, prove its correctness, and give some experimental comparative results. Besides an improved version of Antimirov and Mosses’s algorithm, we present a version using partial derivatives. Our preliminary results lead to the conclusion that, indeed, these methods are feasible and, generally, faster than the classical methods.
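Derivative-based equivalence checking of the kind this abstract describes can be sketched roughly as follows (an illustration in the spirit of, but not taken from, the paper: it uses Brzozowski derivatives with ACI-normalising constructors rather than Antimirov's partial derivatives, and the term encoding is my own).

```python
# Regular expressions as tuples: ("0",) = empty language, ("1",) = empty
# word, ("c", x) = literal x, ("+", r, s), (".", r, s), ("*", r).

def nullable(r):
    op = r[0]
    if op in ("1", "*"): return True
    if op in ("0", "c"): return False
    if op == "+": return nullable(r[1]) or nullable(r[2])
    return nullable(r[1]) and nullable(r[2])  # op == "."

def plus(r, s):
    """Smart sum constructor: flatten, drop 0, dedupe, sort (ACI)."""
    terms, stack = set(), [r, s]
    while stack:
        t = stack.pop()
        if t[0] == "+": stack += [t[1], t[2]]
        elif t != ("0",): terms.add(t)
    if not terms: return ("0",)
    out, *rest = sorted(terms)
    for t in rest: out = ("+", out, t)
    return out

def cat(r, s):
    if ("0",) in (r, s): return ("0",)
    if r == ("1",): return s
    if s == ("1",): return r
    return (".", r, s)

def deriv(r, a):
    """Brzozowski derivative of r with respect to symbol a."""
    op = r[0]
    if op in ("0", "1"): return ("0",)
    if op == "c": return ("1",) if r[1] == a else ("0",)
    if op == "+": return plus(deriv(r[1], a), deriv(r[2], a))
    if op == "*": return cat(deriv(r[1], a), r)
    d = cat(deriv(r[1], a), r[2])  # op == "."
    return plus(d, deriv(r[2], a)) if nullable(r[1]) else d

def equivalent(r, s, alphabet):
    """L(r) == L(s): explore pairs of derivatives; the ACI-normalising
    constructors keep the set of reachable derivatives finite."""
    seen, todo = set(), [(r, s)]
    while todo:
        p = todo.pop()
        if p in seen: continue
        seen.add(p)
        u, v = p
        if nullable(u) != nullable(v): return False
        todo += [(deriv(u, a), deriv(v, a)) for a in alphabet]
    return True
```

For example, `equivalent` recognises that a*a* and a* denote the same language while a* and (aa)* do not; Antimirov's partial derivatives refine this scheme by producing sets of terms instead of sums.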
Checking NFA equivalence with bisimulations up to congruence
Abstract

Cited by 5 (0 self)
Abstract—We introduce bisimulation up to congruence as a technique for proving language equivalence of nondeterministic finite automata. Exploiting this technique, we devise an optimisation of the classical algorithm by Hopcroft and Karp [12] that, instead of computing the whole determinised automaton, explores only a small portion of it. Although the optimised algorithm remains exponential in the worst case (the problem is PSPACE-complete), experimental results show improvements of several orders of magnitude over the standard algorithm.
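The baseline that the paper improves on can be sketched as on-the-fly determinisation with a naive pairwise check (an illustration only: real Hopcroft-Karp merges checked pairs with a union-find, and the paper's up-to-congruence refinement is omitted here; the encoding is my own).

```python
from collections import deque

def nfa_equiv(starts1, starts2, delta, accept, alphabet):
    """Language equivalence of two NFAs sharing a transition relation
    delta: dict (state, symbol) -> set of states. Determinises on the
    fly and checks that paired subset states agree on acceptance."""
    def step(S, a):
        # subset construction, one symbol at a time
        return frozenset(q for s in S for q in delta.get((s, a), ()))
    seen = set()
    todo = deque([(frozenset(starts1), frozenset(starts2))])
    while todo:
        S1, S2 = todo.popleft()
        if (S1, S2) in seen:
            continue
        seen.add((S1, S2))
        if bool(S1 & accept) != bool(S2 & accept):
            return False  # one subset state accepts, the other does not
        for a in alphabet:
            todo.append((step(S1, a), step(S2, a)))
    return True
```

The point of the paper is that `seen` need not be a plain set of pairs: checking new pairs only up to the congruence generated by already-visited ones lets the search skip most of the determinised automaton.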
Managing intrusion detection rule sets
In Proceedings of EUROSEC, 2010
Abstract

Cited by 1 (0 self)
The prevalent use of the signature-based approach in modern intrusion detection systems (IDS) emphasizes the importance of efficient management of the employed signature sets. With the constant discovery of new threats and vulnerabilities, the complexity and size of signature sets reach the point where manual management of rules becomes a challenging (if not impossible) task for system administrators. While automated support for signature management is desirable, the main difficulty that arises in this context is the diversity in syntactical representations of signatures generally allowed in IDS. In this paper, we focus on an automated approach to signature management. Specifically, we propose a model for signature analysis that brings out the semantic inconsistencies in IDS rule sets. To address the syntactical diversity of the signatures, we use the strengths of nondeterministic finite automata (NFA) and model the individual rules as finite-state machines to analyze their equivalence. The effectiveness of the proposed approach is evaluated on two collections of attack signatures: the rule sets of the open-source Snort IDS and Bleeding Edge Threats.
Finite Automata (NFAs) and Deterministic Finite Automata (DFAs), 2006
Abstract
This project will focus on finite automata, including both Nondeterministic Finite Automata (NFAs) and Deterministic Finite Automata (DFAs). It is
Backtracking
Abstract
Contents
1 Introduction 3
2 Models of computation 6
3 The Set Union Problem 9
4 The Worst-Case Time Complexity of a Single Operation 15
5 The Set Union Problem with Deunions 18
6 Split and the Set Union Problem on Intervals 22
7 The Set Union Problem with Unlimited Backtracking 26

1 Introduction
An equivalence relation R on a finite set S is a binary relation that is reflexive, symmetric, and transitive. That is, for s, t, and u in S, we have that sRs; if sRt then tRs; and if sRt and tRu then sRu. The set S is partitioned by R into equivalence classes, where each class contains all and only the elements that obey R pairwise. Many computational problems involve representing, modifying, and tracking the evolution of equivalence
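The deunion operation listed in the contents above can be illustrated with a minimal union-find that keeps an undo stack (a sketch of the general idea, not the specific data structures analysed in the survey; path compression is deliberately omitted because it complicates rollback).

```python
class BacktrackingUnionFind:
    """Union-find with union by rank and a history stack, so that the
    most recent union can be undone (a 'deunion')."""
    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n
        self.history = []

    def find(self, x):
        # no path compression: keeping the trees intact makes undo trivial
        while self.parent[x] != x:
            x = self.parent[x]
        return x

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            self.history.append(None)  # record a no-op union
            return
        if self.rank[rx] < self.rank[ry]:
            rx, ry = ry, rx            # attach the shallower tree below
        self.history.append((ry, self.rank[rx]))
        self.parent[ry] = rx
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1

    def deunion(self):
        """Undo the most recent union (no-op unions included)."""
        entry = self.history.pop()
        if entry is None:
            return
        child, old_rank = entry
        root = self.parent[child]
        self.parent[child] = child     # detach the merged root
        self.rank[root] = old_rank     # restore the old rank
```

Unlimited backtracking, as studied in the later sections, generalises this to undoing any prefix of the union sequence, which changes the achievable time bounds.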
A Treatment of Negative Descriptions of Typed Feature Structures
Abstract
A formal treatment of typed feature structures (TFSs) is developed to augment TFSs so that negative descriptions of them can be treated. Negative descriptions of TFSs can make linguistic descriptions compact and thus easy to understand. Negative descriptions can be classified into three primitive negative descriptions: (1) negations of type symbols, (2) negations of feature existences, and (3) negations of feature-address value agreements. The formalization proposed in this paper is based on Aït-Kaci's complex terms. The first description is treated by extending type symbol lattices to include complement type symbols. The second and third are treated by augmenting term structures with structures representing these negations. Algorithms for augmented-TFS unification have been developed using graph unification, and programs using these algorithms have been written in Common Lisp.
New, 2012
Abstract
We propose a theoretical device for modeling the creation of new indiscernible semantic objects during program execution. The method fits well with the semantics of imperative, functional, and object-oriented languages and promotes equational reasoning about higher-order state.