Results 1 – 10 of 71
Flow-Sensitive Type Qualifiers
, 2002
Abstract

Cited by 360 (29 self)
We present a system for extending standard type systems with flow-sensitive type qualifiers. Users annotate their programs with type qualifiers, and inference checks that the annotations are correct. In our system only the type qualifiers are modeled flow-sensitively; the underlying standard types are unchanged, which allows us to obtain an efficient constraint-based inference algorithm that integrates flow-insensitive alias analysis, effect inference, and ideas from linear type systems to support strong updates. We demonstrate the usefulness of flow-sensitive type qualifiers by finding a number of new locking bugs in the Linux kernel.
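The core idea can be sketched in a few lines: a qualifier (here, the lock state) changes from program point to program point while the underlying type stays fixed. This is an illustrative Python sketch, not the paper's actual C-based system; the statement encoding and error messages are invented for the example.

```python
# Illustrative sketch (not the paper's system): a flow-sensitive checker
# that tracks one qualifier ("locked"/"unlocked") through a straight-line
# statement list, while the underlying types stay unchanged.
def check_lock_qualifiers(stmts, initial="unlocked"):
    state = initial
    errors = []
    for i, op in enumerate(stmts):
        if op == "lock":
            if state == "locked":
                errors.append((i, "double acquire"))
            state = "locked"          # strong update of the qualifier
        elif op == "unlock":
            if state == "unlocked":
                errors.append((i, "release while unlocked"))
            state = "unlocked"
    return errors
```

Running the checker on `["lock", "lock"]` flags the second statement, which is exactly the shape of locking bug the paper reports finding in the Linux kernel.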
Implementing Layered Designs with Mixin Layers
 In ECOOP ’98: Proceedings of the 12th European Conference on Object-Oriented Programming
, 1998
Abstract

Cited by 151 (20 self)
Abstract. Mixin layers are a technique for implementing layered object-oriented subclasses (mixin classes) but scaled to a multiple-class granularity. We describe mixin layers from a programming language viewpoint, discuss checking the consistency of a mixin layer composition, and analyze the language support issues involved.
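The multiple-class granularity is the distinctive point: one layer refines several collaborating classes at once. The paper works with C++ templates; the following is a hedged Python sketch of the same shape, with invented class names (`Node`, `Graph`) for illustration.

```python
# Sketch of a mixin layer: a function from a lower layer (a bundle of
# classes) to a refined bundle, refining Node and Graph together.
class BaseLayer:
    class Node:
        def __init__(self):
            self.edges = []

    class Graph:
        def __init__(self):
            self.nodes = []

        def add(self, node):
            self.nodes.append(node)

def counting_layer(Lower):
    """Refine Lower's collaborating classes at multiple-class granularity."""
    class Node(Lower.Node):
        def __init__(self):
            super().__init__()
            self.visits = 0

    class Graph(Lower.Graph):
        def __init__(self):
            super().__init__()
            self.count = 0

        def add(self, node):
            self.count += 1      # layer-specific behavior
            super().add(node)    # delegate to the layer below

    return type("CountingLayer", (), {"Node": Node, "Graph": Graph})

App = counting_layer(BaseLayer)  # compose the two layers
```

Composition consistency checking, which the paper discusses, would amount here to verifying that each layer's inner classes actually find matching classes in the layer below.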
Using dependent types to express modular structure
 In Thirteenth ACM Symposium on Principles of Programming Languages
, 1986
Abstract

Cited by 125 (5 self)
Several related typed languages for modular programming and data abstraction have been proposed recently, including Pebble, SOL, and ML modules. We review and compare the basic type-theoretic ideas behind these languages and evaluate how they ...
An Overview of λProlog
 In Fifth International Logic Programming Conference
, 1988
Abstract

Cited by 99 (34 self)
Abstract: λProlog is a logic programming language that extends Prolog by incorporating notions of higher-order functions, λ-terms, higher-order unification, polymorphic types, and mechanisms for building modules and secure abstract data types. These new features are provided in a principled fashion by extending the classical first-order theory of Horn clauses to the intuitionistic higher-order theory of hereditary Harrop formulas. The justification for considering this extension a satisfactory logic programming language is provided through the proof-theoretic notion of a uniform proof. The correspondence between each extension to Prolog and the new features in the stronger logical theory is discussed. Also discussed are various aspects of an experimental implementation of λProlog. Appears in the Fifth International Conference and Symposium on Logic Programming, 15–19 August 1988, Seattle, Washington. This is a slightly corrected version of ...
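One of the hereditary Harrop extensions can be illustrated in a tiny propositional interpreter: goals may contain implications, which add a clause to the program hypothetically while the subgoal is proved. This is only a sketch of that one feature, in Python with an invented encoding; real λProlog also has quantified goals, λ-terms, and higher-order unification.

```python
# Toy propositional sketch of hereditary-Harrop goal solving.
# A program is a list of clauses (head_atom, body_goal).
# Goals: ("true",), ("atom", name), ("and", g1, g2), ("imp", clause, g).
def solve(program, goal):
    if goal == ("true",):
        return True
    tag = goal[0]
    if tag == "atom":                       # back-chain on matching clauses
        return any(solve(program, body)
                   for head, body in program if head == goal[1])
    if tag == "and":
        return solve(program, goal[1]) and solve(program, goal[2])
    if tag == "imp":                        # clause => goal: assume, then prove
        return solve(program + [goal[1]], goal[2])
    raise ValueError(f"unknown goal {goal!r}")

# b :- a.  The atom a has no clause of its own.
program = [("b", ("atom", "a"))]
```

From `program` alone, `b` is not provable; the implicational goal `a => b` succeeds because `a` is assumed for the duration of the subproof, which is the intuitionistic reading the abstract describes.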
Types for Dyadic Interaction
, 1993
Abstract

Cited by 83 (10 self)
We formulate a typed formalism for concurrency where types denote freely composable structure of dyadic interaction in the symmetric scheme. The resulting calculus is a typed reconstruction of name-passing process calculi. Systems with both the explicit and implicit typing disciplines, where types form a simple hierarchy of types, are presented, which are proved to be in accordance with each other. A typed variant of bisimilarity is formulated and it is shown that typed β-equality has a clean embedding in the bisimilarity. Name reference structure induced by the simple hierarchy of types is studied, which fully characterises the typable terms in the set of untyped terms. It turns out that the name reference structure results in the deadlock-free property for a subset of terms with a certain regular structure, showing behavioural significance of the simple type discipline.
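The "freely composable structure of dyadic interaction" can be pictured concretely: each endpoint's type lists the actions it performs, and two endpoints compose safely exactly when their types are dual. This Python sketch uses an invented list encoding, not the paper's notation.

```python
# Sketch: a dyadic interaction type as a list of (direction, payload)
# actions, with "!" = send and "?" = receive.
def dual(t):
    """The symmetric counterpart: every send becomes a receive and vice versa."""
    flip = {"!": "?", "?": "!"}
    return [(flip[d], payload) for d, payload in t]

def compatible(t1, t2):
    """Two endpoints interact without mismatch iff each action meets its dual."""
    return t2 == dual(t1)
```

A client typed "send an int, then receive a bool" composes with a server typed "receive an int, then send a bool", but not with another copy of itself; deadlock-freedom results of the kind the abstract mentions build on exactly this pairing of dual actions.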
TPS: A theorem proving system for classical type theory
 Journal of Automated Reasoning
, 1996
Abstract

Cited by 71 (6 self)
This is a description of TPS, a theorem proving system for classical type theory (Church’s typed λ-calculus). TPS has been designed to be a general research tool for manipulating wffs of first- and higher-order logic, and searching for proofs of such wffs interactively or automatically, or in a combination of these modes. An important feature of TPS is the ability to translate between expansion proofs and natural deduction proofs. Examples of theorems which TPS can prove completely automatically are given to illustrate certain aspects of TPS’s behavior and problems of theorem proving in higher-order logic.
Polymorphic versus monomorphic flow-insensitive points-to analysis for C
 In Static Analysis Symposium
, 2000
Abstract

Cited by 68 (3 self)
We carry out an experimental analysis for two of the design dimensions of flow-insensitive points-to analysis for C: polymorphic versus monomorphic and equality-based versus inclusion-based. Holding other analysis parameters fixed, we measure the precision of the four design points on a suite of benchmarks of up to 90,000 abstract syntax tree nodes. Our experiments show that the benefit of polymorphism varies significantly with the underlying monomorphic analysis. For our equality-based analysis, adding polymorphism greatly increases precision, while for our inclusion-based analysis, adding polymorphism hardly makes any difference. We also gain some insight into the nature of polymorphism in points-to analysis of C. In particular, we find considerable polymorphism available in function parameters, but little or no polymorphism in function results, and we show how this observation explains our results.
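The equality-based versus inclusion-based distinction can be made concrete with toy monomorphic versions of the two base analyses (illustrative sketches, not the authors' implementations). Inclusion-based (Andersen-style) analysis propagates subset constraints to a fixed point; equality-based (Steensgaard-style) analysis unifies the targets of both sides of a copy, trading precision for near-linear time.

```python
# Statements: ("addr", p, x) means p = &x; ("copy", p, q) means p = q.
from collections import defaultdict

def andersen(stmts):
    """Inclusion-based: pts(p) >= pts(q) for each copy, to a fixed point."""
    pts = defaultdict(set)
    for op, a, b in stmts:
        if op == "addr":
            pts[a].add(b)
    changed = True
    while changed:
        changed = False
        for op, a, b in stmts:
            if op == "copy" and not pts[b] <= pts[a]:
                pts[a] |= pts[b]
                changed = True
    return pts

def steensgaard(stmts):
    """Equality-based: a copy unifies the two points-to classes."""
    parent, target = {}, {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    def union(a, b):
        a, b = find(a), find(b)
        if a == b:
            return
        ta, tb = target.pop(a, None), target.pop(b, None)
        parent[a] = b
        for t in (ta, tb):                  # re-attach and merge targets
            if t is not None:
                add_target(b, t)
    def add_target(v, t):
        v = find(v)
        if v in target:
            union(target[v], t)
        else:
            target[v] = find(t)
    names = set()
    for op, a, b in stmts:
        names |= {a, b}
        if op == "addr":
            add_target(a, b)
        else:                               # copy: unify both targets
            ra, rb = find(a), find(b)
            ta, tb = target.get(ra), target.get(rb)
            if ta is not None and tb is not None:
                union(ta, tb)
            elif tb is not None:
                target[ra] = find(tb)
            elif ta is not None:
                target[rb] = find(ta)
    def pts(v):
        t = target.get(find(v))
        if t is None:
            return set()
        r = find(t)
        return {w for w in names if find(w) == r}
    return pts
```

On `x = &a; y = &b; x = y`, the inclusion-based analysis keeps `pts(y) = {b}`, while the equality-based one merges `a` and `b` into one class, so `pts(y) = {a, b}`: the precision gap whose interaction with polymorphism the paper measures.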
Concurrent Clean
, 1991
Abstract

Cited by 60 (4 self)
Concurrent Clean is an experimental, lazy, higher-order parallel functional programming language based on term graph rewriting. An important difference with other languages is that in Clean graphs are manipulated and not terms. This can be used by the programmer to control communication and sharing of computation. Cyclic structures can be defined. Concurrent Clean furthermore allows the programmer to control the (parallel) order of evaluation to make efficient evaluation possible. With the help of sequential annotations the default lazy evaluation can be locally changed into eager evaluation. The language enables the definition of partially strict data structures which make a whole new class of algorithms feasible in a functional language. A powerful and fast strictness analyser is incorporated in the system. The quality of the code generated by the Clean compiler has been greatly improved such that it is one of the best code generators for a lazy functional language. Two very powerful parall...
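The lazy-versus-eager distinction the annotations control can be sketched with thunks: under lazy evaluation a computation is delayed and, once forced, its result is shared, while a strictness annotation would make it run immediately. This is an illustrative Python sketch, not Clean code.

```python
# Sketch of lazy evaluation with sharing, versus eager evaluation.
class Thunk:
    """A delayed computation, evaluated at most once and then shared."""
    def __init__(self, fn):
        self._fn, self._done, self._value = fn, False, None

    def force(self):
        if not self._done:
            self._value, self._done = self._fn(), True
        return self._value

calls = []                       # record which arguments were computed
def square(x):
    calls.append(x)
    return x * x

lazy = Thunk(lambda: square(3))  # default lazy: nothing computed yet
eager = square(4)                # "strictness annotation": computed now
```

Before `lazy.force()` is called, only the eager computation has run; forcing the thunk twice computes `square(3)` only once, which is the sharing of computation the abstract attributes to graph (rather than term) manipulation.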
Mechanizing Programming Logics in Higher Order Logic
 in Current Trends in Hardware Verification and Automated Theorem Proving, ed. P.A. Subrahmanyam and Graham Birtwistle
, 1989
Abstract

Cited by 59 (3 self)
Formal reasoning about computer programs can be based directly on the semantics of the programming language, or done in a special purpose logic like Hoare logic. The advantage of the first approach is that it guarantees that the formal reasoning applies to the language being used (it is well known, for example, that Hoare’s assignment axiom fails to hold for most programming languages). The advantage of the second approach is that the proofs can be more direct and natural. In this paper, an attempt to get the advantages of both approaches is described. The rules of Hoare logic are mechanically derived from the semantics of a simple imperative programming language (using the HOL system). These rules form the basis for a simple program verifier in which verification conditions are generated by LCF-style tactics whose validations use the derived Hoare rules. Because Hoare logic is derived, rather than postulated, it is straightforward to mix semantic and axiomatic reasoning. It is also straightforward to combine the constructs of Hoare logic with other application-specific notations. This is briefly illustrated for various logical constructs, including termination statements, VDM-style ‘relational’ correctness specifications, weakest precondition statements and dynamic logic formulae. The theory underlying the work presented here is well known. Our contribution is to propose a way of mechanizing this theory in a way that makes certain practical details work out smoothly.
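The verification-condition generation the paper mechanizes rests on rules like Hoare's assignment axiom, {Q[e/x]} x := e {Q}. A minimal sketch of that rule (over plain strings, purely for illustration; the HOL-based verifier described above works on logical terms, not text) looks like this:

```python
# Sketch of weakest-precondition generation for assignments:
# wp(x := e, Q) is Q with e substituted for x.
import re

def wp_assign(var, expr, post):
    """Hoare assignment axiom: {Q[e/x]} x := e {Q}, by textual substitution."""
    return re.sub(rf"\b{re.escape(var)}\b", f"({expr})", post)

def wp_seq(assignments, post):
    """Weakest precondition of a sequence, computed back to front."""
    for var, expr in reversed(assignments):
        post = wp_assign(var, expr, post)
    return post
```

The benefit of deriving rather than postulating such rules, as the abstract notes, is that this substitution step is then guaranteed to be sound for the actual language semantics, instead of failing silently for languages with aliasing or side effects.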
Classical Logic and Computation
, 2000
Abstract

Cited by 58 (7 self)
This thesis contains a study of the proof theory of classical logic and addresses the problem of giving a computational interpretation to classical proofs. This interpretation aims to capture features of computation that go beyond what can be expressed in intuitionistic logic. We introduce several strongly normalising cut-elimination procedures for classical logic. Our procedures are less restrictive than previous strongly normalising procedures, while at the same time retaining the strong normalisation property, which various standard cut-elimination procedures lack. In order to apply proof techniques from term rewriting, including symmetric reducibility candidates and recursive path ordering, we develop term annotations for sequent proofs of classical logic. We then present a sequence-conclusion natural deduction calculus for classical logic and study the correspondence between cut-elimination and normalisation. In contrast to earlier work, which analysed this correspondence in various fragments of intuitionistic logic, we establish the correspondence in classical logic. Finally, we study applications of cut-elimination. In particular, we analyse several classical proofs with respect to their behaviour under cut-elimination. Because our cut-elimination procedures impose fewer constraints than previous procedures, we are able to show how a fragment of classical logic can be seen as a typing system for the simply-typed lambda calculus extended with an erratic choice operator. As a pleasing consequence, we can give a simple computational interpretation to Lafont's example.