Results 1–10 of 15
A call-by-name lambda-calculus machine
 In Higher-Order and Symbolic Computation
Cited by 25 (0 self)
We present, in this paper, a particularly simple lazy machine which runs programs written in the λ-calculus. It was introduced by the present writer more than twenty years ago. It has since been used and implemented
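The machine described here is known as the Krivine machine. A minimal sketch, assuming de Bruijn-indexed terms and a tuple representation chosen here for illustration (not the paper's own notation):

```python
# Minimal Krivine machine: call-by-name evaluation to weak head normal form.
# Terms: ("var", n) | ("lam", body) | ("app", fun, arg), de Bruijn indices.
# A closure is a pair (term, environment); the environment is a list of closures.

def krivine(term):
    env, stack = [], []              # current environment and argument stack
    while True:
        tag = term[0]
        if tag == "app":             # push the argument as an unevaluated closure
            stack.append((term[2], env))
            term = term[1]
        elif tag == "lam":
            if not stack:            # no pending argument: weak head normal form
                return term, env
            term = term[1]           # bind the top closure to index 0
            env = [stack.pop()] + env
        elif tag == "var":           # jump into the closure bound to this index
            term, env = env[term[1]]

# (λx. x) (λy. y) reduces to λy. y
ident = ("lam", ("var", 0))
result, _ = krivine(("app", ident, ident))
```

Arguments are pushed unevaluated and only forced when a variable reaches head position, which is exactly the call-by-name discipline of the machine.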
Semantic Types: A Fresh Look at the Ideal Model for Types
, 2004
Cited by 23 (2 self)
We present a generalization of the ideal model for recursive polymorphic types. Types are defined as sets of terms instead of sets of elements of a semantic domain. Our proof of the existence of types (computed by fixpoint of a typing operator) does not rely on metric properties, but on the fact that the identity is the limit of a sequence of projection terms. This establishes a connection with the work of Pitts on relational properties of domains. This also suggests that ideals are better understood as closed sets of terms defined by orthogonality with respect to a set of contexts.
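The closing idea, types as closed sets of terms defined by orthogonality against contexts, can be sketched on finite toy sets; the universe and the safety relation below are illustrative assumptions, not the paper's model:

```python
# Toy biorthogonality: terms and contexts are plain labels, and a term is
# orthogonal to a context when their combination is deemed "safe".
TERMS = {"t1", "t2", "t3"}
CONTEXTS = {"c1", "c2"}
SAFE = {("t1", "c1"), ("t2", "c1"), ("t2", "c2")}   # toy orthogonality relation

def orth_terms(S):
    """Contexts orthogonal to every term in S."""
    return {c for c in CONTEXTS if all((t, c) in SAFE for t in S)}

def orth_ctxs(P):
    """Terms orthogonal to every context in P."""
    return {t for t in TERMS if all((t, c) in SAFE for c in P)}

def biorth(S):
    """Biorthogonal closure of a set of terms: the 'type' it generates."""
    return orth_ctxs(orth_terms(S))
```

On this toy relation, `biorth({"t1"})` also captures `"t2"`, because `"t2"` passes every context that `"t1"` passes; biorthogonal closure is extensive and idempotent, which is what makes closed sets of terms behave like types.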
Recursive Polymorphic Types and Parametricity in an Operational Framework
, 2005
Cited by 22 (1 self)
We construct a realizability model of recursive polymorphic types, starting from an untyped language of terms and contexts. An orthogonality relation e ⊥ π indicates when a term e and a context π may be safely combined in the language. Types are interpreted as sets of terms closed by biorthogonality. Our main result states that recursive types are approximated by converging sequences of interval types. Our proof is based on a "type-directed" approximation technique, which departs from the "language-directed" approximation technique developed by MacQueen, Plotkin and Sethi in the ideal model. We thus keep the language elementary (a call-by-name λ-calculus) and unstratified (no typecase, no reduction labels). We also include a short account of parametricity, based on an orthogonality relation between quadruples of terms and contexts.
Semantic Barbs and Biorthogonality
Cited by 16 (1 self)
We use the framework of biorthogonality to introduce a novel semantic definition of the concept of barb (basic observable) for process calculi. We develop a uniform basic theory of barbs and demonstrate its robustness by showing that it gives rise to the correct observables in specific process calculi which model synchronous, asynchronous and broadcast communication regimes.
Subtyping Union Types
, 2004
Cited by 8 (1 self)
Subtyping rules can be fairly complex for union types, due to interactions with other types, such as function types. Furthermore, these interactions turn out to depend on the calculus considered: for instance, a call-by-value calculus and a call-by-name calculus will have different possible subtyping rules. In order to abstract ourselves away from this dependence, we consider a fairly large class of calculi. We define types in a semantic fashion, as sets of terms. Then, a type can be a subtype of another type if its denotation is included in the denotation of the other type. We first consider a simple type system with union, function, pair and constant types. Using inference rules, we specify a subtyping relation which is both sound and complete with respect to the class of calculi. We then extend this result to a richer type system with ML-style polymorphism and type constructors. We expect this framework to allow the study of subtyping relations that only hold for some calculi by restricting the class considered, and to allow the study of subtyping relations for richer type systems by enriching the class.
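The semantic view of subtyping, types denote sets and subtyping is inclusion, can be illustrated on a tiny finite universe of constants and pairs; the universe and constructors below are illustrative assumptions, far simpler than the calculi the paper treats:

```python
from itertools import product

# Tiny value universe: constants, plus pairs of constants (one level deep).
CONSTS = {"true", "false", "zero"}

def denote(ty):
    """Interpret a type expression as its set of values."""
    tag = ty[0]
    if tag == "const":                       # singleton constant type
        return {ty[1]}
    if tag == "union":                       # union is set union
        return denote(ty[1]) | denote(ty[2])
    if tag == "pair":                        # pair is Cartesian product
        return set(product(denote(ty[1]), denote(ty[2])))
    raise ValueError(tag)

def subtype(s, t):
    """Semantic subtyping: inclusion of denotations."""
    return denote(s) <= denote(t)

bool_ty = ("union", ("const", "true"), ("const", "false"))
```

In this set-theoretic reading, interactions such as the distribution of pairs over unions, one source of complexity in the syntactic rules, hold automatically: `(bool × zero)` denotes the same set as `(true × zero) ∪ (false × zero)`.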
An algebraic process calculus
 In Proceedings of the Twenty-Third Annual IEEE Symposium on Logic in Computer Science (LICS)
, 2008
Cited by 5 (3 self)
We present an extension of the πI-calculus with formal sums of terms. The study of the properties of this sum reveals that its neutral element can be used to make assumptions about the behaviour of the environment of a process. Furthermore, the formal sum appears as a fundamental construct that can be used to decompose both internal and external choice. From these observations, we derive an enriched calculus that enjoys a confluent reduction which preserves the testing semantics of processes. This system is shown to be strongly normalising for terms without replication, and the study of its normal forms provides a fully abstract trace semantics for testing of πI processes.
The Scott model of Linear Logic is the extensional collapse of its relational model
, 2011
Cited by 3 (1 self)
We show that the extensional collapse of the relational model of linear logic is the model of prime-algebraic complete lattices, a natural extension to linear logic of the well-known Scott semantics of the lambda-calculus.
Disjunctive Normal Forms and Local Exceptions
, 2003
Cited by 3 (2 self)
All classical λ-terms typable with disjunctive normal forms are shown to share a common computational behavior: they implement a local exception handling mechanism whose exact workings depend on the tautology. Equivalent and more efficient control combinators are described through a specialized sequent calculus and shown to be correct.
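The flavor of such a local exception mechanism can be sketched with an escaping one-shot continuation, in the style of a control operator typed by Peirce's law; this rendering in terms of ordinary exceptions is an illustrative analogy, not the paper's sequent-calculus combinators:

```python
def callcc(f):
    """One-shot escaping continuation implemented with a local exception.
    Invoking k inside f aborts the rest of f and returns k's argument
    as the overall result of callcc."""
    class Escape(Exception):            # fresh, local exception per call
        def __init__(self, value):
            self.value = value
    def k(value):
        raise Escape(value)
    try:
        return f(k)                     # normal return if k is never used
    except Escape as e:
        return e.value                  # early exit through the handler

# Normal return when the continuation is unused:
assert callcc(lambda k: 5) == 5
# Early exit: k(41) discards the pending `1 +` context.
assert callcc(lambda k: 1 + k(41)) == 41
```

Because each call installs its own exception class, the handler only catches its own continuation: the exception handling is local to one `callcc`, which mirrors the locality of the mechanism described in the abstract.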