Results 1 - 7 of 7
From Operational to Denotational Semantics
 In MFPS 1991
, 1989
Abstract

Cited by 18 (6 self)
In this paper it is shown how operational semantic methods may be naturally extended to encompass many of the concepts of denotational semantics. This work builds on the standard development of an operational semantics as an interpreter and operational equivalence. The key addition is an operational ordering on sets of terms. From properties of this ordering a closure construction directly yields a fully abstract continuous cpo model. Furthermore, it is not necessary to construct the cpo, for principles such as soundness of fixed-point induction may be obtained by direct reasoning from this new ordering. The end result is that traditional denotational techniques may be applied in a purely operational setting in a natural fashion, a matter of practical importance for developing semantics of realistic programming languages. 1 Introduction This paper aims to accomplish a degree of unification between operational and denotational approaches to programming language semantics by recasting d...
Compositional Characterizations of λ-terms using Intersection Types (Extended Abstract)
, 2000
Abstract

Cited by 17 (6 self)
We show how to characterize compositionally a number of evaluation properties of λ-terms using Intersection Type assignment systems. In particular, we focus on termination properties, such as strong normalization, normalization, head normalization, and weak head normalization. We also consider the persistent versions of such notions. By way of example, we also consider another evaluation property, unrelated to termination, namely reducibility to a closed term. Many of these characterization results are new, to our knowledge, or else they streamline, strengthen, or generalize earlier results in the literature. The completeness parts of the characterizations are proved uniformly for all the properties, using a set-theoretical semantics of intersection types over suitable kinds of stable sets. This technique generalizes Krivine's and Mitchell's methods for strong normalization to other evaluation properties.
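The termination properties this abstract distinguishes (head normalization vs. full normalization) can be made concrete with a small sketch. The following Python term representation and head reducer are purely illustrative (not from the paper); since normalization is undecidable, the `fuel` bound makes `has_hnf` a one-sided check only. It shows that Ω = (λx.x x)(λx.x x) has no head normal form, while λx.x Ω is head-normal despite not being normalizing.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Var:
    name: str

@dataclass(frozen=True)
class Lam:
    param: str
    body: object

@dataclass(frozen=True)
class App:
    fn: object
    arg: object

def subst(t, x, s):
    # naive substitution; safe here because the examples use distinct names
    if isinstance(t, Var):
        return s if t.name == x else t
    if isinstance(t, Lam):
        return t if t.param == x else Lam(t.param, subst(t.body, x, s))
    return App(subst(t.fn, x, s), subst(t.arg, x, s))

def head_step(t):
    """One head-reduction step, or None if t is in head normal form."""
    if isinstance(t, Lam):
        inner = head_step(t.body)
        return None if inner is None else Lam(t.param, inner)
    if isinstance(t, App):
        if isinstance(t.fn, Lam):          # head redex (λx.M) N
            return subst(t.fn.body, t.fn.param, t.arg)
        inner = head_step(t.fn)
        return None if inner is None else App(inner, t.arg)
    return None                             # a variable is head-normal

def has_hnf(t, fuel=50):
    """Does t reach head normal form within `fuel` head steps?"""
    for _ in range(fuel):
        nxt = head_step(t)
        if nxt is None:
            return True
        t = nxt
    return False

delta = Lam("x", App(Var("x"), Var("x")))
omega = App(delta, delta)                   # Ω: no head normal form
k = Lam("x", Lam("y", Var("x")))            # K: already head-normal
hnf_not_nf = Lam("x", App(Var("x"), omega)) # head-normal, yet not normalizing
```

Here `has_hnf(hnf_not_nf)` holds even though the term contains the divergent Ω, which is exactly the gap between head normalization and normalization that the paper's intersection type characterizations separate.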
Simple easy terms
 In Intersection Types and Related Systems, volume 70 of Electronic Notes in Theoretical Computer Science
, 2002
Abstract

Cited by 12 (4 self)
Dipartimento di Informatica, Università di Venezia
Intersection Types and Lambda Models
, 2005
Abstract

Cited by 11 (1 self)
Invariance of interpretation by β-conversion is one of the minimal requirements for any standard model for the λ-calculus. With the intersection type systems being a general framework for the study of semantic domains for the λ-calculus, the present paper provides a (syntactic) characterisation of the above-mentioned requirement in terms of characterisation results for intersection type assignment systems.
A First Order Logic of Effects
 Theoretical Computer Science
, 1996
Abstract

Cited by 4 (0 self)
In this paper we describe some of our progress towards an operational implementation of a modern programming logic. The logic is inspired by the variable type systems of Feferman, and is designed for reasoning about imperative functional programs. The logic goes well beyond traditional programming logics, such as Hoare's logic and Dynamic logic, in its expressibility, yet is less problematic to encode into higher order logics. The main focus of the paper is to present an axiomatization of the base first order theory. 1 Introduction VTLoE [34, 23, 35, 37, 24] is a logic for reasoning about imperative functional programs, inspired by the variable type systems of Feferman. These systems are two-sorted theories of operations and classes initially developed for the formalization of constructive mathematics [12, 13] and later applied to the study of purely functional languages [14, 15]. VTLoE builds upon recent advances in the semantics of languages with effects [16, 19, 28, 32, 33] and go...
The Call-By-Value Lambda-Calculus: A Semantic Investigation
Abstract

Cited by 4 (0 self)
In this work we present a categorical approach for modeling the pure (i.e., without constants) call-by-value λ-calculus, defined by Plotkin as a restriction of the classical one. In particular, we study the properties a category must enjoy to give rise to a model of such a language. This definition is general enough to capture models in different settings. 1 Introduction The call-by-value λ-calculus is a restriction of the classical λ-calculus (β-calculus, for short), based on the notion of value. A value is a term which is either a variable or an abstraction. In particular, the call-by-value calculus is obtained from the classical one by restricting the evaluation rule (the β-rule) to redexes whose operand is a value. This leads to a call-by-value parameter-passing mechanism, a feature present in many real programming languages, where an evaluation is call-by-value if it evaluates parameters before they are passed. This feature, together with lazy evaluation, wh...
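The βv restriction described in this abstract can be sketched operationally. The following Python is an illustrative small-step evaluator (not the paper's categorical construction): a redex (λx.M) N fires only when the operand N is a value, i.e. a variable or an abstraction.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Var:
    name: str

@dataclass(frozen=True)
class Lam:
    param: str
    body: object

@dataclass(frozen=True)
class App:
    fn: object
    arg: object

def subst(t, x, s):
    # naive substitution; safe here because the examples use distinct names
    if isinstance(t, Var):
        return s if t.name == x else t
    if isinstance(t, Lam):
        return t if t.param == x else Lam(t.param, subst(t.body, x, s))
    return App(subst(t.fn, x, s), subst(t.arg, x, s))

def is_value(t):
    # in the call-by-value calculus, a value is a variable or an abstraction
    return isinstance(t, (Var, Lam))

def cbv_step(t):
    """One left-to-right call-by-value step, or None if none applies."""
    if isinstance(t, App):
        if not is_value(t.fn):              # evaluate the operator first
            nxt = cbv_step(t.fn)
            return None if nxt is None else App(nxt, t.arg)
        if not is_value(t.arg):             # then the operand
            nxt = cbv_step(t.arg)
            return None if nxt is None else App(t.fn, nxt)
        if isinstance(t.fn, Lam):           # βv: operand is now a value
            return subst(t.fn.body, t.fn.param, t.arg)
    return None

k = Lam("x", Lam("y", Var("x")))            # K = λx.λy.x
delta = Lam("z", App(Var("z"), Var("z")))
omega = App(delta, delta)                   # Ω diverges under any strategy
```

For example, `K a` reduces to `λy.a` in one βv step, whereas `K Ω` loops forever: Ω never becomes a value, so the outer βv redex never fires, even though under unrestricted β-reduction K would simply discard Ω.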
Compositional Characterisations of λ-terms using Intersection Types
, 2003
Abstract

Cited by 1 (0 self)
We show how to characterise compositionally a number of evaluation properties of λ-terms using Intersection Type assignment systems. In particular, we focus on termination properties, such as strong normalisation, normalisation, head normalisation, and weak head normalisation. We also consider the persistent versions of such notions. By way of example, we also consider another evaluation property, unrelated to termination, namely reducibility to a closed term. Many of these characterisation results are new, to our knowledge, or else they streamline, strengthen, or generalise earlier results in the literature. The completeness parts of the characterisations are proved uniformly for all the properties, using a set-theoretical semantics of intersection types over suitable kinds of stable sets. This technique generalises Krivine's and Mitchell's methods for strong normalisation to other evaluation properties.