Results 1–10 of 10
Parameter-Passing and the Lambda Calculus
, 1991
"... The choice of a parameterpassing technique is an important decision in the design of a highlevel programming language. To clarify some of the semantic aspects of the decision, we develop, analyze, and compare modifications of the calculus for the most common parameterpassing techniques, i.e., ca ..."
Abstract

Cited by 186 (23 self)
 Add to MetaCart
The choice of a parameter-passing technique is an important decision in the design of a high-level programming language. To clarify some of the semantic aspects of the decision, we develop, analyze, and compare modifications of the calculus for the most common parameter-passing techniques, i.e., call-by-value and call-by-name combined with pass-by-worth and pass-by-reference, respectively. More specifically, for each parameter-passing technique we provide 1. a program rewriting semantics for a language with side-effects and first-class procedures based on the respective parameter-passing technique; 2. an equational theory that is derived from the rewriting semantics in a uniform manner; 3. a formal analysis of the correspondence between the calculus and the semantics; and 4. a strong normalization theorem for the imperative fragment of the theory (when applicable). A comparison of the various systems reveals that Algol's call-by-name indeed satisfies the well-known β rule of the orig...
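The call-by-value/call-by-name contrast this abstract analyses can be illustrated operationally. The following is a minimal sketch, not the paper's formal calculi: a toy substitution-based interpreter in which the only difference between the two techniques is whether the argument is evaluated before the call. All names (`evaluate`, `subst`, the term classes) are illustrative.

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class Var:
    name: str

@dataclass
class Lam:
    param: str
    body: Any

@dataclass
class App:
    fn: Any
    arg: Any

def subst(term, name, value):
    # Capture-naive substitution; adequate for the closed terms used below.
    if isinstance(term, Var):
        return value if term.name == name else term
    if isinstance(term, Lam):
        if term.param == name:
            return term
        return Lam(term.param, subst(term.body, name, value))
    return App(subst(term.fn, name, value), subst(term.arg, name, value))

def evaluate(term, strategy="cbv"):
    if isinstance(term, App):
        fn = evaluate(term.fn, strategy)
        # call-by-value evaluates the argument before the call;
        # call-by-name passes the argument expression unevaluated
        arg = evaluate(term.arg, strategy) if strategy == "cbv" else term.arg
        return evaluate(subst(fn.body, fn.param, arg), strategy)
    return term  # variables and abstractions are values here

# (λx.λy.y) Ω: call-by-name discards the diverging argument Ω,
# while call-by-value would loop evaluating it first.
omega = App(Lam("x", App(Var("x"), Var("x"))),
            Lam("x", App(Var("x"), Var("x"))))
discard = Lam("x", Lam("y", Var("y")))
result = evaluate(App(discard, omega), strategy="cbn")
```

Under `strategy="cbv"` the same application would recurse forever on Ω, which is the observable difference the two calculi formalise.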
On reduction-based process semantics
 Theoretical Computer Science
, 1995
"... Abstract. A formulation of semantic theories for processes which is based on reduction relation and equational reasoning is studied. The new construction can induce meaningful theories for processes, both in strong and weak settings. The resulting theories in many cases coincide with, and sometimes ..."
Abstract

Cited by 144 (21 self)
 Add to MetaCart
Abstract. A formulation of semantic theories for processes which is based on reduction relation and equational reasoning is studied. The new construction can induce meaningful theories for processes, both in strong and weak settings. The resulting theories in many cases coincide with, and sometimes generalise, observation-based formulation of behavioural equivalence. The basic construction of reduction-based theories is studied, taking a simple name-passing calculus called the ν-calculus as an example. Results on other calculi are also briefly discussed.
Relative Normalization in Deterministic Residual Structures
 In: Proc. of the 19th International Colloquium on Trees in Algebra and Programming, CAAP'96, Springer LNCS
, 1996
"... . This paper generalizes the Huet and L'evy theory of normalization by neededness to an abstract setting. We define Stable Deterministic Residual Structures (SDRS) and Deterministic Family Structures (DFS) by axiomatizing some properties of the residual relation and the family relation on redexes in ..."
Abstract

Cited by 17 (13 self)
 Add to MetaCart
This paper generalizes the Huet and Lévy theory of normalization by neededness to an abstract setting. We define Stable Deterministic Residual Structures (SDRS) and Deterministic Family Structures (DFS) by axiomatizing some properties of the residual relation and the family relation on redexes in an Abstract Rewriting System. We present two proofs of the Relative Normalization Theorem, one for SDRSs for regular stable sets, and another for DFSs for all stable sets of desirable 'normal forms'. We further prove the Relative Optimality Theorem for DFSs. We extend this result to deterministic Computation Structures, which are deterministic Event Structures with an extra relation expressing self-essentiality.

1 Introduction

A normalizable term, in a rewriting system, may have an infinite reduction, so it is important to have a normalizing strategy which enables one to construct reductions to normal form. It is well known that the leftmost-outermost strategy is normalizing in the calc...
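The normalizing role of outermost strategies mentioned in the closing sentence can be made concrete. Below is a minimal illustration, not the paper's abstract structures: tuple-encoded terms, capture-naive substitution, and a leftmost-outermost stepper that finds a normal form an innermost strategy would miss.

```python
# Toy terms: ('var', x) | ('lam', x, body) | ('app', f, a) -- illustrative only.

def subst(t, x, v):
    # Capture-naive substitution; fine for the closed example below.
    if t[0] == 'var':
        return v if t[1] == x else t
    if t[0] == 'lam':
        return t if t[1] == x else ('lam', t[1], subst(t[2], x, v))
    return ('app', subst(t[1], x, v), subst(t[2], x, v))

def lo_step(t):
    """One leftmost-outermost beta step, or None when t is in normal form."""
    if t[0] == 'app':
        f, a = t[1], t[2]
        if f[0] == 'lam':
            return subst(f[2], f[1], a)   # contract the outermost redex first
        s = lo_step(f)
        if s is not None:
            return ('app', s, a)
        s = lo_step(a)
        return None if s is None else ('app', f, s)
    if t[0] == 'lam':
        s = lo_step(t[2])
        return None if s is None else ('lam', t[1], s)
    return None

def normalize(t, fuel=1000):
    for _ in range(fuel):
        s = lo_step(t)
        if s is None:
            return t
        t = s
    raise RuntimeError("no normal form found within the step budget")

# (λx.y) Ω has normal form y, but Ω has an infinite reduction: only an
# outermost (needed) strategy is guaranteed to reach the normal form.
omega = ('app', ('lam', 'x', ('app', ('var', 'x'), ('var', 'x'))),
                ('lam', 'x', ('app', ('var', 'x'), ('var', 'x'))))
term = ('app', ('lam', 'x', ('var', 'y')), omega)
```

An innermost strategy would keep contracting Ω's redex inside the argument and never terminate on this term.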
The Lambda-Calculus with Multiplicities
, 1993
"... We introduce a refinement of the λcalculus, where the argument of a function is a bag of resources, that is a multiset of terms, whose multiplicities indicate how many copies of them are available. We show that this "λcalculus with multiplicities" has a natural functionality theory, similar to Cop ..."
Abstract

Cited by 17 (2 self)
 Add to MetaCart
We introduce a refinement of the λ-calculus, where the argument of a function is a bag of resources, that is, a multiset of terms whose multiplicities indicate how many copies of them are available. We show that this "λ-calculus with multiplicities" has a natural functionality theory, similar to Coppo and Dezani's intersection type discipline. In our functionality theory the conjunction is managed in a "multiplicative" manner, according to Girard's terminology. We show that this provides an adequate interpretation of the calculus, by establishing that a term is convergent if and only if it has a non-trivial functional character.
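A toy rendering, under loose assumptions, of the "bag of resources" idea: treat an application's argument as n copies of a term, and let a β-step succeed only when the bag holds enough copies for every occurrence of the bound variable. This is only an illustration, not the paper's formal system; all function names are made up.

```python
# Toy terms: ('var', x) | ('lam', x, body) | ('app', f, a) -- illustrative only.

def subst(t, x, v):
    if t[0] == 'var':
        return v if t[1] == x else t
    if t[0] == 'lam':
        return t if t[1] == x else ('lam', t[1], subst(t[2], x, v))
    return ('app', subst(t[1], x, v), subst(t[2], x, v))

def occurrences(t, x):
    """How many copies of the argument the body will consume."""
    if t[0] == 'var':
        return 1 if t[1] == x else 0
    if t[0] == 'lam':
        return 0 if t[1] == x else occurrences(t[2], x)
    return occurrences(t[1], x) + occurrences(t[2], x)

def apply_with_bag(lam, resource, multiplicity):
    """Beta step whose argument is a bag holding `multiplicity` copies of
    `resource`; returns None (stuck) when the bag is too small."""
    x, body = lam[1], lam[2]
    if occurrences(body, x) > multiplicity:
        return None  # not enough copies available: the application deadlocks
    return subst(body, x, resource)

dup = ('lam', 'x', ('app', ('var', 'x'), ('var', 'x')))  # uses its argument twice
```

With a bag of multiplicity 2 the application of `dup` goes through; with multiplicity 1 it is stuck, which is the resource-sensitivity the multiplicities record.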
A non-deterministic call-by-need lambda calculus
 International Conference on Functional Programming
, 1998
"... In this paper we present a nondeterministic callbyneed (untyped) lambda calculus nd with a constant choice and a letsyntax that models sharing. Our main result is that nd has the nice operational properties of the standard lambda calculus: confluence on sets of expressions, and normal order redu ..."
Abstract

Cited by 14 (7 self)
 Add to MetaCart
In this paper we present a non-deterministic call-by-need (untyped) lambda calculus nd with a constant choice and a let-syntax that models sharing. Our main result is that nd has the nice operational properties of the standard lambda calculus: confluence on sets of expressions, and normal-order reduction is sufficient to reach head normal form. Using a strong contextual equivalence we show the correctness of several program transformations, in particular of lambda-lifting using deterministic maximal free expressions. These results show that nd is a new and also natural combination of non-determinism and the lambda calculus, which offers many opportunities for parallel evaluation. An intended application of nd is as a foundation for compiling lazy functional programming languages with I/O based on direct calls. The set of correct program transformations can be rigorously distinguished from incorrect ones. All program transformations are permitted, with the slight exception that for transformations like common subexpression elimination and lambda-lifting with maximal free expressions the involved subexpressions have to be deterministic ones.
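The sharing that a let-syntax models can be sketched with memoised thunks: a shared binding is evaluated at most once, however often its value is used. This is a hypothetical Python sketch of the sharing aspect only, not of the non-deterministic calculus itself; the `Thunk` class and `expensive` function are invented for illustration.

```python
class Thunk:
    """A shared binding: computed at most once, then memoised (call-by-need)."""
    def __init__(self, compute):
        self.compute = compute
        self.forced = False
        self.value = None

    def force(self):
        if not self.forced:
            self.value = self.compute()
            self.forced = True
        return self.value

evaluations = []

def expensive():
    evaluations.append(1)  # record every real evaluation of the binding
    return 21

# let t = expensive() in t + t  -- the right-hand side runs only once
t = Thunk(expensive)
result = t.force() + t.force()
```

Under call-by-name the binding would be re-evaluated at each use; the memoisation above is exactly what makes duplicating a non-deterministic binding observable, which is why the calculus must restrict transformations like common subexpression elimination to deterministic subexpressions.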
Functional Computation as Concurrent Computation
, 1995
"... We investigate functional computation as a special form of concurrent computation. ..."
Abstract

Cited by 13 (3 self)
 Add to MetaCart
We investigate functional computation as a special form of concurrent computation.
Evaluation under λ-Abstraction
, 1996
"... In light of the usual definition of values [15] as terms in weak head normal form (WHNF), a abstraction is regarded as a value, and therefore no expressions under abstraction can get evaluated and the sharing of computation under has to be achieved through program transformations such as lifting ..."
Abstract

Cited by 4 (2 self)
 Add to MetaCart
In light of the usual definition of values [15] as terms in weak head normal form (WHNF), a λ-abstraction is regarded as a value, and therefore no expressions under λ-abstraction can get evaluated, and the sharing of computation under λ has to be achieved through program transformations such as λ-lifting and supercombinators. In this paper we generalise the notion of head normal form (HNF) and introduce the definition of generalised head normal form (GHNF). We then define values as terms in GHNF with flexible heads, and study a call-by-value λ-calculus λ_v^hd corresponding to this new notion of values. After establishing a version of the normalisation theorem in λ_v^hd, we construct an evaluation function eval_v^hd for λ_v^hd which evaluates under λ-abstraction. We prove that a program can be evaluated in λ_v^hd to a term in GHNF if and only if it can be evaluated in the usual λ-calculus to a term in HNF. We also present an operational semantics for λ_v^hd via a SECD machine. We argue that l...
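The contrast between stopping at WHNF and evaluating under a binder can be sketched with two step functions over toy tuple-encoded terms. This is an illustration of the general idea only, not the calculus of the paper; all names are invented.

```python
# Toy terms: ('var', x) | ('lam', x, body) | ('app', f, a) -- illustrative only.

def subst(t, x, v):
    if t[0] == 'var':
        return v if t[1] == x else t
    if t[0] == 'lam':
        return t if t[1] == x else ('lam', t[1], subst(t[2], x, v))
    return ('app', subst(t[1], x, v), subst(t[2], x, v))

def whnf_step(t):
    """Weak reduction: every abstraction counts as a value, so this
    never reduces under a binder."""
    if t[0] == 'app':
        f, a = t[1], t[2]
        if f[0] == 'lam':
            return subst(f[2], f[1], a)
        s = whnf_step(f)
        return None if s is None else ('app', s, a)
    return None

def strong_step(t):
    """Also reduces inside abstraction bodies and argument positions."""
    if t[0] == 'app' and t[1][0] == 'lam':
        return subst(t[1][2], t[1][1], t[2])
    if t[0] == 'lam':
        s = strong_step(t[2])
        return None if s is None else ('lam', t[1], s)
    if t[0] == 'app':
        s = strong_step(t[1])
        if s is not None:
            return ('app', s, t[2])
        s = strong_step(t[2])
        return None if s is None else ('app', t[1], s)
    return None

# λx.(λy.y) x is already a value under WHNF, yet holds a redex under the binder.
term = ('lam', 'x', ('app', ('lam', 'y', ('var', 'y')), ('var', 'x')))
```

`whnf_step` reports `term` finished, while `strong_step` still simplifies it to λx.x; sharing that residual work across all future calls is what evaluation under λ-abstraction buys.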
Relative Normalization in Stable Deterministic Residual Structures
 Z. Khasidashvili and J. Glauert
, 1996
"... This paper generalizes the Huet and L'evy theory of normalization by neededness to an abstract setting. We define Stable Deterministic Residual Structures (SDRS) and Deterministic Family Structures (DFS) by axiomatizing some properties of the residual relation and the family relation on redexes in a ..."
Abstract

Cited by 3 (3 self)
 Add to MetaCart
This paper generalizes the Huet and Lévy theory of normalization by neededness to an abstract setting. We define Stable Deterministic Residual Structures (SDRS) and Deterministic Family Structures (DFS) by axiomatizing some properties of the residual relation and the family relation on redexes in an Abstract Reduction System. We present two proofs of the Relative Normalization Theorem, one for SDRSs for regular stable sets, and another for DFSs for all stable sets of desirable 'normal forms'. We further prove the Relative Optimality Theorem for DFSs. We extend this result to deterministic Computation Structures, which are deterministic Prime Event Structures with an extra relation expressing (in)essentiality of events. A version of this paper appears in Proc. of CAAP'96 [GlKh96]. © J. Glauert & Z. Khasidashvili, UEA Norwich, 1996. Supported by the Engineering and Physical Sciences Research Council of Great Britain under grant GR/H 41300.

1 Introduction

A normalizable term, i...
A Partial Rehabilitation of Side-Effecting I/O: Non-Determinism in Non-Strict Functional Languages
, 1996
"... We investigate the extension of nonstrict functional languages like Haskell or Clean by a nondeterministic interaction with the external world. Using callbyneed and a natural semantics which describes the reduction of graphs, this can be done such that the ChurchRosser Theorems 1 and 2 hold. Ou ..."
Abstract
 Add to MetaCart
We investigate the extension of non-strict functional languages like Haskell or Clean by a non-deterministic interaction with the external world. Using call-by-need and a natural semantics which describes the reduction of graphs, this can be done such that the Church-Rosser Theorems 1 and 2 hold. Our operational semantics is a base to recognise which particular equivalences are preserved by program transformations. The amount of sequentialisation may be smaller than that enforced by other approaches, and the programming style is closer to the common one of side-effecting programming. However, not all program transformations used by an optimising compiler for Haskell remain correct in all contexts. Our result can be interpreted as a possibility to extend the current I/O mechanism by non-deterministic memoryless function calls. For example, this permits a call to a random number generator. Adding memoryless function calls to monadic I/O is possible and has a potential to extend the Haskell I...
Generalized λ-Calculi (Abstract)
"... ) Hongwei Xi Department of Mathematical Sciences, Carnegie Mellon University, Pittsburgh, PA 15213, USA email: hwxi@cs.cmu.edu Fax: +1 412 268 6380 We propose a notion of generalized calculi, which include the usual callbyname calculus, the usual callbyvalue calculus, and many other calc ..."
Abstract
 Add to MetaCart
Hongwei Xi, Department of Mathematical Sciences, Carnegie Mellon University, Pittsburgh, PA 15213, USA. Email: hwxi@cs.cmu.edu. Fax: +1 412 268 6380.

We propose a notion of generalized λ-calculi, which include the usual call-by-name λ-calculus, the usual call-by-value λ-calculus, and many other calculi such as the λg-calculus [3], the λ_v^hd-calculus [5], etc. We prove the Church-Rosser theorem and the standardization theorem for these generalized λ-calculi. The normalization theorem then follows, which enables us to define evaluation functions for the generalized λ-calculi. Our proof technique mainly builds on the notion of separating developments [4], yielding intuitive and clean inductive proofs. This work aims at providing a solid foundation for evaluation under λ-abstraction, a notion which is pervasive in both partial evaluation and run-time code generation for functional programming languages.

Definition 1. We use the following for terms and contexts: (terms) L, M, N ::= x | ...