Results 1–10 of 10
The origins of structural operational semantics
Journal of Logic and Algebraic Programming, 2004
Cited by 69 (0 self)
Abstract:
We review the origins of structural operational semantics. The main publication ‘A Structural Approach to Operational Semantics’, also known as the ‘Aarhus Notes’, appeared in 1981 [G.D. Plotkin, A structural approach to operational semantics, DAIMI FN-19, Computer Science Department, Aarhus University, 1981]. The development of the ideas dates back to the early 1970s, involving many people and building on previous work on programming languages and logic. The former included abstract syntax, the SECD machine, and the abstract interpreting machines of the Vienna school; the latter included the λ-calculus and formal systems. The initial development of structural operational semantics was for simple functional languages, more or less variations of the λ-calculus; after that the ideas were gradually extended to include languages with parallel features, such as Milner’s CCS. This experience set the ground for a more systematic exposition, the subject of an invited course of lectures at Aarhus University; some of these appeared in print as the 1981 Notes. We discuss the content of these lectures and some related considerations such as ‘small state’ versus ‘grand state’, structural versus compositional semantics, the influence of the Scott–Strachey approach to denotational semantics, the treatment of recursion and jumps, and static semantics. We next discuss relations with other work and some immediate further development. We conclude with an account of an old, previously unpublished, idea: an alternative, perhaps more readable, graphical presentation of systems of rules for operational semantics.
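As a quick illustration of the rule-based style the Notes introduced (not taken from the paper; the mini-language and all names below are invented for this example), small-step SOS rules for an addition-only expression language transcribe almost directly into code:

```python
# A hypothetical mini-language, used only to illustrate the SOS style:
# expressions are numerals or additions, and the transition relation is
# given by structural rules:
#
#   (left)   e1 -> e1'  implies  e1 + e2 -> e1' + e2
#   (right)  e2 -> e2'  implies  n + e2  -> n + e2'
#   (add)    n1 + n2 -> n        where n is the sum of n1 and n2
from dataclasses import dataclass

@dataclass
class Num:
    value: int

@dataclass
class Add:
    left: object
    right: object

def step(e):
    """One small-step transition, by structural recursion on the term."""
    if isinstance(e, Add):
        if isinstance(e.left, Num) and isinstance(e.right, Num):
            return Num(e.left.value + e.right.value)   # (add)
        if not isinstance(e.left, Num):
            return Add(step(e.left), e.right)          # (left)
        return Add(e.left, step(e.right))              # (right)
    raise ValueError("no transition: the term is already a value")

def evaluate(e):
    """Iterate the transition relation to a normal form."""
    while not isinstance(e, Num):
        e = step(e)
    return e.value
```

Evaluating `Add(Add(Num(1), Num(2)), Num(3))` first rewrites the left summand, mirroring how the (left) rule's premise drives the search for a redex inside the term's structure.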
A Syntactic Characterization of the Equality in some Models for the Lambda Calculus
J. London Math. Soc., 1976
Cited by 52 (0 self)
Abstract:
An equality relation on the terms of the λ-calculus is an equivalence relation closed under the (syntactical) operations of application and λ-abstraction. We may distinguish between syntactic and semantic ways of introducing equality relations. βη-equality is introduced syntactically; it is the least equality relation satisfying the
Descendants and Origins in Term Rewriting
Cited by 9 (1 self)
Abstract:
In this paper we treat various aspects of a notion that is central in term rewriting, namely that of descendants or residuals. We address both first-order term rewriting and λ-calculus, their finitary as well as their infinitary variants. A recurrent theme is the Parallel Moves Lemma. Next to the classical notion of descendant, we introduce an extended version, known as ‘origin tracking’. Origin tracking has many applications. Here it is employed to give new proofs of three classical theorems: the Genericity Lemma in λ-calculus, the theorem of Huet and Lévy on needed reductions in first-order term rewriting, and Berry’s Sequentiality Theorem in (infinitary) λ-calculus.
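To make the notion of descendants concrete (an illustrative sketch, not the paper's formalism; the rule and the term encoding are invented for the example), consider a system with the single duplicating rule d(x) → p(x, x). Contracting a d-redex gives every position inside the argument two descendants, positions disjoint from the redex survive unchanged, and the redex position itself leaves no descendant:

```python
# Hypothetical orthogonal rewrite system with one rule:  d(x) -> p(x, x).
# Terms are nested tuples ('symbol', child0, child1, ...); positions are
# tuples of 0-based child indices, () being the root.

def rewrite(t, q=()):
    """Contract the d-redex at position q."""
    if q == ():
        assert t[0] == 'd'
        x = t[1]
        return ('p', x, x)                       # the argument is duplicated
    i, rest = q[0], q[1:]
    return t[:i + 1] + (rewrite(t[i + 1], rest),) + t[i + 2:]

def descendants(u, q):
    """Descendants of position u after contracting the redex at q."""
    n = len(q)
    if u[:n] != q:          # u disjoint from (or above) q: u is unchanged
        return [u]
    if u == q:              # the contracted redex itself has no descendant
        return []
    v = u[n + 1:]           # u = q . 0 . v, a position inside the argument
    return [q + (0,) + v, q + (1,) + v]          # one copy in each branch
```

For instance, contracting the redex at position (0,) in f(d(a), a) sends the inner a to two descendants, one in each copy produced by p, while the outer a keeps its single descendant.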
Applying Universal Algebra to Lambda Calculus
2007
Cited by 3 (2 self)
Abstract:
The aim of this paper is twofold. On the one hand, we survey the knowledge acquired over the last ten years about the lattice of all λ-theories (= equational extensions of untyped λ-calculus) and the models of lambda calculus via universal algebra. This includes positive or negative answers to several questions raised in these years, as well as several independent results, the state of the art on the long-standing open questions concerning the representability of λ-theories as theories of models, and 26 open problems. On the other hand, contrary to common belief, we show that lambda calculus and combinatory logic satisfy interesting algebraic properties. In fact the Stone representation theorem for Boolean algebras can be generalized to combinatory algebras and λ-abstraction algebras. In every combinatory and λ-abstraction algebra there is a Boolean algebra of central elements (playing the role of idempotent elements in rings). Central elements are used to represent any combinatory and λ-abstraction algebra as a weak Boolean product of directly indecomposable algebras (i.e., algebras which cannot be decomposed as the Cartesian product of two other non-trivial algebras). Central elements are also used to provide applications of the representation theorem to lambda calculus. We show that the indecomposable semantics (i.e., the semantics of lambda calculus given in terms of models of lambda calculus which are directly indecomposable as combinatory algebras) includes the continuous, stable and strongly stable semantics, and the term models of all semi-sensible λ-theories. In one of the main results of the paper we show that the indecomposable semantics is equationally incomplete, and that this incompleteness is as wide as possible.
Usability: Formalising (un)definedness in Typed Lambda Calculus
Cited by 1 (0 self)
Abstract:
In this paper we discuss usability, and propose to take that notion as a formalisation of (un)definedness in typed lambda calculus, especially in calculi based on PCF. We discuss some important properties that make usability attractive as a formalisation of (un)definedness. There is a remarkable difference between usability and solvability: in the untyped lambda calculus the solvable terms are precisely the terms with a head normal form, whereas in typed lambda calculus the usable terms are "between" the terms with a normal form and the terms with a (weak) head normal form.

1 Introduction

The elementary form of undefinedness arises at the level of natural numbers, when the evaluation of a (closed) term M of type Nat does not terminate, i.e., when M does not have a normal form. Such a term is also often called meaningless. However, for higher types it is not so evident which terms should be called meaningless. Analogous to the situation for ground types, it is often felt to be attrac...
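A rough analogue of the distinction the abstract draws can be sketched in Python (illustrative only; `loop`, `const_loop` and `diverges` are invented names, and a stack overflow stands in for non-termination):

```python
import sys

# A diverging computation of base type: evaluating it never produces
# a result (in Python, it exhausts the call stack instead).
def loop():
    return loop()

# A term of higher type that is already a value, i.e. a (weak) head
# normal form, yet every attempt to *use* it by applying it diverges:
# a candidate for "meaningless" despite being in whnf.
const_loop = lambda x: loop()

def diverges(thunk, depth=500):
    """Crude divergence check: does forcing the thunk blow a small stack?"""
    old = sys.getrecursionlimit()
    sys.setrecursionlimit(depth)
    try:
        thunk()
        return False
    except RecursionError:
        return True
    finally:
        sys.setrecursionlimit(old)
```

Here `const_loop` needs no evaluation at all, yet it is unusable: any application of it diverges, which is the kind of term the paper's notion of usability is meant to classify.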
A natural interpretation of classical proofs
2002
Problem 19
Abstract:
A closed λ-term M is easy if, for any other closed term N, the lambda theory generated by M = N is consistent, while it is simple easy if, given an arbitrary intersection type τ, one can find a suitable preorder on types which allows one to derive τ for M. Simple easiness implies easiness. The question whether easiness implies simple easiness constitutes Problem 19 in the TLCA list of open problems. In this paper we answer the question negatively by providing a non-empty co-r.e. (complement of a recursively enumerable) set of easy, but not simple easy, λ-terms. Key words: Lambda calculus, easy terms, simple easy terms, filter models
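The classical example of an easy term is Ω = (λx.xx)(λx.xx), which β-reduces only to itself. A minimal sketch showing this with a one-step reducer (the term representation is invented for the example, and the naive substitution is safe here only because no variable capture can occur in Ω):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Var: name: str
@dataclass(frozen=True)
class Lam: var: str; body: object
@dataclass(frozen=True)
class App: fun: object; arg: object

def subst(t, x, s):
    """Naive substitution t[x := s]; no capture-avoidance (fine for Omega)."""
    if isinstance(t, Var):
        return s if t.name == x else t
    if isinstance(t, Lam):
        return t if t.var == x else Lam(t.var, subst(t.body, x, s))
    return App(subst(t.fun, x, s), subst(t.arg, x, s))

def step(t):
    """One leftmost beta step at the root, or None if there is no redex there."""
    if isinstance(t, App) and isinstance(t.fun, Lam):
        return subst(t.fun.body, t.fun.var, t.arg)
    return None

delta = Lam('x', App(Var('x'), Var('x')))
omega = App(delta, delta)   # beta-reduces to itself, forever
```

Since `step(omega)` returns `omega` again, Ω has no normal form and carries no information of its own, which is what makes equating it with an arbitrary closed term consistent.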
THE CONNECTION BETWEEN EQUIVALENCE OF PROOFS AND CARTESIAN CLOSED CATEGORIES
1974
Abstract:
There has recently been much interest shown in the connections between the two subjects of category theory and logic. This paper investigates one aspect of this, namely some of the simple connections between cartesian closed categories and the branch of logic known as proof theory.
Towards Böhm trees for lambda-value: the
Under consideration for publication in Math. Struct. in Comp. Science, 2012
Abstract:
The pure lambda calculus has a well-established ‘standard theory’ in which the notion of solvability characterises the operational relevance of terms. Solvable terms, defined as solutions to a beta-equation, have a ‘syntactic’ characterisation as terms with head normal form. Unsolvable terms are irrelevant and can be beta-equated without affecting consistency. The derived notions of sensibility and Böhm trees connect the consistent theory with models and with a representation of approximate normal forms. The lambda-value calculus is the calculus that corresponds to a strict functional programming language whose operational semantics is defined by the SECD machine. The beta-equational definition of solvability has been duly adapted to the pure lambda-value calculus, but the syntactic characterisation (value head normal forms and the ahead machine) involves beta reduction and not beta-value reduction. The v-unsolvable terms cannot be equated without affecting consistency, and some v-normal forms are v-unsolvable and have to be considered irrelevant. This has been ignored in the context of weak reduction (not going under lambda, an ingredient of call-by-value reduction as specified by the SECD machine) because of the existence of initial models
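The syntactic side of the standard theory is easy to check mechanically: a term is in head normal form exactly when a variable, rather than a redex, sits at the head of its spine. A minimal sketch (the term representation is invented for the example):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Var: name: str
@dataclass(frozen=True)
class Lam: var: str; body: object
@dataclass(frozen=True)
class App: fun: object; arg: object

def head(t):
    """Strip leading lambdas, then walk down the spine of applications."""
    while isinstance(t, Lam):
        t = t.body
    while isinstance(t, App):
        t = t.fun
    return t

def is_hnf(t):
    # Head normal form: lambda x1..xn. y M1..Mk, with a variable y at the head.
    return isinstance(head(t), Var)

identity = Lam('x', Var('x'))
delta = Lam('x', App(Var('x'), Var('x')))
omega = App(delta, delta)     # head position holds a redex: not in hnf
```

Note that a term such as λx. x Ω is in head normal form (hence solvable in the standard theory) even though one of its arguments diverges; only the head position matters.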