Results 1–10 of 140
A New Deconstructive Logic: Linear Logic
, 1995
"... The main concern of this paper is the design of a noetherian and confluent normalization for LK 2 (that is, classical second order predicate logic presented as a sequent calculus). The method we present is powerful: since it allows us to recover as fragments formalisms as seemingly different a ..."
Abstract

Cited by 127 (11 self)
The main concern of this paper is the design of a noetherian and confluent normalization for LK2 (that is, classical second order predicate logic presented as a sequent calculus). The method we present is powerful: since it allows us to recover as fragments formalisms as seemingly different as Girard's LC and Parigot's λµ, FD ([9, 11, 27, 31]), delineates other viable systems as well, and gives means to extend the Krivine/Leivant paradigm of `programming with proofs' ([22, 23]) to classical logic; it is painless: since we reduce strong normalization and confluence to the same properties for linear logic (for non-additive proof nets, to be precise) using appropriate embeddings (so-called decorations); it is unifying: it organizes known solutions in a simple pattern that makes apparent the how and why of their making. A comparison of our method to that of embedding LK into LJ (intuitionistic sequent calculus) brings to the fore the latter's defects for these `deconstructi...
Strong normalization and typability with intersection types
 Notre Dame Journal of Formal Logic
, 1996
"... Abstract A simple proof is given of the property that the set of strongly normalizing lambda terms coincides with the set of lambda terms typable in certain intersection type assignment systems. 1Introduction Intersection type assignment systems were introduced and developed in the 1980s by Barendre ..."
Abstract

Cited by 26 (9 self)
A simple proof is given of the property that the set of strongly normalizing lambda terms coincides with the set of lambda terms typable in certain intersection type assignment systems. 1 Introduction. Intersection type assignment systems were introduced and developed in the 1980s by Barendregt, Coppo, Dezani-Ciancaglini and Venneri (see [2], [3], and [4]). They are meant to be extensions of Curry's basic functional theory which will provide types for a larger class of lambda terms. On the one hand this aim was fulfilled, and on the other hand they became of interest for their other properties as well. We shall deal with four intersection type assignment systems: the original ones D and DΩ introduced in [3] and [4], and their extensions D≤ and DΩ≤ with the rule (≤), which involves a partial ordering on types. The problem of typability in a type system is whether there is a type for a given term. The problem of typability in the full intersection type assignment system DΩ≤ is trivial, since every lambda term is typable by the type ω. For the same reasons typability in DΩ is trivial as well. This property changes essentially when the (ω)-rule is left out. It turns out that all strongly normalizing lambda terms are typable in D≤ and D, and they are the only terms typable in these systems (see Krivine [9] and van Bakel [15]). The idea that strongly normalizing lambda terms are exactly the terms typable in the intersection type assignment systems without the (ω)-rule first appeared in [4], Pottinger [11], and Leivant [10]. Further, this subject is treated in [15], [9], and Ronchi della Rocca et al. [12], with different approaches. We shall present a modified proof of this property and compare it with the proofs mentioned above. Section 2 is an overview of the systems considered. In Section 3 we shall present a proof à la Tait of strong normalization for D and D≤ based on the proof of strong ...
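A concrete instance of the characterization (an illustrative example, not drawn from the paper; σ and τ stand for arbitrary types): the strongly normalizing self-application term receives an intersection type, while the looping term Ω is typable only via ω:

```latex
% λx.xx is strongly normalizing, hence typable in D via an intersection:
\lambda x.\,x\,x \;:\; ((\sigma \to \tau) \cap \sigma) \to \tau
% Ω has no normal form, hence is untypable in D;
% with the (ω)-rule it is typable, but only by ω:
\Omega \;=\; (\lambda x.\,x\,x)\,(\lambda x.\,x\,x) \;:\; \omega
```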
On The Algebraic Models Of Lambda Calculus
 Theoretical Computer Science
, 1997
"... . The variety (equational class) of lambda abstraction algebras was introduced to algebraize the untyped lambda calculus in the same way Boolean algebras algebraize the classical propositional calculus. The equational theory of lambda abstraction algebras is intended as an alternative to combinatory ..."
Abstract

Cited by 24 (11 self)
The variety (equational class) of lambda abstraction algebras was introduced to algebraize the untyped lambda calculus in the same way Boolean algebras algebraize the classical propositional calculus. The equational theory of lambda abstraction algebras is intended as an alternative to combinatory logic in this regard, since it is a first-order algebraic description of the lambda calculus which allows one to keep the lambda notation and hence all the functional intuitions. In this paper we show that the lattice of the subvarieties of lambda abstraction algebras is isomorphic to the lattice of lambda theories of the lambda calculus; for every variety of lambda abstraction algebras there exists exactly one lambda theory whose term algebra generates the variety. For example, the variety generated by the term algebra of the minimal lambda theory is the variety of all lambda abstraction algebras. This result is applied to obtain a generalization of the genericity lemma of finitary lambda calculus...
A General Storage Theorem for Integers in Call-By-Name λ-Calculus
, 1993
"... The notion of storage operator introduced in [5, 6] appears to be an important tool in the study of data types in second order λcalculus. These operators are λterms which simulate callbyvalue in the callbyname strategy, and they can be used in order to modelize assignment instructions. The mai ..."
Abstract

Cited by 23 (5 self)
The notion of storage operator introduced in [5, 6] appears to be an important tool in the study of data types in second order λ-calculus. These operators are λ-terms which simulate call-by-value in the call-by-name strategy, and they can be used in order to model assignment instructions. The main result about storage operators is that there is a very simple second order type for them, using Gödel's "not-not translation" of classical into intuitionistic logic. We give here a new and simpler proof of a strengthened version of this theorem, which contains all previous results in intuitionistic and in classical logic ([6, 7]), and gives rise to new "storage theorems". Moreover, this result has a simple and intuitive meaning, in terms of realizability.
Intersection types for explicit substitutions
, 2003
"... We present a new system of intersection types for a compositionfree calculus of explicit substitutions with a rule for garbage collection, and show that it characterizes those terms which are strongly normalizing. This system extends previous work on the natural generalization of the classical inte ..."
Abstract

Cited by 22 (8 self)
We present a new system of intersection types for a composition-free calculus of explicit substitutions with a rule for garbage collection, and show that it characterizes those terms which are strongly normalizing. This system extends previous work on the natural generalization of the classical intersection types system, which characterized head normalization and weak normalization, but was not complete for strong normalization. An important role is played by the notion of available variable in a term, which is a generalization of the classical notion of free variable.
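To fix intuitions about what a composition-free calculus of explicit substitutions with garbage collection looks like, here is a minimal sketch. This is an illustrative toy calculus, not the paper's exact system: the constructors, rule names, and Python encoding are all assumptions made for the example.

```python
# Terms of a toy composition-free explicit-substitution calculus, as tuples:
#   ('var', x) | ('lam', x, t) | ('app', t, u) | ('sub', t, x, u)   # t[x := u]

def free_vars(t):
    tag = t[0]
    if tag == 'var':
        return {t[1]}
    if tag == 'lam':
        return free_vars(t[2]) - {t[1]}
    if tag == 'app':
        return free_vars(t[1]) | free_vars(t[2])
    # ('sub', body, x, u): the substitution only contributes fv(u) if x occurs
    x, fv = t[2], free_vars(t[1])
    return (fv - {x}) | free_vars(t[3]) if x in fv else fv

def step(t):
    """One reduction step at the root: Beta creates an explicit substitution;
    the other rules propagate it; (gc) discards it when the variable is unused."""
    tag = t[0]
    if tag == 'app' and t[1][0] == 'lam':        # (B): (λx.t) u → t[x := u]
        return ('sub', t[1][2], t[1][1], t[2])
    if tag == 'sub':
        body, x, u = t[1], t[2], t[3]
        if x not in free_vars(body):             # (gc): t[x := u] → t
            return body
        if body[0] == 'var' and body[1] == x:    # (var): x[x := u] → u
            return u
        if body[0] == 'app':                     # (app): push under application
            return ('app', ('sub', body[1], x, u), ('sub', body[2], x, u))
        if body[0] == 'lam':                     # (lam): push under the binder
            return ('lam', body[1], ('sub', body[2], x, u))  # assumes no capture
    return t

# (λx.λy.x) z  →  (λy.x)[x := z]  →  λy. x[x := z]  →  λy. z
t = ('app', ('lam', 'x', ('lam', 'y', ('var', 'x'))), ('var', 'z'))
t = step(t)                       # Beta
t = step(t)                       # (lam)
t = ('lam', t[1], step(t[2]))     # (var), applied under the binder
assert t == ('lam', 'y', ('var', 'z'))
```

Note that (gc) fires before the substitution is ever propagated into a subterm where its variable does not occur, which is exactly the kind of early discarding the abstract's garbage-collection rule is about.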
Compositional Characterizations of λ-terms using Intersection Types (Extended Abstract)
, 2000
"... We show how to characterize compositionally a number of evaluation properties of λterms using Intersection Type assignment systems. In particular, we focus on termination properties, such as strong normalization, normalization, head normalization, and weak head normalization. We consider also the ..."
Abstract

Cited by 19 (5 self)
We show how to characterize compositionally a number of evaluation properties of λ-terms using Intersection Type assignment systems. In particular, we focus on termination properties, such as strong normalization, normalization, head normalization, and weak head normalization. We consider also the persistent versions of such notions. By way of example, we consider also another evaluation property, unrelated to termination, namely reducibility to a closed term. Many of these characterization results are new, to our knowledge, or else they streamline, strengthen, or generalize earlier results in the literature. The completeness parts of the characterizations are proved uniformly for all the properties, using a set-theoretical semantics of intersection types over suitable kinds of stable sets. This technique generalizes Krivine's and Mitchell's methods for strong normalization to other evaluation properties.
A CPS-Translation of the λµ-Calculus
, 1994
"... We present a translation of Parigot's λµcalculus [10] into the usual λcalculus. This translation, which is based on the socalled continuation passing style, is correct with respect to equality and with respect to evaluation. At the type level, it induces a logical interpretation of classica ..."
Abstract

Cited by 19 (1 self)
We present a translation of Parigot's λµ-calculus [10] into the usual λ-calculus. This translation, which is based on the so-called continuation passing style, is correct with respect to equality and with respect to evaluation. At the type level, it induces a logical interpretation of classical logic into intuitionistic logic, akin to Kolmogorov's negative translation. As a by-product, we get the normalization of second order typed λµ-calculus.
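To make the continuation-passing-style idea concrete, here is a minimal sketch of a Plotkin-style call-by-name CPS transform on the pure λ-calculus. This is not the paper's translation of λµ (which must also handle the µ-binder and named terms); the term encoding and the fresh-name scheme are assumptions made for the example, and we assume source terms do not already use names like k0 or m0.

```python
# Terms: ('var', x) | ('lam', x, t) | ('app', t, u)

def cps(t, n=0):
    """Call-by-name CPS:  x* = x,
       (λx.t)* = λk. k (λx. t*),
       (t u)*  = λk. t* (λm. m u* k).
    The depth counter n generates fresh continuation variables k{n}, m{n}."""
    tag = t[0]
    if tag == 'var':
        return t
    k, m = 'k%d' % n, 'm%d' % n
    if tag == 'lam':
        return ('lam', k, ('app', ('var', k), ('lam', t[1], cps(t[2], n + 1))))
    # application: run the function, then pass it the (unevaluated) argument
    return ('lam', k,
            ('app', cps(t[1], n + 1),
             ('lam', m, ('app', ('app', ('var', m), cps(t[2], n + 1)),
                         ('var', k)))))

# Identity translates to λk0. k0 (λx. x)
assert cps(('lam', 'x', ('var', 'x'))) == \
    ('lam', 'k0', ('app', ('var', 'k0'), ('lam', 'x', ('var', 'x'))))
```

Under a call-by-name reading, a variable stands for a suspended computation, which is why the variable case needs no wrapping; this is the structural reason such translations validate equality and evaluation at once.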
The Algebraic Lambda-Calculus
 UNDER CONSIDERATION FOR PUBLICATION IN MATH. STRUCT. IN COMP. SCIENCE
, 2009
"... We introduce an extension of the pure lambdacalculus by endowing the set of terms with a structure of vector space, or more generally of module, over a fixed set of scalars. Terms are moreover subject to identities similar to usual pointwise definition of linear combinations of functions with value ..."
Abstract

Cited by 18 (2 self)
We introduce an extension of the pure lambda-calculus by endowing the set of terms with a structure of vector space, or more generally of module, over a fixed set of scalars. Terms are moreover subject to identities similar to the usual pointwise definition of linear combinations of functions with values in a vector space. We then study a natural extension of beta-reduction in this setting: we prove it is confluent, then discuss consistency and conservativity over the ordinary lambda-calculus. We also provide normalization results for a simple type system.
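The module structure on terms can be illustrated with formal linear combinations: a dictionary mapping base terms to scalar coefficients, where equal base terms merge and zero coefficients vanish. This is only a sketch of the vector-space identities the abstract mentions, not the paper's formal system; the helper names are invented for the example.

```python
from collections import defaultdict

def combo(*pairs):
    """Build a formal linear combination from (coefficient, base_term) pairs,
    merging repeated base terms and dropping zero coefficients."""
    acc = defaultdict(int)
    for coeff, term in pairs:
        acc[term] += coeff
    return {t: c for t, c in acc.items() if c != 0}

def add(u, v):
    """Sum of two combinations (pointwise on coefficients)."""
    return combo(*[(c, t) for t, c in u.items()],
                 *[(c, t) for t, c in v.items()])

def scale(a, u):
    """Scalar multiple of a combination."""
    return combo(*[(a * c, t) for t, c in u.items()])

# Module identities: 2·x + 3·x = 5·x, and x + (-1)·x = 0 (the empty combination)
x = combo((1, 'x'))
assert add(scale(2, x), scale(3, x)) == {'x': 5}
assert add(x, scale(-1, x)) == {}
```

The second assertion shows why such extensions raise the consistency questions the abstract discusses: once terms can cancel to the zero combination, one must check that beta-reduction interacts safely with the algebraic identities.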
The maximality of the typed lambda calculus and of cartesian closed categories
 Publ. Inst. Math. (N.S.)
"... From the analogue of Böhm’s Theorem proved for the typed lambda calculus, without product types and with them, it is inferred that every cartesian closed category that satisfies an equality between arrows not satisfied in free cartesian closed categories must be a preorder. A new proof is given here ..."
Abstract

Cited by 17 (2 self)
From the analogue of Böhm’s Theorem proved for the typed lambda calculus, without product types and with them, it is inferred that every cartesian closed category that satisfies an equality between arrows not satisfied in free cartesian closed categories must be a preorder. A new proof is given here of these results, which were obtained previously by Richard Statman and Alex K. Simpson.