Results 1 - 7 of 7
Complexity of strongly normalising λ-terms via non-idempotent intersection types
Abstract

Cited by 7 (1 self)
We present a typing system for the λ-calculus, with non-idempotent intersection types. As is the case in (some) systems with idempotent intersections, a λ-term is typable if and only if it is strongly normalising. Non-idempotency brings some further information into typing trees, such as a bound on the longest β-reduction sequence reducing a term to its normal form. We actually present these results in Klop’s extension of λ-calculus, where the bound that is read in the typing tree of a term is refined into an exact measure of the longest reduction sequence. This complexity result is, for longest reduction sequences, the counterpart of de Carvalho’s result for linear head-reduction sequences.
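The quantity bounded by the non-idempotent typing trees — the length of the longest β-reduction sequence of a strongly normalising term — can be computed directly, if inefficiently, by brute force. The following is a minimal sketch of that measure (the representation and helper names are ours, not the paper's; substitution is naive and assumes no variable capture):

```python
from dataclasses import dataclass

# A bare-bones untyped λ-calculus with named variables.
@dataclass(frozen=True)
class Var: name: str
@dataclass(frozen=True)
class Lam: param: str; body: object
@dataclass(frozen=True)
class App: fun: object; arg: object

def subst(t, x, s):
    """Substitute s for x in t (naive: assumes no variable capture)."""
    if isinstance(t, Var):
        return s if t.name == x else t
    if isinstance(t, Lam):
        return t if t.param == x else Lam(t.param, subst(t.body, x, s))
    return App(subst(t.fun, x, s), subst(t.arg, x, s))

def reducts(t):
    """All terms reachable from t by contracting one β-redex."""
    out = []
    if isinstance(t, App):
        if isinstance(t.fun, Lam):                      # outer redex (λx.M) N
            out.append(subst(t.fun.body, t.fun.param, t.arg))
        out += [App(f, t.arg) for f in reducts(t.fun)]  # redexes in the function
        out += [App(t.fun, a) for a in reducts(t.arg)]  # redexes in the argument
    elif isinstance(t, Lam):
        out += [Lam(t.param, b) for b in reducts(t.body)]
    return out

def longest_reduction(t):
    """Length of the longest β-reduction sequence; terminates only if t is SN."""
    return max((1 + longest_reduction(r) for r in reducts(t)), default=0)

I = Lam('y', Var('y'))                                  # identity λy.y
t = App(Lam('x', App(Var('x'), Var('x'))), I)           # (λx. x x)(λy. y)
print(longest_reduction(t))                             # → 2: (λx.xx) I → I I → I
```

Note that the longest sequence can be strictly longer than the leftmost one: for K = λx.z, the term K (I I) normalises in one step by contracting the outer redex, but in two if the argument is reduced first, so `longest_reduction` returns 2.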
Strong Normalisation of Cut-Elimination that Simulates β-Reduction
Abstract

Cited by 2 (1 self)
This paper is concerned with strong normalisation of cut-elimination for a standard intuitionistic sequent calculus. The cut-elimination procedure is based on a rewrite system for proof-terms with cut-permutation rules allowing the simulation of β-reduction. Strong normalisation of the typed terms is inferred from that of the simply-typed λ-calculus, using the notions of safe and minimal reductions as well as a simulation in Nederpelt-Klop’s λI-calculus. It is also shown that the type-free terms enjoy the preservation of strong normalisation (PSN) property with respect to β-reduction in an isomorphic image of the type-free λ-calculus.
Towards a judgmental reconstruction of logical relation proofs
, 2006
Abstract

Cited by 1 (0 self)
Abstract. Tait’s method (a.k.a. proof by logical relations) is a powerful proof technique frequently used for showing foundational properties of languages based on typed lambda-calculi. Historically, these proofs have been difficult to formalize in proof assistants with weak meta-logics, such as Twelf. Logical relations are notoriously difficult to define judgmentally. In this paper, we present and discuss a Twelf proof of weak normalization for System F making use of higher-order encodings. We exhibit a modular technique for formalizing proofs of this kind, and make explicit all logical principles that one needs to trust in order to believe in the proof.
Delayed substitutions
José Espírito Santo
Abstract
Abstract. This paper investigates an approach to substitution alternative to the implicit treatment of the λ-calculus and the explicit treatment of explicit substitution calculi. In this approach, substitutions are delayed (but not executed) explicitly. We implement this idea with two calculi, one where substitution is a primitive construction of the calculus, the other where a substitution is represented by a β-redex. For both calculi, confluence and (preservation of) strong normalisation are proved (the latter fails for a related system due to Revesz, as we show). Applications of delayed substitutions are of a theoretical nature. The strong normalisation result implies strong normalisation for other calculi, such as the computational lambda-calculus, lambda-calculi with generalised applications, or calculi of cut-elimination for sequent calculus. We investigate the computational interpretation of cut-elimination in terms of the generation, execution, and delaying of substitutions, paying particular attention to how generalised applications improve this interpretation.
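The idea behind the second calculus — representing a substitution by a β-redex rather than executing it — can be sketched in a few lines. In this illustration (the representation and helper names are ours, not the paper's), the delayed substitution M[x:=N] is encoded as the redex (λx.M) N, and "executing" it means contracting that redex:

```python
from dataclasses import dataclass

# A bare-bones untyped λ-calculus with named variables.
@dataclass(frozen=True)
class Var: name: str
@dataclass(frozen=True)
class Lam: param: str; body: object
@dataclass(frozen=True)
class App: fun: object; arg: object

def subst(t, x, s):
    """Naive substitution (assumes no variable capture)."""
    if isinstance(t, Var):
        return s if t.name == x else t
    if isinstance(t, Lam):
        return t if t.param == x else Lam(t.param, subst(t.body, x, s))
    return App(subst(t.fun, x, s), subst(t.arg, x, s))

def delay(m, x, n):
    """Encode the delayed substitution M[x:=N] as the β-redex (λx.M) N."""
    return App(Lam(x, m), n)

def execute(t):
    """Execute a delayed substitution by contracting its outer redex."""
    assert isinstance(t, App) and isinstance(t.fun, Lam), "not a delayed substitution"
    return subst(t.fun.body, t.fun.param, t.arg)

# (x x)[x := λy.y] is kept as the redex (λx. x x)(λy. y) until executed,
# at which point it becomes (λy.y)(λy.y).
delayed = delay(App(Var('x'), Var('x')), 'x', Lam('y', Var('y')))
result = execute(delayed)
```

The point of the encoding is that the substitution sits inertly in the term until a reduction strategy chooses to contract it, which is exactly what distinguishes delaying from executing.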
Weak and Strong Normalization, K-redexes, and First-Order Logic
, 1999
Abstract
Avoiding infinite loops is one of the obstacles most computer scientists must fight. Therefore the study of infinite loops and their opposite, termination, is important for our understanding of programs. This thesis studies certain aspects of these notions in the λ-calculus. Being the foundation of many modern functional languages, it is a good theoretical framework for the study. The notions studied in this thesis are also of interest due to the correspondence (through the Curry-Howard isomorphism) to proof normalisation in the mathematical field of proof theory. Amongst other applications, proof normalisation is useful, if not crucial, for proofs of consistency, i.e., that only true sentences can be derived. In the λ-calculus, a program is called a term, the single steps used to evaluate a term are called reductions, and a term that cannot be reduced is said to be in a normal form. The latter corresponds to a value. This thesis begins with a study of the notions of conservation and uniform normalisation. Conservation means that infinite reduction paths, i.e., non-terminating ways of reducing the term, are preserved under reduction. Uniform normalisation means that the term can either not be reduced to a normal form,
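The vocabulary in the abstract above — terms, reductions, normal forms — can be made concrete with a small check (the representation is ours, not the thesis's): a term is in normal form precisely when it contains no β-redex (λx.M) N anywhere inside it.

```python
from dataclasses import dataclass

# A bare-bones untyped λ-calculus with named variables.
@dataclass(frozen=True)
class Var: name: str
@dataclass(frozen=True)
class Lam: param: str; body: object
@dataclass(frozen=True)
class App: fun: object; arg: object

def is_normal_form(t):
    """A term is in normal form iff it contains no β-redex (λx.M) N."""
    if isinstance(t, App):
        return (not isinstance(t.fun, Lam)      # the application itself is not a redex
                and is_normal_form(t.fun)
                and is_normal_form(t.arg))
    if isinstance(t, Lam):
        return is_normal_form(t.body)
    return True                                 # a variable is always normal

I = Lam('y', Var('y'))
print(is_normal_form(I))          # True: λy.y is a value
print(is_normal_form(App(I, I)))  # False: (λy.y)(λy.y) still has a redex
```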