Results 1 – 3 of 3
A feasible algorithm for typing in elementary affine logic
In Proceedings of the 8th International Conference on Typed Lambda Calculi and Applications, 2005
Abstract

Cited by 8 (2 self)
We give a new type inference algorithm for typing lambda-terms in Elementary Affine Logic (EAL), which is motivated by applications to complexity and optimal reduction. Following previous references on this topic, the variant of the EAL type system we consider (denoted EAL⋆) is a variant without sharing and without polymorphism. Our algorithm improves over the ones already known in that it offers a better complexity bound: if a simple type derivation for the term t is given, our algorithm performs EAL⋆ type inference in polynomial time.
Complexity of strongly normalising λ-terms via non-idempotent intersection types
Abstract

Cited by 2 (0 self)
We present a typing system for the λ-calculus, with non-idempotent intersection types. As is the case in (some) systems with idempotent intersections, a λ-term is typable if and only if it is strongly normalising. Non-idempotency brings some further information into typing trees, such as a bound on the longest β-reduction sequence reducing a term to its normal form. We actually present these results in Klop’s extension of the λ-calculus, where the bound that is read in the typing tree of a term is refined into an exact measure of the longest reduction sequence. This complexity result is, for longest reduction sequences, the counterpart of de Carvalho’s result for linear head-reduction sequences.
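The quantity bounded by the typing trees above is the length of the longest β-reduction sequence. As a minimal illustration (not taken from the paper, and not an implementation of the type system itself), the following sketch counts β-steps under normal-order reduction for a small λ-term; all class and function names are hypothetical:

```python
from dataclasses import dataclass

# Untyped λ-calculus syntax: variables, abstractions, applications.
@dataclass(frozen=True)
class Var:
    name: str

@dataclass(frozen=True)
class Lam:
    param: str
    body: object

@dataclass(frozen=True)
class App:
    fun: object
    arg: object

def subst(t, x, s):
    # Naive substitution; adequate here because the example reuses no bound names.
    if isinstance(t, Var):
        return s if t.name == x else t
    if isinstance(t, Lam):
        return t if t.param == x else Lam(t.param, subst(t.body, x, s))
    return App(subst(t.fun, x, s), subst(t.arg, x, s))

def step(t):
    # One normal-order (leftmost-outermost) β-step; returns None if t is normal.
    if isinstance(t, App):
        if isinstance(t.fun, Lam):
            return subst(t.fun.body, t.fun.param, t.arg)
        f = step(t.fun)
        if f is not None:
            return App(f, t.arg)
        a = step(t.arg)
        if a is not None:
            return App(t.fun, a)
    if isinstance(t, Lam):
        b = step(t.body)
        if b is not None:
            return Lam(t.param, b)
    return None

def normalise(t, fuel=1000):
    # Reduce to normal form, counting β-steps (fuel guards against divergence).
    steps = 0
    while steps < fuel and (n := step(t)) is not None:
        t, steps = n, steps + 1
    return t, steps

# (λx. x x)(λy. y) normalises to λy. y in 2 β-steps.
identity = Lam("y", Var("y"))
term = App(Lam("x", App(Var("x"), Var("x"))), identity)
nf, n = normalise(term)
print(n)  # 2
```

A non-idempotent intersection typing of `term` would yield a measure of at least this step count directly from the size of its typing tree, without running the reduction.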
Filter models: non-idempotent intersection types, orthogonality and polymorphism
Abstract

Cited by 1 (1 self)
This paper revisits models of the typed λ-calculus based on filters of intersection types: by using non-idempotent intersections, we simplify a methodology that produces modular proofs of strong normalisation based on filter models. Non-idempotent intersections provide a decreasing measure proving a key termination property, simpler than the reducibility techniques used with idempotent intersections. Such filter models are shown to be captured by orthogonality techniques: we formalise an abstract notion of orthogonality model inspired by classical realisability, and express a filter model as one of its instances, along with two term-models (one of which captures a now common technique for strong normalisation). Applying the above range of model constructions to Curry-style System F describes at different levels of detail how the infinite polymorphism of System F can systematically be reduced to the finite polymorphism of intersection types.