Results 1 – 8 of 8
Relating Typability and Expressiveness in Finite-Rank Intersection Type Systems (Extended Abstract)
 In Proc. 1999 Int’l Conf. Functional Programming
, 1999
Abstract

Cited by 21 (9 self)
We investigate finite-rank intersection type systems, analyzing the complexity of their type inference problems and their relation to the problem of recognizing semantically equivalent terms. Intersection types allow something of type T1 /\ T2 to be used in some places at type T1 and in other places at type T2. A finite-rank intersection type system bounds how deeply the /\ can appear in type expressions. Such type systems enjoy strong normalization, subject reduction, and computable type inference, and they support a pragmatics for implementing parametric polymorphism. As a consequence, they provide a conceptually simple and tractable alternative to the impredicative polymorphism of System F and its extensions, while typing many more programs than the Hindley-Milner type system found in ML and Haskell. While type inference is computable at every rank, we show that its complexity grows exponentially as rank increases. Let K(0, n) = n and K(t + 1, n) = 2^K(t, n); we prove that recognizing the pure λ-terms of size n that are typable at rank k is complete for DTIME[K(k - 1, n)]. We then consider the problem of deciding whether two λ-terms typable at rank k have the same normal form, generalizing a well-known result of Statman from simple types to finite-rank intersection types. ...
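The complexity bound in this abstract is governed by the iterated exponential K. A quick sketch of its growth (illustration only; the definition of K follows the abstract):

```python
def K(t, n):
    """Iterated exponential from the abstract: K(0, n) = n, K(t + 1, n) = 2^K(t, n)."""
    return n if t == 0 else 2 ** K(t - 1, n)

# Rank-k typability is DTIME[K(k - 1, n)]-complete, so the bound towers with rank:
print(K(0, 10))  # 10
print(K(1, 10))  # 1024
print(K(2, 3))   # 256, i.e. 2^(2^3)
```

Already at K(3, n) the values are astronomically large for any interesting n, which is the sense in which inference cost "grows exponentially as rank increases".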
Cut Rules and Explicit Substitutions
, 2000
Abstract

Cited by 15 (0 self)
Since this paper deals exclusively with intuitionistic logic (in fact, only the implicative fragment), we require succedents to be a single consequent formula. Natural deduction systems, which we choose to call N-systems, are symbolic logics generally given via introduction and elimination rules for the logical connectives which operate on the right, i.e., they manipulate the succedent formula. Examples are Gentzen's NJ and NK (Gentzen 1935). Logical deduction systems are given via left-introduction and right-introduction rules for the logical connectives. Although others have called these systems "sequent calculi", we call them L-systems to avoid confusion with other systems given in sequent style. Examples are Gentzen's LK and LJ (Gentzen 1935). In this paper we are primarily interested in L-systems. The advantage of N-systems is that they seem closer to actual reasoning, while L-systems, on the other hand, seem to have an easier proof theory. L-systems are often extended with a "cut" rule as part of showing that, for a given L-system and N-system, the derivations of each system can be encoded in the other. For example, NK proves the same as LK + cut (Gentzen 1935). Proof Normalization. A system is consistent when it is impossible to prove false, i.e., to derive absurdity from zero assumptions. A system is analytic (has the analyticity property) when there is an effective method to decompose any conclusion sequent into simpler premise sequents from which the conclusion can be obtained by some rule in the system, such that the conclusion is derivable iff the premises are derivable (Mäenpää 1993). To achieve the goals of consistency and analyticity, it has been customary to consider
Calculi of Generalised βReduction and Explicit Substitutions: The TypeFree and Simply Typed Versions
, 1998
Abstract

Cited by 14 (7 self)
Extending the λ-calculus with either explicit substitution or generalized reduction has been the subject of extensive research recently, and still has many open problems. This paper is the first investigation into the properties of a calculus combining both generalized reduction and explicit substitutions. We present a calculus, λgs, that combines a calculus of explicit substitution, λs, and a calculus with generalized reduction, λg. We believe that λgs is a useful extension of the λ-calculus, because it allows postponement of work in two different but complementary ways. Moreover, λgs (and also λs) satisfies properties desirable for calculi of explicit substitutions and generalized reductions. In particular, we show that λgs preserves strong normalization, is a conservative extension of λg, and simulates β-reduction of λg and the classical λ-calculus. Furthermore, we study the simply typed versions of λs and λgs, and show that well-typed terms are strongly normalizing and that other properties,...
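The idea of postponing substitution work can be sketched with a toy calculus of explicit substitutions. This is only an illustration under our own dataclass encoding, not the calculus defined in the paper: a β-step merely creates a substitution node, and separate rules push it inward one layer at a time.

```python
from dataclasses import dataclass

@dataclass
class Var: name: str
@dataclass
class Lam: var: str; body: object
@dataclass
class App: fun: object; arg: object
@dataclass
class Sub: body: object; var: str; arg: object   # explicit substitution node: body[var := arg]

def step(t):
    """One reduction step: β creates a Sub node; σ-rules push Sub inward one layer."""
    if isinstance(t, App) and isinstance(t.fun, Lam):        # (λx. b) a → b[x := a]
        return Sub(t.fun.body, t.fun.var, t.arg)
    if isinstance(t, Sub):
        b = t.body
        if isinstance(b, Var):                               # x[x := a] → a,  y[x := a] → y
            return t.arg if b.name == t.var else b
        if isinstance(b, App):                               # (f u)[s] → (f[s]) (u[s])
            return App(Sub(b.fun, t.var, t.arg), Sub(b.arg, t.var, t.arg))
        if isinstance(b, Lam) and b.var != t.var:            # (λy. b)[s] → λy. (b[s]); capture ignored in this sketch
            return Lam(b.var, Sub(b.body, t.var, t.arg))
    return t

# (λx. x) y first becomes x[x := y]; only a later, separate step turns that into y:
t = step(App(Lam("x", Var("x")), Var("y")))
print(t)        # Sub(body=Var(name='x'), var='x', arg=Var(name='y'))
print(step(t))  # Var(name='y')
```

Splitting β into these small steps is what lets an implementation delay, interleave, or share substitution work instead of performing it all at once.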
Higher-Order Families
 In International Conference on Rewriting Techniques and Applications '96, LNCS
, 1996
Abstract

Cited by 14 (2 self)
A redex family is a set of redexes which are `created in the same way'. Families specify which redexes should be shared in any so-called optimal implementation of a rewriting system. We formalise the notion of family for orthogonal higher-order term rewriting systems (OHRSs). In order to support our formalisation of the intuitive concept of family, we actually provide three conceptually different formalisations, via labelling, extraction and zigzag, and show them to be equivalent. This generalises the results known from the literature and gives a firm theoretical basis for the optimal implementation of OHRSs.

1. Introduction

A computation of a result is optimal if its cost is minimal among all computations of the result. Taking rewrite steps as computational units, the cost of a rewrite sequence is simply its length. Given a rewrite system, the question then is: does an effective optimal strategy exist for it? In the case of the lambda calculus, a discouraging result was obtained in [BBKV76]: th...
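The duplication phenomenon that motivates sharing can be seen with a small interpreter (a hypothetical sketch using our own term encoding, not the labelling formalism of the paper): contracting the outer redex of (λx. x x)((λy. y) z) first copies the inner redex, so a naive outermost strategy pays an extra step that the innermost one avoids; an optimal implementation shares the two copies because they belong to the same family.

```python
from dataclasses import dataclass

@dataclass
class Var: name: str
@dataclass
class Lam: var: str; body: object
@dataclass
class App: fun: object; arg: object

def subst(t, x, s):
    """Naive substitution t[x := s]; safe here because all bound names are distinct."""
    if isinstance(t, Var):
        return s if t.name == x else t
    if isinstance(t, Lam):
        return t if t.var == x else Lam(t.var, subst(t.body, x, s))
    return App(subst(t.fun, x, s), subst(t.arg, x, s))

def step(t, innermost):
    """One β-step; innermost=True reduces inside subterms before contracting the root."""
    if isinstance(t, Lam):
        b = step(t.body, innermost)
        return Lam(t.var, b) if b is not None else None
    if isinstance(t, App):
        if not innermost and isinstance(t.fun, Lam):
            return subst(t.fun.body, t.fun.var, t.arg)
        f = step(t.fun, innermost)
        if f is not None:
            return App(f, t.arg)
        a = step(t.arg, innermost)
        if a is not None:
            return App(t.fun, a)
        if innermost and isinstance(t.fun, Lam):
            return subst(t.fun.body, t.fun.var, t.arg)
    return None

def steps_to_normal_form(t, innermost):
    n = 0
    while (t2 := step(t, innermost)) is not None:
        t, n = t2, n + 1
    return n

# (λx. x x) ((λy. y) z): outermost duplicates the redex (λy. y) z before reducing it.
term = App(Lam("x", App(Var("x"), Var("x"))), App(Lam("y", Var("y")), Var("z")))
print(steps_to_normal_form(term, innermost=False))  # 3
print(steps_to_normal_form(term, innermost=True))   # 2
```

No fixed strategy is optimal in general (innermost loses on erasing functions), which is why optimal implementations must share whole families of redexes rather than pick a reduction order.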
Beta-Reduction as Unification
, 1996
Abstract

Cited by 13 (9 self)
In this report, we use a lean version of the usual system of intersection types. Hence, UP is also an appropriate unification problem to characterize typability of terms in this lean system. Quite apart from the new light it sheds on β-reduction, such an analysis turns out to have several other benefits
New Notions of Reduction and Non-Semantic Proofs of Strong β-Normalization in Typed λ-Calculi
 PROCEEDINGS OF LOGIC IN COMPUTER SCIENCE
, 1995
Abstract

Cited by 9 (2 self)
Two notions of reduction for terms of the λ-calculus are introduced, and the question of whether a λ-term is β-strongly normalizing is reduced to the question of whether a λ-term is merely normalizing under one of the notions of reduction. This gives a method to prove strong β-normalization for typed λ-calculi. Instead of the usual semantic proof style based on Tait's realizability or Girard's "candidats de réductibilité", termination can be proved using a decreasing metric over a well-founded ordering. This proof method is applied to the simply-typed λ-calculus and the system of intersection types, giving the first non-semantic proof for a polymorphic extension of the λ-calculus.
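The gap between weak and strong β-normalization that such a reduction must bridge shows up already on a classic example: Ω = (λw. w w)(λw. w w) contracts only to itself, so (λx. y) Ω reaches a normal form under an outermost step (the argument is erased) yet admits an infinite innermost reduction. A sketch under our own tuple encoding (not the paper's notions of reduction):

```python
def beta_root(t):
    """Contract the β-redex at the root of t = ("app", ("lam", x, body), arg); capture ignored."""
    _, (_, x, body), arg = t
    def sub(u):
        if u[0] == "var":
            return arg if u[1] == x else u
        if u[0] == "lam":
            return u if u[1] == x else ("lam", u[1], sub(u[2]))
        return ("app", sub(u[1]), sub(u[2]))
    return sub(body)

DELTA = ("lam", "w", ("app", ("var", "w"), ("var", "w")))
OMEGA = ("app", DELTA, DELTA)                        # Ω: its only redex contracts to Ω itself
TERM = ("app", ("lam", "x", ("var", "y")), OMEGA)    # (λx. y) Ω

print(beta_root(OMEGA) == OMEGA)   # True: an innermost reduction of TERM never terminates
print(beta_root(TERM))             # ('var', 'y'): one outermost step erases Ω entirely
```

So TERM is weakly but not strongly β-normalizing; a typed calculus with strong normalization must rule such terms out, and the paper's method reduces proving that to checking mere normalization under a new notion of reduction.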
A Linearization of the Lambda-Calculus and Consequences
, 2000
Abstract

Cited by 2 (0 self)
We embed the standard λ-calculus into two larger λ-calculi. The standard notion of β-reduction for the standard calculus corresponds to a new notion of reduction in each of the two larger calculi. A distinctive feature of the first (resp., second) new calculus is that, in every function application, an argument is used at most once (resp., exactly once) in the body of the function. We establish various connections between the three notions of reduction. As a consequence, we provide an alternative framework to study the relationship between β-weak normalization and β-strong normalization, and give a new proof of the oft-mentioned equivalence between β-strong normalization of standard λ-terms and typability in a system of "intersection types".
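The "at most once / exactly once" restriction is easy to check syntactically. A quick classifier over a tuple encoding of λ-terms (the encoding and names are ours, purely illustrative of the restriction, not the paper's calculi):

```python
def occurrences(t, x):
    """Count free occurrences of variable x in term t."""
    tag = t[0]
    if tag == "var":
        return 1 if t[1] == x else 0
    if tag == "lam":
        return 0 if t[1] == x else occurrences(t[2], x)
    return occurrences(t[1], x) + occurrences(t[2], x)    # app

def usage(lam):
    """How often a λ-abstraction's body uses its bound variable."""
    n = occurrences(lam[2], lam[1])
    return "erasing" if n == 0 else ("linear" if n == 1 else "duplicating")

# λx. x x duplicates its argument: excluded from both restricted calculi.
print(usage(("lam", "x", ("app", ("var", "x"), ("var", "x")))))  # duplicating
# λx. y erases its argument: fits "at most once" but not "exactly once".
print(usage(("lam", "x", ("var", "y"))))                         # erasing
# λx. x uses its argument exactly once: fits both restrictions.
print(usage(("lam", "x", ("var", "x"))))                         # linear
```

Linearization embeds an arbitrary term into one of the restricted calculi by making each use of an argument explicit, which is what connects the restricted reductions back to ordinary β-reduction.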
Calculi of Generalised beta-Reduction and Explicit Substitutions: The Type-Free and Simply Typed Versions
, 1997
Abstract
Extending the λ-calculus with either explicit substitution or generalised reduction has been the subject of extensive research recently, and still has many open problems. This paper is the first investigation into the properties of a calculus combining both generalised reduction and explicit substitutions. We present a calculus, λgs, that combines a calculus of explicit substitution, λs, and a calculus with generalised reduction, λg. We believe that λgs is a useful extension of the λ-calculus because it allows postponement of work in two different but complementary ways. Moreover, λgs (and also λs) satisfies desirable properties of calculi of explicit substitutions and generalised reductions. In particular, we show that λgs preserves strong normalisation, is a conservative extension of λg, and simulates β-reduction of λg and the classical λ-calculus. Furthermore, we study the simply typed versions of λs and λgs and show that well-typed terms are strongly normalising and that other properties such as...