Results 1–10 of 19
The Impact of the Lambda Calculus in Logic and Computer Science
Bulletin of Symbolic Logic, 1997
Abstract

Cited by 23 (0 self)
One of the most important contributions of A. Church to logic is his invention of the lambda calculus. We present the genesis of this theory and its two major areas of application: the representation of computations and the resulting functional programming languages on the one hand and the representation of reasoning and the resulting systems of computer mathematics on the other hand. Acknowledgement. The following persons provided help in various ways. Erik Barendsen, Jon Barwise, Johan van Benthem, Andreas Blass, Olivier Danvy, Wil Dekkers, Marko van Eekelen, Sol Feferman, Andrzej Filinski, Twan Laan, Jan Kuper, Pierre Lescanne, Hans Mooij, Robert Maron, Rinus Plasmeijer, Randy Pollack, Kristoffer Rose, Richard Shore, Rick Statman and Simon Thompson. Partial support came from the European HCM project Typed lambda calculus (CHRX-CT92-0046), the Esprit Working Group Types (21900) and the Dutch NWO project WINST (612316607). 1. Introduction This paper is written to honor Church's gr...
Relating Typability and Expressiveness in Finite-Rank Intersection Type Systems (Extended Abstract)
In Proc. 1999 Int’l Conf. Functional Programming, 1999
Abstract

Cited by 21 (9 self)
We investigate finite-rank intersection type systems, analyzing the complexity of their type inference problems and their relation to the problem of recognizing semantically equivalent terms. Intersection types allow something of type T1 /\ T2 to be used in some places at type T1 and in other places at type T2. A finite-rank intersection type system bounds how deeply the /\ can appear in type expressions. Such type systems enjoy strong normalization, subject reduction, and computable type inference, and they support a pragmatics for implementing parametric polymorphism. As a consequence, they provide a conceptually simple and tractable alternative to the impredicative polymorphism of System F and its extensions, while typing many more programs than the Hindley-Milner type system found in ML and Haskell. While type inference is computable at every rank, we show that its complexity grows exponentially as rank increases. Let K(0, n) = n and K(t + 1, n) = 2^K(t, n); we prove that recognizing the pure lambda-terms of size n that are typable at rank k is complete for DTIME[K(k - 1, n)]. We then consider the problem of deciding whether two lambda-terms typable at rank k have the same normal form, generalizing a well-known result of Statman from simple types to finite-rank intersection types. ...
Dependent Types from Counterexamples
2010
Abstract

Cited by 10 (0 self)
Motivated by recent research in abstract model checking, we present a new approach to inferring dependent types. Unlike many of the existing approaches, our approach does not rely on programmers to supply the candidate (or the correct) types for the recursive functions and instead does counterexample-guided refinement to automatically generate the set of candidate dependent types. The main idea is to extend the classical fixed-point type inference routine to return a counterexample if the program is found untypable with the current set of candidate types. Then, an interpolating theorem prover is used to validate the counterexample as a real type error or to generate additional candidate dependent types to refute the spurious counterexample. The process is repeated until either a real type error is found or sufficient candidates are generated to prove the program typable. Our system makes nontrivial use of “linear” intersection types in the refinement phase. The paper presents the type inference system and reports on the experience with a prototype implementation that infers dependent types for a subset of the OCaml language. The implementation infers dependent types containing predicates from the quantifier-free theory of linear arithmetic and equality with uninterpreted function symbols.
Types, potency, and idempotency: why nonlinearity and amnesia make a type system work
In ICFP ’04: Proceedings of the ninth ACM SIGPLAN international conference on Functional programming, 138–149, ACM, 2004
Abstract

Cited by 8 (1 self)
Useful type inference must be faster than normalization. Otherwise, you could check safety conditions by running the program. We analyze the relationship between bounds on normalization and type inference. We show how the success of type inference is fundamentally related to the amnesia of the type system: the nonlinearity by which all instances of a variable are constrained to have the same type. Recent work on intersection types has advocated their usefulness for static analysis and modular compilation. We analyze System I (and some instances of its descendant, System E), an intersection type system with a type inference algorithm. Because System I lacks idempotency, each occurrence of a variable requires a distinct type. Consequently, type inference is equivalent to normalization in every single case, and time bounds on type inference and normalization are identical. Similar relationships hold for other intersection type systems without idempotency. The analysis is founded on an investigation of the relationship between linear logic and intersection types, and it exhibits a lockstep correspondence between normalization and type inference. This correspondence shows the promise of intersection types to facilitate static analyses of varied granularity, but also belies an immense challenge: to add amnesia to such analysis without losing all of its benefits.
The safe lambda calculus
Lecture Notes in Computer Science, 2007
Abstract

Cited by 5 (1 self)
Abstract. Safety is a syntactic condition of higher-order grammars that constrains occurrences of variables in the production rules according to their type-theoretic order. In this paper, we introduce the safe lambda calculus, which is obtained by transposing (and generalizing) the safety condition to the setting of the simply-typed lambda calculus. In contrast to the original definition of safety, our calculus does not constrain types (to be homogeneous). We show that in the safe lambda calculus, there is no need to rename bound variables when performing substitution, as variable capture is guaranteed not to happen. We also propose an adequate notion of β-reduction that preserves safety. In the same vein as Schwichtenberg’s 1976 characterization of the simply-typed lambda calculus, we show that the numeric functions representable in the safe lambda calculus are exactly the multivariate polynomials; thus the conditional is not definable. We also give a characterization of representable word functions. We then study the complexity of deciding βη-equality of two safe simply-typed terms and show that this problem is PSPACE-hard. Finally we give a game-semantic analysis of safety: we show that safe terms are denoted by P-incrementally justified strategies. Consequently, pointers in the game semantics of safe λ-terms are only necessary from order 4 onwards.
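The polynomial-representability result above concerns Church-style numeric encodings. As background, the following Python sketch uses the standard Church numerals of the simply-typed lambda calculus (not the paper's safe-calculus formulation) to represent a sample multivariate polynomial; the names are illustrative:

```python
# Standard Church-numeral encodings, written as Python lambdas.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))  # m + n
mul  = lambda m: lambda n: lambda f: m(n(f))                  # m * n

def church(k):
    """Build the Church numeral for the natural number k."""
    c = zero
    for _ in range(k):
        c = succ(c)
    return c

def to_int(n):
    """Read a Church numeral back as a Python int."""
    return n(lambda k: k + 1)(0)

# p(a, b) = a*a + 2*b, a sample multivariate polynomial built from add/mul.
p = lambda a: lambda b: add(mul(a)(a))(mul(church(2))(b))
print(to_int(p(church(3))(church(4))))  # 17
```

Addition and multiplication compose freely, so every multivariate polynomial is representable this way; the paper's result is that in the safe fragment these polynomials are exactly the representable numeric functions, so in particular no conditional can be encoded.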
LAL is square: Representation and expressiveness in light affine logic
In Proc. Workshop on Implicit Computational Complexity, 2002
Abstract

Cited by 3 (2 self)
Abstract. We focus on how the choice of input-output representation has a crucial impact on the expressiveness of so-called “logics of polynomial time.” Our analysis illustrates this dependence in the context of Light Affine Logic (LAL), which is both a restricted version of Linear Logic, and a primitive functional programming language with restricted sharing of arguments. By slightly relaxing representation conventions, we derive doubly-exponential expressiveness bounds for this “logic of polynomial time.” We emphasize that squaring is the unifying idea that relates upper bounds on cut elimination for LAL with lower bounds on representation. Representation issues arise in the simulation of DTIME[2^(2^n)], where we construct a uniform family of proof nets encoding a Turing Machine; specifically, the dependence on n only affects the number of enclosing boxes. A related technical improvement is the simulation of DTIME[n^k] in depth-O(log k) LAL proof nets. The resulting upper bounds on cut elimination then satisfy the properties of a ...
Typed Logics With States
1997
Abstract

Cited by 2 (1 self)
The paper presents a simple format for typed logics with states by adding a function for register update to standard typed lambda calculus. It is shown that universal validity of equality for this extended language is decidable (extending a well-known result of Friedman for typed lambda calculus). This system is next extended to a full-fledged typed dynamic logic, and it is illustrated how the resulting format allows for very simple and intuitive representations of dynamic semantics for natural language and denotational semantics for imperative programming. The proposal is compared with some alternative approaches to formulating typed versions of dynamic logics. 1991 Mathematics Subject Classification: 03B15, 03B65, 03B70, 68Q55, 68Q65 1991 Computing Reviews Classification System: D.3.3, F.3.2, I.2.4, I.2.7 Keywords and Phrases: Type theory, compositionality, denotational semantics, dynamic semantics Note: Work carried out under project P4303; paper accepted for publication in the ...
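The register-update idea in the abstract can be pictured with states as functions from register names to values. The following Python sketch is a minimal illustration under that reading; the names and the encoding are assumptions for exposition, not the paper's actual formulation:

```python
# A state is modelled as a function from register names to values; `update`
# is the register-update operation: it returns the state that maps the given
# register to the new value and agrees with the old state everywhere else.

def update(state, register, value):
    return lambda r: value if r == register else state(r)

empty = lambda r: 0            # initial state: every register holds 0
s1 = update(empty, "x", 7)     # x := 7
s2 = update(s1, "y", s1("x") + 1)  # y := x + 1
print(s2("x"), s2("y"), s2("z"))   # 7 8 0
```

Because states and updates are themselves lambda terms, sequencing of assignments becomes ordinary function composition, which is the kind of compositional, denotational treatment of imperative features the abstract describes.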
Least Upper Bounds on the Size of Church-Rosser Diagrams in Term Rewriting and λ-Calculus
Abstract

Cited by 1 (0 self)
Abstract. We study the Church-Rosser property—which is also known as confluence—in term rewriting and λ-calculus. Given a system R and a peak t ∗← s →∗ t′ in R, we are interested in the length of the reductions in the smallest corresponding valley t →∗ s′ ∗← t′ as a function vsR(m, n) of the size m of s and the maximum length n of the reductions in the peak. For confluent term rewriting systems (TRSs), we prove the (expected) result that vsR(m, n) is a computable function. Conversely, for every total computable function ϕ(n) there is a TRS with a single term s such that vsR(s, n) ≥ ϕ(n) for all n. In contrast, for orthogonal term rewriting systems R we prove that there is a constant k such that vsR(m, n) is bounded from above by a function exponential in k and independent of the size of s. For λ-calculus, we show that vsR(m, n) is bounded from above by a function contained in the fourth level of the Grzegorczyk hierarchy.