Results 1 – 8 of 8
Expansion: the Crucial Mechanism for Type Inference with Intersection Types: Survey and Explanation
In: ITRS ’04, 2005
Abstract

Cited by 17 (7 self)
The operation of expansion on typings was introduced at the end of the 1970s by Coppo, Dezani, and Venneri for reasoning about the possible typings of a term when using intersection types. Until recently, it has remained somewhat mysterious and unfamiliar, even though it is essential for carrying out compositional type inference. The fundamental idea of expansion is to be able to calculate the effect that inserting a use of the intersection-introduction typing rule at some (possibly deeply nested) position has on the final judgement of a typing derivation, without actually needing to build the new derivation.
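The core effect can be sketched concretely. The following is a minimal, hypothetical illustration (not the authors' formal system): an expansion replaces the type at a chosen position by an intersection of variable-disjoint renamed copies, which is exactly what inserting an intersection-introduction step would have produced.

```python
# Minimal, hypothetical sketch of the effect of expansion: replace a
# type by an intersection of variable-disjoint copies, which is what
# inserting an intersection-introduction step would have produced.
# Types are nested tuples: 'a' is a type variable, ('->', s, t) an
# arrow, ('&', t1, ..., tn) an intersection.

def rename(ty, suffix):
    """Append a suffix to every type variable in ty."""
    if isinstance(ty, str):                        # type variable
        return ty + suffix
    if ty[0] == '->':
        return ('->', rename(ty[1], suffix), rename(ty[2], suffix))
    if ty[0] == '&':
        return ('&',) + tuple(rename(t, suffix) for t in ty[1:])
    raise ValueError(f"unknown type form: {ty!r}")

def expand(ty, n=2):
    """Intersect n variable-disjoint renamed copies of ty."""
    return ('&',) + tuple(rename(ty, f"_{i}") for i in range(n))

print(expand(('->', 'a', 'b')))
# ('&', ('->', 'a_0', 'b_0'), ('->', 'a_1', 'b_1'))
```

The point of the operation is that the final judgement is transformed directly, without rebuilding the (possibly large) derivation above the insertion point.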
Dependent Types from Counterexamples
, 2010
Abstract

Cited by 10 (0 self)
Motivated by recent research in abstract model checking, we present a new approach to inferring dependent types. Unlike many of the existing approaches, our approach does not rely on programmers to supply the candidate (or the correct) types for the recursive functions and instead does counterexample-guided refinement to automatically generate the set of candidate dependent types. The main idea is to extend the classical fixed-point type inference routine to return a counterexample if the program is found untypable with the current set of candidate types. Then, an interpolating theorem prover is used to validate the counterexample as a real type error or generate additional candidate dependent types to refute the spurious counterexample. The process is repeated until either a real type error is found or sufficient candidates are generated to prove the program typable. Our system makes non-trivial use of “linear” intersection types in the refinement phase. The paper presents the type inference system and reports on experience with a prototype implementation that infers dependent types for a subset of the OCaml language. The implementation infers dependent types containing predicates from the quantifier-free theory of linear arithmetic and equality with uninterpreted function symbols.
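The refinement loop described above can be sketched schematically. In the Python sketch below, `check`, `validate`, and `refine` are placeholders for the paper's components (the fixed-point inference routine and the interpolating theorem prover); the names and the toy instance are illustrative, not the actual system.

```python
# Schematic counterexample-guided refinement loop. check, validate
# and refine stand in for the real components: fixed-point inference,
# counterexample validation, and interpolant-based candidate generation.

def infer(program, candidates, check, validate, refine, max_iters=100):
    for _ in range(max_iters):
        typable, cex = check(program, candidates)
        if typable:
            return ("typable", candidates)
        if validate(cex):                        # counterexample is real
            return ("type error", cex)
        candidates = candidates | refine(cex)    # refute spurious cex
    return ("unknown", candidates)

# Toy instance: "candidates" are integers; the program is typable once
# the needed value is among them, and negative values are real errors.
needed = 5
result = infer(
    program=None,
    candidates=set(),
    check=lambda _p, cs: (needed in cs, needed),
    validate=lambda cex: cex < 0,
    refine=lambda cex: {cex},
)
print(result)   # ('typable', {5})
```

The loop terminates either with a proof of typability, a validated type error, or (after the iteration budget) an inconclusive answer, mirroring the three outcomes the abstract describes.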
Execution Time of λ-Terms via Denotational Semantics and Intersection Types. Research Report RR-6638
, 2008
Abstract

Cited by 3 (0 self)
The multiset-based relational model of linear logic induces a semantics of the type-free λ-calculus, which corresponds to a non-idempotent intersection type system, System R. We prove that, in System R, the size of the type derivations and the size of the types are closely related to the execution time of λ-terms in a particular environment machine, Krivine’s machine.
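The distinction driving this result can be shown in a few lines. Assuming the usual reading, an idempotent intersection satisfies A ∧ A = A, while a non-idempotent one keeps multiplicities, behaving like a multiset; it is this multiplicity that supplies the quantitative measure connecting derivation size to machine steps.

```python
# Idempotent intersections collapse duplicates (sets); non-idempotent
# ones keep them (multisets), recording how many times a resource is
# used -- the quantitative information a system like System R exploits.
from collections import Counter

idempotent = frozenset(['a', 'a'])        # {a}: A & A = A
non_idempotent = Counter(['a', 'a'])      # [a, a]: multiplicity kept

print(len(idempotent), sum(non_idempotent.values()))   # 1 2
```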
Elaborating Intersection and Union Types
Abstract

Cited by 3 (1 self)
Designing and implementing typed programming languages is hard. Every new type system feature requires extending the metatheory and implementation, which are often complicated and fragile. To ease this process, we would like to provide general mechanisms that subsume many different features. In modern type systems, parametric polymorphism is fundamental, but intersection polymorphism has gained little traction in programming languages. Most practical intersection type systems have supported only refinement intersections, which increase the expressiveness of types (more precise properties can be checked) without altering the expressiveness of terms; refinement intersections can simply be erased during compilation. In contrast, unrestricted intersections increase the expressiveness of terms, and can be used to encode diverse language features, promising an economy of both theory and implementation. We describe a foundation for compiling unrestricted intersection and union types: an elaboration type system that generates ordinary λ-calculus terms. The key feature is a Forsythe-like merge construct. With this construct, not all reductions of the source program preserve types; however, we prove that ordinary call-by-value evaluation of the elaborated program corresponds to a type-preserving evaluation of the source program. We also describe a prototype implementation and applications of unrestricted intersections and unions: records, operator overloading, and simulating dynamic typing.
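The elaboration idea can be sketched in ordinary terms. The sketch below is a loose Python analogue, not the paper's calculus: a term typed at an intersection A ∧ B elaborates to a pair, each use at one component becomes a projection, and a merge of two functions gives a simple model of operator overloading.

```python
# Loose analogue of elaborating intersections: a merge of two terms
# becomes an ordinary pair, and using it at either component type
# becomes a projection inserted by elaboration.

def merge(e1, e2):
    return (e1, e2)                 # A & B elaborates to a product

def use_left(m):                    # coercion A & B -> A
    return m[0]

def use_right(m):                   # coercion A & B -> B
    return m[1]

# Overloading via intersection: one name, two elaborated meanings.
plus = merge(lambda x, y: x + y,             # int -> int -> int
             lambda s, t: s + " " + t)       # str -> str -> str

print(use_left(plus)(1, 2), use_right(plus)("a", "b"))   # 3 a b
```

The source program sees a single overloaded `plus`; the elaborated program sees only ordinary pairs and projections, which is what makes compilation to a plain λ-calculus possible.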
Filter models: non-idempotent intersection types, orthogonality and polymorphism
Abstract

Cited by 1 (1 self)
This paper revisits models of typed λ-calculus based on filters of intersection types: by using non-idempotent intersections, we simplify a methodology that produces modular proofs of strong normalisation based on filter models. Non-idempotent intersections provide a decreasing measure proving a key termination property, simpler than the reducibility techniques used with idempotent intersections. Such filter models are shown to be captured by orthogonality techniques: we formalise an abstract notion of orthogonality model inspired by classical realisability, and express a filter model as one of its instances, along with two term models (one of which captures a now common technique for strong normalisation). Applying the above range of model constructions to Curry-style System F describes at different levels of detail how the infinite polymorphism of System F can systematically be reduced to the finite polymorphism of intersection types.
Innocent Game Semantics via Intersection Type Assignment Systems
Abstract
The aim of this work is to correlate two different approaches to the semantics of programming languages: game semantics and intersection type assignment systems (ITAS). Namely, we present an ITAS that provides the description of the semantic interpretation of a typed lambda calculus in a game model based on innocent strategies. Compared to the traditional ITAS used to describe the semantic interpretation in domain theoretic models, the ITAS presented in this paper has two main differences: the introduction of a notion of labelling on moves, and the omission of several rules, i.e. the subtyping rules and some structural rules.
Deciding k-CFA is Complete for . . .
, 2008
Abstract
We give an exact characterization of the computational complexity of the k-CFA hierarchy. For any k > 0, we prove that the control flow decision problem is complete for deterministic exponential time. This theorem validates empirical observations that such control flow analysis is intractable. It also provides more general insight into the complexity of abstract interpretation.
Unification with Expansion Variables: Preliminary Results and Problems
Abstract
Expansion generalises substitution. An expansion is a special term whose leaves can be substitutions. Substitutions map term variables to ordinary terms and expansion variables to expansions. Expansions (resp., ordinary terms) may contain expansion variables, each applied to an argument expansion (resp., ordinary term). Instances of the unification problem in this setting are constraint sets, where constraints are pairs of ordinary terms, and unifiers are expansions. This problem offers many interesting challenges. The theory of unification with expansion variables was first considered in relation to the study of systems of intersection types for the λ-calculus. Solving constraint sets, under appropriate conditions, corresponds to type inference for λ-terms in these systems. We explain expansions and present a simple rewrite system for unification with expansion variables where ordinary terms use the intersection type constructors. The simple rewrite system lacks some important properties. We indicate how it can be adapted to: simulate β-reduction, and intersection typing, of λ-terms; be a complete semi-decision procedure for unification; be confluent; produce most-general unifiers. Every constraint set has a trivial unifier. However, finding a single most-general unifier is often impossible. We study the concept of most-general unifiers and introduce principal unifiers, which are easier to construct. Most-general unifiers exist for the unification problem formed by a certain restriction of substitutions, and we give an incomplete variant of simple unification that finds them. A second variant system addresses completeness and principality, producing covering substitution-unifier sets for constraints (every substitution-unifier is an instance of a set member, and all expansion-unifiers can be obtained from the set). For covering unifier sets we modify the problem to a form of E-unification where the constant ω is the unit of the intersection constructor.
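For contrast with expansion-based unification, plain first-order (Robinson-style) unification, where unifiers are ordinary substitutions, can be sketched in a few lines; expansion variables generalise the substitutions such a procedure computes. In this toy sketch every string is a variable and the occurs check is omitted.

```python
# Toy first-order unification for contrast: unifiers here are plain
# substitutions; expansion variables generalise these. Every string
# is a variable; compound terms are tuples ('f', arg1, ...); the
# occurs check is omitted for brevity.

def walk(t, subst):
    """Chase variable bindings until a non-bound term is reached."""
    while isinstance(t, str) and t in subst:
        t = subst[t]
    return t

def unify(t1, t2):
    subst, stack = {}, [(t1, t2)]
    while stack:
        a, b = stack.pop()
        a, b = walk(a, subst), walk(b, subst)
        if a == b:
            continue
        if isinstance(a, str):
            subst[a] = b
        elif isinstance(b, str):
            subst[b] = a
        elif a[0] == b[0] and len(a) == len(b):
            stack.extend(zip(a[1:], b[1:]))
        else:
            return None                 # constructor clash
    return subst

print(unify(('->', 'x', 'b'), ('->', 'a', 'y')))   # {'b': 'y', 'x': 'a'}
```

In the expansion-variable setting, the answers are richer than such substitutions, which is why a single most-general unifier may fail to exist and covering sets of unifiers are needed instead.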