Results 1 - 4 of 4
Monadic Presentations of Lambda Terms Using Generalized Inductive Types
In Computer Science Logic, 1999
Abstract

Cited by 94 (18 self)
We present a definition of untyped λ-terms using a heterogeneous datatype, i.e. an inductively defined operator. This operator can be extended to a Kleisli triple, which is a concise way to verify the substitution laws for the λ-calculus. We also observe that repetitions in the definition of the monad as well as in the proofs can be avoided by using well-founded recursion and induction instead of structural induction. We extend the construction to the simply typed λ-calculus using dependent types, and show that this is an instance of a generalization of Kleisli triples. The proofs for the untyped case have been checked using the LEGO system.

Keywords. Type theory, inductive types, λ-calculus, category theory.

1 Introduction

The metatheory of substitution for λ-calculi is interesting, maybe because it seems intuitively obvious but becomes quite intricate if we take a closer look. [Hue92] states seven formal properties of substitution which are then used to prove a general substitution theor...
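The heterogeneous datatype described in this abstract can be sketched in Haskell as a nested datatype in the style of Bird–Paterson and Altenkirch–Reus: going under a binder enlarges the variable type with Maybe, and simultaneous substitution becomes monadic bind, giving the Kleisli triple. The names below are illustrative, not the paper's own notation:

```haskell
{-# LANGUAGE DeriveFunctor #-}

-- Untyped lambda terms over a type v of free variables.
-- Lam binds one extra variable, so its body lives over Maybe v:
-- Nothing is the bound variable, Just x is a free variable x.
data Term v
  = Var v
  | App (Term v) (Term v)
  | Lam (Term (Maybe v))
  deriving (Eq, Show, Functor)

-- Simultaneous substitution is monadic bind; together with Var
-- as the unit it forms the Kleisli triple mentioned above.
bind :: Term v -> (v -> Term w) -> Term w
bind (Var x)   f = f x
bind (App s t) f = App (bind s f) (bind t f)
bind (Lam b)   f = Lam (bind b lift)
  where
    lift Nothing  = Var Nothing       -- the bound variable stays put
    lift (Just x) = fmap Just (f x)   -- weaken the substituted term
```

Note how `lift` handles the binder: the substitution is weakened when pushed under a `Lam`, which is exactly where the heterogeneity of the datatype does its work.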
Type-inference based short cut deforestation (nearly) without inlining
In IFL'99, Lochem, The Netherlands, Proceedings, 1999
Abstract

Cited by 6 (1 self)
Deforestation optimises a functional program by transforming it into another one that does not create certain intermediate data structures. In [Chi99] we presented a type-inference based deforestation algorithm which performs extensive inlining. However, across module boundaries only limited inlining is practically feasible. Furthermore, inlining is a non-trivial transformation which is therefore best implemented as a separate optimisation pass. To perform short cut deforestation (nearly) without inlining, Gill suggested splitting definitions into workers and wrappers and inlining only the small wrappers, which transfer the information needed for deforestation. We show that Gill's use of a function build limits deforestation and note that his reasons for using build do not apply to our approach. Hence we develop a more general worker/wrapper scheme without build. We give a type-inference based algorithm which splits definitions into workers and wrappers. Finally, we show that we can deforest more expressions with the worker/wrapper scheme than with the algorithm with inlining.

1 Type-Inference-Based Short Cut Deforestation

In lazy functional programs two functions are often glued together by an intermediate data structure that is produced by one function and consumed by the other. For example, the function any, which tests whether any element of a list xs satisfies a given predicate p, may be defined as follows in Haskell [PH+99]:

any p xs = or (map p xs)

The function map applies p to all elements of xs, yielding a list of boolean values. The function or combines these boolean values with the logical or operation (||). Although lazy evaluation makes this modular programming style practicable [Hug89], it does not come for free. Each list cell has to be allocated, filled, taken apart and finally garbage collected. The following monolithic definition of any is more efficient.
any p []     = False
any p (x:xs) = p x || any p xs

It is the aim of deforestation algorithms to automatically transform a functional program into another one that does not create such intermediate data structures. We say that the producer and the consumer of the data structure are fused.
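The short cut fusion that this line of work builds on can be sketched with the classic foldr/build rule used in Gill's scheme. The producer/consumer versions of map and or below are illustrative reconstructions, not the paper's worker/wrapper output:

```haskell
{-# LANGUAGE RankNTypes #-}

-- Gill's build abstracts a list over its two constructors.
build :: (forall b. (a -> b -> b) -> b -> b) -> [a]
build g = g (:) []

-- The short cut fusion rule is: foldr k z (build g) = g k z,
-- which eliminates the intermediate list entirely.

-- map written as a good producer, or as a good consumer:
mapB :: (a -> b) -> [a] -> [b]
mapB f xs = build (\c n -> foldr (c . f) n xs)

orF :: [Bool] -> Bool
orF = foldr (||) False

-- any p xs = orF (mapB p xs); applying the rule once yields the
-- deforested definition, with no intermediate boolean list:
anyD :: (a -> Bool) -> [a] -> Bool
anyD p = foldr (\x b -> p x || b) False
```

Unfolding `orF (mapB p xs)` with the rule gives `foldr ((||) . p) False xs`, which is exactly `anyD p xs` — the fused, monolithic definition shown above.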
Deriving a Complete Type Inference for Hindley-Milner and Vector Sizes using Expansion
Abstract

Cited by 1 (1 self)
Type inference and program analysis both infer static properties about a program. Yet, they are constructed using very different techniques. We reconcile both approaches by deriving a type inference from a denotational semantics using abstract interpretation. We observe that completeness results in the abstract interpretation literature can be used to derive type inferences that are backward complete, a property akin to the inference of principal typings. The resulting algorithm is similar to that of Milner-Mycroft, that is, it infers Hindley-Milner types while allowing for polymorphic recursion. Instead of type schemes, it uses expansion to instantiate types. Since our expansion operator is agnostic to the abstract domain, we are able to apply it not only to types. We illustrate this by inferring the size of vector types using systems of linear equalities.
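As background for the kind of inference discussed here, the core of any Hindley-Milner-style algorithm is first-order unification over simple types. The sketch below is a minimal textbook version, not the paper's expansion-based algorithm; the type and constructor names are illustrative:

```haskell
import qualified Data.Map as M

-- Simple types: variables, the function arrow, and one base type.
data Ty = TVar String | Ty :-> Ty | TInt
  deriving (Eq, Show)

type Subst = M.Map String Ty

apply :: Subst -> Ty -> Ty
apply s t@(TVar a)  = M.findWithDefault t a s
apply s (t1 :-> t2) = apply s t1 :-> apply s t2
apply _ TInt        = TInt

-- Most general unifier, with an occurs check; Nothing on failure.
unify :: Ty -> Ty -> Maybe Subst
unify (TVar a) t = bindVar a t
unify t (TVar a) = bindVar a t
unify (a :-> b) (c :-> d) = do
  s1 <- unify a c
  s2 <- unify (apply s1 b) (apply s1 d)
  -- compose: apply s2 to the range of s1, then add s2's bindings
  return (M.map (apply s2) s1 `M.union` s2)
unify TInt TInt = Just M.empty
unify _ _ = Nothing

bindVar :: String -> Ty -> Maybe Subst
bindVar a t
  | t == TVar a    = Just M.empty
  | a `occursIn` t = Nothing            -- occurs check: no infinite types
  | otherwise      = Just (M.singleton a t)

occursIn :: String -> Ty -> Bool
occursIn a (TVar b)    = a == b
occursIn a (t1 :-> t2) = occursIn a t1 || occursIn a t2
occursIn _ TInt        = False
```

The abstract's point is that this machinery can be derived, rather than hand-built, from a denotational semantics via abstract interpretation, and that expansion replaces the usual instantiation of type schemes.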
Type Preorders and Recursive Terms
, 2004
Abstract

Cited by 1 (0 self)
We show how to use intersection types for building models of a λ-calculus enriched with recursive terms, whose intended meaning is that of minimal fixed points. As a byproduct we prove an interesting consistency result.
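The recursive terms in question behave like the usual fixed-point combinator: under lazy evaluation, fix f denotes the minimal fixed point of f. A minimal Haskell sketch (fix and fact are illustrative names, not the paper's calculus):

```haskell
-- fix f is the least fixed point of f under lazy evaluation.
fix :: (a -> a) -> a
fix f = f (fix f)

-- Recursion expressed via fix rather than direct self-reference:
fact :: Integer -> Integer
fact = fix (\rec n -> if n == 0 then 1 else n * rec (n - 1))
-- e.g. fact 5 == 120
```

Denotationally, fact is the least upper bound of the chain of finite unrollings of its body, which is the "minimal fixed point" reading the abstract refers to.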