Results 1–10 of 26
Normalization by evaluation for typed lambda calculus with coproducts
 In LICS
, 2001
Abstract

Cited by 39 (5 self)
We solve the decision problem for simply typed lambda calculus with strong binary sums, equivalently the word problem for free cartesian closed categories with binary coproducts. Our method is based on the semantical technique known as “normalization by evaluation” and involves inverting the interpretation of the syntax into a suitable sheaf model and from this extracting appropriate unique normal forms. There is no rewriting theory involved, and the proof is completely constructive, allowing program extraction from the proof.
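The core normalization-by-evaluation idea can be sketched in a few lines of Haskell. This is a hedged illustration of the general technique only, for the λ-fragment without the binary sums the paper treats (sums are precisely what require the sheaf-model machinery): terms are evaluated into a semantic domain mixing genuine functions with neutral terms, and evaluation is then inverted (reified) to read back a β-normal form. All names here are illustrative, not the paper's.

```haskell
-- Terms in de Bruijn notation.
data Tm = Var Int | Lam Tm | App Tm Tm deriving (Eq, Show)

-- Semantic domain: real functions, or stuck "neutral" terms
-- headed by a free variable (represented as a de Bruijn level).
data Sem = SLam (Sem -> Sem) | SNeu Neu
data Neu = NVar Int | NApp Neu Sem

-- Evaluate a term in an environment of semantic values.
eval :: [Sem] -> Tm -> Sem
eval env (Var i)   = env !! i
eval env (Lam b)   = SLam (\v -> eval (v : env) b)
eval env (App f a) = app (eval env f) (eval env a)
  where app (SLam g) v = g v
        app (SNeu n) v = SNeu (NApp n v)

-- Invert evaluation: read a semantic value back into syntax.
-- The Int counts the binders passed so far, to generate fresh levels.
reify :: Int -> Sem -> Tm
reify k (SLam f) = Lam (reify (k + 1) (f (SNeu (NVar k))))
reify k (SNeu n) = reifyNeu k n
  where reifyNeu j (NVar l)    = Var (j - l - 1)  -- level to index
        reifyNeu j (NApp n' v) = App (reifyNeu j n') (reify j v)

-- Normal form: evaluate, then reify.  No rewriting relation involved.
nf :: Tm -> Tm
nf t = reify 0 (eval [] t)
```

Because `nf` is computable and identifies exactly the convertible terms, comparing `nf` images decides the equational theory, which is the shape of the decision procedure the paper extracts.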
Adjoint Rewriting
, 1995
Abstract

Cited by 25 (11 self)
This thesis concerns rewriting in the typed λ-calculus. Traditional categorical models of the typed λ-calculus use concepts such as functor, adjunction and algebra to model type constructors and their associated introduction and elimination rules, with the natural categorical equations inherent in these structures providing an equational theory for terms. One then seeks a rewrite relation which, by transforming terms into canonical forms, provides a decision procedure for this equational theory. Unfortunately, the rewrite relations which have been proposed, except for the most simple of calculi, either generate the full equational theory but contain no decision procedure, or contain a decision procedure but only for a subtheory of that required. Our proposal is to unify the semantics and reduction theory of the typed λ-calculus by generalising the notion of model from categorical structures based on term equality to categorical structures based on term reduction. This is accomplished via...
Categorical Reconstruction of a Reduction Free Normalization Proof
, 1995
Abstract

Cited by 22 (5 self)
Introduction. We present a categorical proof of the normalization theorem for the simply typed λ-calculus, i.e. we derive a computable function nf which assigns to every typed term a normal form, such that M ≃ N implies nf(M) = nf(N), and nf(M) ≃ M, where ≃ is βη-equality. Both the function nf and its correctness properties can be deduced from the categorical construction. To substantiate this, we present an ML program in the appendix which can be extracted from our argument. We emphasize that this presentation of normalization is reduction free, i.e. we do not mention term rewriting or use properties of term rewriting systems such as the Church-Rosser property. An immediate consequence of normalization is the decidability of ≃, but there are other useful corollaries; for instance we can show that
Shape Checking of Array Programs
 In Computing: the Australasian Theory Seminar, Proceedings
, 1997
Abstract

Cited by 20 (5 self)
Shape theory provides a framework for the study of data types in which shape and data can be manipulated separately. This paper is concerned with shape checking, i.e. the detection of shape errors, such as array bound errors, without handling the data stored within. It can be seen as a form of partial evaluation in which data computations are ignored. We construct a simply-typed lambda-calculus that supports a vector type constructor, whose iteration yields types of arrays. It is expressive enough to construct all of the usual linear algebra operations. All shape errors in a term t can be detected by evaluating its shape #t. Evaluation of #t will terminate if that of t does. Keywords: shape analysis, partial evaluation, arrays, higher-order. 1 Introduction. Shape theory explores the consequences of manipulating shape and data separately (Jay [14]). Shape refers to the data structure in which the data is stored. For example, the shape of a three-dimensional regular array is a tuple of...
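The idea of evaluating shapes while ignoring data can be illustrated with a minimal Haskell sketch (this is not the paper's calculus; `Shape` and `matMulShape` are hypothetical names): the shape of a regular array is just its list of extents, and shape-level operations detect size errors such as dimension mismatches without ever touching array contents.

```haskell
-- The shape of a regular array: its list of extents.
type Shape = [Int]

-- Shape-level matrix multiplication: defined only when the inner
-- dimensions agree, so a mismatch surfaces as a shape error
-- without any data computation.
matMulShape :: Shape -> Shape -> Either String Shape
matMulShape [m, n] [n', p]
  | n == n'   = Right [m, p]
  | otherwise = Left ("inner dimensions differ: "
                      ++ show n ++ " vs " ++ show n')
matMulShape s t =
  Left ("expected two matrices, got " ++ show s ++ " and " ++ show t)
```

For example, multiplying a 2×3 array by a 3×4 array has shape 2×4, while 2×3 times 4×5 is rejected at the shape level, which is the "partial evaluation ignoring data" reading of shape checking.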
Eta-Expansions in Dependent Type Theory: The Calculus of Constructions
 Proceedings of the Third International Conference on Typed Lambda Calculus and Applications (TLCA'97
, 1997
Abstract

Cited by 13 (0 self)
Although the use of expansionary η-rewrites has become increasingly common in recent years, one area where η-contractions have until now remained the only possibility is in the more powerful type theories of the λ-cube. This paper rectifies this situation by applying η-expansions to the Calculus of Constructions: we discuss some of the difficulties posed by the presence of dependent types, prove that every term rewrites to a unique long βη-normal form and deduce the decidability of βη-equality, typeability and type inhabitation as corollaries. 1 Introduction. Extensional equality for the simply typed λ-calculus requires η-conversion, whose interpretation as a rewrite rule has traditionally been as a contraction λx:T. f x ⇒ f, where x ∉ FV(f). When combined with the usual β-reduction, the resulting rewrite relation is strongly normalising and confluent, and thus reduction to normal form provides a decision procedure for the associated equational theory. However, η-contractions beh...
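The expansionary reading of the η rule can be made concrete with a small Haskell sketch for the simply typed case (illustrative only; the paper's setting is the Calculus of Constructions, where dependent types make this harder). Instead of contracting λx. f x to f, a term is expanded along its type, so terms of function type always end up in η-long form.

```haskell
-- Simple types and de Bruijn terms (hypothetical minimal syntax).
data Ty = Base | Arr Ty Ty deriving (Eq, Show)
data Tm = Var Int | Lam Ty Tm | App Tm Tm deriving (Eq, Show)

-- Shift free indices >= c up by 1, needed when a term is moved
-- under the binder introduced by an expansion.
shift :: Int -> Tm -> Tm
shift c (Var i)   = Var (if i >= c then i + 1 else i)
shift c (Lam a b) = Lam a (shift (c + 1) b)
shift c (App f x) = App (shift c f) (shift c x)

-- Fully eta-expand a term along its type: at A -> B, wrap in a
-- lambda and apply to the (recursively expanded) fresh variable.
etaExpand :: Ty -> Tm -> Tm
etaExpand Base      t = t
etaExpand (Arr a b) t =
  Lam a (etaExpand b (App (shift 0 t) (etaExpand a (Var 0))))
```

So a variable f of type Base → Base expands to λx. f x, and at higher types the fresh variable is itself expanded, yielding the long η-normal forms whose uniqueness the paper proves.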
Categorical Term Rewriting: Monads and Modularity
 University of Edinburgh
, 1998
Abstract

Cited by 11 (6 self)
Term rewriting systems are widely used throughout computer science as they provide an abstract model of computation while retaining a comparatively simple syntax and semantics. In order to reason within large term rewriting systems, structuring operations are used to build large term rewriting systems from smaller ones. Of particular interest is whether key properties are modular, that is, if the components of a structured term rewriting system satisfy a property, does the term rewriting system as a whole? A body of literature addresses this problem, but most of the results and proofs depend on strong syntactic conditions and do not easily generalize. Although many specific modularity results are known, a coherent framework which explains the underlying principles behind these results is lacking. This thesis posits that part of the problem is the usual, concrete and syntax-oriented semantics of term rewriting systems, and that a semantics is needed which on the one hand elides unnecessary syntactic details but on the other hand still possesses enough expressive power to model the key concepts arising from
Linear Explicit Substitutions
 In Proc. of Westapp'98
, 1998
Abstract

Cited by 11 (7 self)
The λσ-calculus adds explicit substitutions to the λ-calculus so as to provide a theoretical framework within which the implementation of functional programming languages can be studied. This paper generalises the λσ-calculus to provide a linear calculus of explicit substitutions, called xDILL, which analogously describes the implementation of linear functional programming languages. Our main observation is that there are nontrivial interactions between linearity and explicit substitutions and that xDILL is therefore best understood as a synthesis of its underlying logical structure and the technology of explicit substitutions. This is in contrast to the λσ-calculus where the explicit substitutions are independent of the underlying logical structure. Keywords: λ-calculus, explicit substitutions, linear logic. 1 Introduction. This paper combines the technologies of explicit substitutions and linearity in a mathematically consistent way. We start by describing these technologies and the...
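The explicit-substitution idea can be sketched in Haskell (a simplified illustration in the spirit of such calculi, not xDILL itself; its linear typing is omitted and the constructor names are hypothetical). The point is that β-reduction does not perform substitution at the meta level: it records the substitution as a term constructor, whose propagation through the term is itself governed by rewrite rules.

```haskell
-- Terms with an explicit substitution constructor.
data Tm = Var Int      -- de Bruijn index
        | Lam Tm
        | App Tm Tm
        | Sub Tm Tm    -- Sub b a: b with index 0 bound to a
        deriving (Eq, Show)

-- One reduction step: Beta introduces an explicit substitution,
-- and the remaining rules push Sub through the constructors.
-- (Pushing Sub under Lam needs index shifting and is elided here.)
step :: Tm -> Maybe Tm
step (App (Lam b) a)   = Just (Sub b a)               -- Beta
step (Sub (Var 0) a)   = Just a                       -- substitute
step (Sub (Var i) _)   = Just (Var (i - 1))           -- skip binder
step (Sub (App f x) a) = Just (App (Sub f a) (Sub x a))
step (App f a)         = fmap (`App` a) (step f)      -- head congruence
step _                 = Nothing

-- Iterate step as far as it goes.
reduce :: Tm -> Tm
reduce t = maybe t reduce (step t)
```

Separating Beta from substitution propagation like this exposes exactly the intermediate states an implementation manipulates, which is why explicit substitutions serve as an implementation model; the paper's observation is that linearity interacts nontrivially with these propagation rules.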
Monotone Inductive and Coinductive Constructors of Rank 2
 Proceedings of CSL 2001
, 2001
Abstract

Cited by 10 (4 self)
A generalization of positive inductive and coinductive types to monotone inductive and coinductive constructors of rank 1 and rank 2 is described. The motivation is taken from initial algebras and final coalgebras in a functor category and the Curry-Howard correspondence. The definition of the system as a λ-calculus requires an appropriate definition of monotonicity to overcome subtle problems, most notably to ensure that the (co)inductive constructors introduced via monotonicity of the underlying constructor of rank 2 are also monotone as constructors of rank 1. The problem is solved, strong normalization shown, and the notion proven to be wide enough to cover even highly complex datatypes.
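A Haskell analogue gives a feel for why rank 2 is delicate (a hedged illustration only; the paper's system is a typed λ-calculus with its own monotonicity discipline). A rank-2 constructor corresponds to a nested datatype: the recursive occurrence is at a different type instance, so functions over it need polymorphic recursion, and generic folds need rank-2 polymorphic arguments.

```haskell
{-# LANGUAGE RankNTypes #-}

-- A nested datatype: each Cons doubles the element type, so the
-- recursive occurrence Nest (a, a) is not Nest a.
data Nest a = Nil | Cons a (Nest (a, a))

-- Counting elements needs polymorphic recursion: the recursive
-- call is at type Nest (a, a).
size :: Nest a -> Int
size Nil        = 0
size (Cons _ t) = 1 + 2 * size t

-- A fold must accept an argument usable at every instance
-- a, (a, a), ((a, a), (a, a)), ...; hence the rank-2 type.
foldNest :: (forall b. b -> r -> r) -> r -> Nest a -> r
foldNest _ z Nil        = z
foldNest f z (Cons x t) = f x (foldNest f z t)
```

The requirement that `foldNest`'s argument work at all instances is the programming-language shadow of the paper's condition that constructors introduced at rank 2 remain monotone when used at rank 1.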
On the Power of Simple Diagrams
, 1996
Abstract

Cited by 9 (3 self)
In this paper we focus on a set of abstract lemmas that are easy to apply and turn out to be quite valuable in order to establish confluence and/or normalization modularly, especially when adding rewriting rules for extensional equalities to various calculi. We show the usefulness of the lemmas by applying them to various systems, ranging from the simply typed lambda calculus to higher order lambda calculi, for which we can establish systematically confluence and/or normalization (or decidability of equality) in a simple way. Many results are new, but we also discuss systems for which our technique allows us to provide a much simpler proof than what can be found in the literature. 1 Introduction. During a recent investigation of confluence and normalization properties of the polymorphic lambda calculus with an expansive version of the η rule, we came across a nice lemma that gives a simple but quite powerful sufficient condition to check the Church-Rosser property for a compound rewriting system...
Normalization by evaluation for λ→2
 In Functional and Logic Programming, number 2998 in LNCS
, 2004
Abstract

Cited by 9 (4 self)
Abstract. We show that the set-theoretic semantics for λ→2 is complete by inverting evaluation using decision trees. This leads to an implementation of normalization by evaluation which is witnessed by the source of part of this paper being a literate Haskell script. We show the correctness of our implementation using logical relations.
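What makes inversion possible over a two-element base type can be sketched in Haskell (an illustration of the finiteness argument only, not the paper's literate script; `Val` and `enumerate` are hypothetical names): every semantic function space over a finite type is itself finite, so a function can be tabulated as a finite decision tree on its arguments and thus read back into syntax.

```haskell
-- Simple types over a two-element base type.
data Ty = B2 | Arr Ty Ty deriving (Eq, Show)

-- Semantic values: a Boolean, or a finite table mapping each
-- argument to a result (a flattened decision tree).
data Val = VB Bool | VF [(Val, Val)] deriving (Eq, Show)

-- Enumerate all semantic values of a type: at function type,
-- every graph over the finite domain.
enumerate :: Ty -> [Val]
enumerate B2        = [VB False, VB True]
enumerate (Arr a b) =
  [ VF (zip dom outs)
  | outs <- sequence (replicate (length dom) (enumerate b)) ]
  where dom = enumerate a
```

The counts grow as expected: 2 values at the base type, 2^2 = 4 at B2 → B2, 2^4 = 16 at (B2 → B2) → B2, and so on. Since each type has finitely many inhabitants, the inversion of evaluation that the paper performs with decision trees is effective.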