Results 1–10 of 19
Compiling polymorphism using intensional type analysis
In Symposium on Principles of Programming Languages, 1995
Cited by 260 (18 self)
Compiling with Types
1995
Cited by 102 (14 self)
Abstract:
Compilers for monomorphic languages, such as C and Pascal, take advantage of types to determine data representations, alignment, calling conventions, and register selection. However, these languages lack important features including polymorphism, abstract datatypes, and garbage collection. In contrast, modern programming languages such as Standard ML (SML) provide all of these features, but existing implementations fail to take full advantage of types. The result is that performance of SML code is quite bad when compared to C. In this thesis, I provide a general framework, called type-directed compilation, that allows compiler writers to take advantage of types at all stages in compilation. In the framework, types are used not only to determine efficient representations and calling conventions, but also to prove the correctness of the compiler. A key property of type-directed compilation is that all but the lowest levels of the compiler use typed intermediate languages. An advantage of this approach is that it provides a means for automatically checking the integrity of the resulting code. An important ...
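The abstract's central point, that the compiler can branch on type structure to pick data representations, can be illustrated with a toy "typecase" in the spirit of the intensional type analysis paper listed above. The type descriptors and representation decisions below are invented for illustration, not taken from the thesis:

```python
# Toy typecase: choose a data representation by inspecting a type
# descriptor at run time. All representation choices are illustrative.

def pack(ty, value):
    """Convert a value to the representation chosen for descriptor `ty`."""
    if ty == "int":
        return value                       # pretend ints are stored unboxed
    if ty == "float":
        return ("boxed", value)            # pretend floats must be boxed
    if isinstance(ty, tuple) and ty[0] == "pair":
        _, ta, tb = ty
        a, b = value
        return (pack(ta, a), pack(tb, b))  # pairs flattened componentwise
    raise TypeError(f"no representation for type {ty!r}")
```

A monomorphic compiler makes these decisions statically; a typecase like this is one way a polymorphic language can recover them without boxing everything uniformly.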
Intuitionistic Model Constructions and Normalization Proofs
1998
Cited by 44 (7 self)
Abstract:
We investigate semantical normalization proofs for typed combinatory logic and the weak λ-calculus. One builds a model and a function 'quote' which inverts the interpretation function. A normalization function is then obtained by composing quote with the interpretation function. Our models are just like the intended model, except that the function space includes a syntactic component as well as a semantic one. We call this a 'glued' model because of its similarity with the glueing construction in category theory. Other basic type constructors are interpreted as in the intended model. In this way we can also treat inductively defined types such as natural numbers and Brouwer ordinals. We also discuss how to formalize terms, and show how one model construction can be used to yield normalization proofs for two different typed calculi: one with explicit and one with implicit substitution. The proofs are formalized using Martin-Löf's type theory as a meta language and mechanized using the A...
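The quote/eval construction this abstract describes, normalization as the composition of an interpretation function with a 'quote' that inverts it, can be sketched concretely. Below is a minimal illustrative normalizer for untyped λ-terms in my own encoding, not the authors' formalization; free variables play the role of the syntactic component of the glued model:

```python
# Normalization by evaluation (sketch).
# Terms: ("var", name) | ("lam", name, body) | ("app", fun, arg)
# Semantic values: Python closures for functions, plus term-shaped
# neutrals for applications headed by a free variable.

def evaluate(term, env):
    """Interpret a term in an environment mapping names to values."""
    tag = term[0]
    if tag == "var":
        return env[term[1]]
    if tag == "lam":
        _, x, body = term
        return lambda v: evaluate(body, {**env, x: v})   # semantic function
    _, f, a = term
    fv = evaluate(f, env)
    av = evaluate(a, env)
    return fv(av) if callable(fv) else ("app", fv, av)   # apply or stay neutral

def quote(value, fresh=0):
    """Invert evaluation: read a semantic value back into a term."""
    if callable(value):
        x = f"x{fresh}"
        return ("lam", x, quote(value(("var", x)), fresh + 1))
    if isinstance(value, tuple) and value[0] == "app":
        return ("app", quote(value[1], fresh), quote(value[2], fresh))
    return value                                         # a variable ("var", x)

def normalize(term):
    """Normalization = quote composed with the interpretation function."""
    return quote(evaluate(term, {}))
```

For example, `normalize` maps both the identity applied to the identity and λx.(λy.y)x to the same normal form λx0.x0, without ever performing syntactic reduction.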
Normalization and the Yoneda Embedding
Cited by 22 (3 self)
Abstract:
In this paper we describe a new, categorical approach to normalization in typed ...
Normalization by evaluation for Martin-Löf type theory with one universe
In 23rd Conference on the Mathematical Foundations of Programming Semantics (MFPS XXIII), Electronic Notes in Theoretical Computer Science, 2007
Proof Transformations in Higher-Order Logic
1987
Cited by 21 (5 self)
Abstract:
We investigate the problem of translating between different styles of proof systems in higher-order logic: analytic proofs, which are well suited for automated theorem proving, and non-analytic deductions, which are well suited for the mathematician. Analytic proofs are represented as expansion proofs; non-analytic proofs are represented by deductions in H, a form of the sequent calculus we define. A nondeterministic translation algorithm between expansion proofs and H-deductions is presented and its correctness is proven. We also present an algorithm for translation in the other direction and prove its correctness. A cut-elimination algorithm for expansion proofs is given and its partial correctness is proven. Strong termination of this algorithm remains a conjecture for the full higher-order system, but is proven for the first-order fragment. We extend the translations to a non-analytic proof system which contains a primitive notion of equality, while leaving the notion of expansion proof unaltered. This is possible, since a non-extensional equality is definable in our system of type theory. Next we extend analytic and non-analytic proof systems and the translations between them to include extensionality. Finally, we show how the methods and notions used so far apply to the problem of translating expansion proofs into natural deductions. Much care is taken to specify this translation in a ...
From Reduction-Based to Reduction-Free Normalization
2004
Cited by 21 (8 self)
Abstract:
We present a systematic construction of a reduction-free normalization function. Starting from ...
Verifying a Semantic βη-Conversion Test for Martin-Löf Type Theory
2008
Cited by 12 (9 self)
Abstract:
Type-checking algorithms for dependent type theories often rely on the interpretation of terms in some semantic domain of values when checking equalities. Here we analyze a version of Coquand's algorithm for checking the βη-equality of such semantic values in a theory with a predicative universe hierarchy and large elimination rules. Although this algorithm does not rely on normalization by evaluation explicitly, we show that similar ideas can be employed for its verification. In particular, our proof uses the new notions of contextual reification and strong semantic equality. The algorithm is part of a bidirectional type checking algorithm which checks whether a normal term has a certain semantic type. We work with an abstract notion of semantic domain in order to accommodate a variety of possible implementation techniques, such as normal forms, weak head normal forms, closures, and compiled code. Our aim is to get closer than previous work to verifying the type-checking algorithms which are actually used in practice.
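The conversion test this abstract describes compares semantic values rather than syntactic normal forms; η-equality falls out by applying both sides to a fresh variable. Here is a minimal sketch in that style, using my own encoding of terms and values, not the verified algorithm from the paper:

```python
# Coquand-style semantic conversion check (sketch).
# Values: ("fun", closure) for functions, ("neu", head, spine) for
# applications headed by a free variable.

def eval_(term, env):
    """Interpret a term; unbound variables become neutral values."""
    tag = term[0]
    if tag == "var":
        return env.get(term[1], ("neu", term[1], ()))
    if tag == "lam":
        _, x, body = term
        return ("fun", lambda v: eval_(body, {**env, x: v}))
    _, f, a = term
    fv, av = eval_(f, env), eval_(a, env)
    return fv[1](av) if fv[0] == "fun" else ("neu", fv[1], fv[2] + (av,))

def apply(u, v):
    """Apply a value: run the closure, or extend the neutral spine."""
    return u[1](v) if u[0] == "fun" else ("neu", u[1], u[2] + (v,))

def conv(u, v, fresh=0):
    """Check βη-equality of two semantic values."""
    # η-rule: if either side is a function, probe both with a fresh variable.
    if u[0] == "fun" or v[0] == "fun":
        x = ("neu", f"x{fresh}", ())
        return conv(apply(u, x), apply(v, x), fresh + 1)
    # Neutrals: same head and pointwise-convertible argument spines.
    return (u[1] == v[1] and len(u[2]) == len(v[2])
            and all(conv(a, b, fresh) for a, b in zip(u[2], v[2])))
```

Because functions are probed with a fresh variable, λx.f x and f compare equal without any explicit η-expansion pass.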