Results 1 - 5 of 5
Normalization by evaluation for Martin-Löf type theory with one universe
 IN 23RD CONFERENCE ON THE MATHEMATICAL FOUNDATIONS OF PROGRAMMING SEMANTICS, MFPS XXIII, ELECTRONIC NOTES IN THEORETICAL COMPUTER SCIENCE, 2007
Irrelevance in Type Theory with a Heterogeneous Equality Judgement
Abstract

Cited by 7 (1 self)
Abstract. Dependently typed programs contain an excessive amount of static terms which are necessary to please the type checker but irrelevant for computation. To obtain reasonable performance of not only the compiled program but also the type checker, such static terms need to be erased as early as possible, preferably immediately after type checking. To this end, Pfenning’s type theory with irrelevant quantification, which models a distinction between static and dynamic code, is extended to universes and large eliminations. Novel is a heterogeneously typed implementation of equality, which allows the smooth construction of a universal Kripke model that proves normalization, consistency and decidability.
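The erasure step the abstract describes can be sketched roughly as follows. This is a hypothetical illustration, not the paper's formal system: the tagged-tuple AST, the irrelevance flag, and the `erase` function are my own encoding of the idea that statically irrelevant binders and arguments may be dropped immediately after type checking.

```python
# Hypothetical sketch: terms are tagged tuples, and lambda binders /
# applications carry a flag marking the argument as irrelevant (static,
# erasable) or relevant (dynamic, kept at run time).

def erase(t):
    """Drop irrelevant binders and arguments before evaluation."""
    tag = t[0]
    if tag == "var":                      # ("var", name)
        return t
    if tag == "lam":                      # ("lam", irrelevant?, name, body)
        _, irr, x, body = t
        return erase(body) if irr else ("lam", False, x, erase(body))
    if tag == "app":                      # ("app", irrelevant?, fun, arg)
        _, irr, f, a = t
        return erase(f) if irr else ("app", False, erase(f), erase(a))
    raise ValueError(tag)
```

For example, applying a function to an irrelevant proof argument erases to just the dynamic part, so the proof term never reaches the compiled program.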
Connecting a logical framework to a first-order logic prover
 IN FROCOS’05: PROCEEDINGS OF THE 5TH INTERNATIONAL WORKSHOP ON FRONTIERS OF COMBINING SYSTEMS, 2005
Abstract

Cited by 4 (0 self)
We present one way of combining a logical framework and first-order logic. The logical framework is used as an interface to a first-order theorem prover. Its main purpose is to keep track of the structure of the proof and to deal with the high-level steps, for instance, induction. The steps that involve purely propositional or simple first-order reasoning are left to a first-order resolution prover (the system Gandalf in our prototype). The correctness of this interaction is based on a general meta-theoretic result. One feature is the simplicity of our translation between the logical framework and first-order logic, which uses implicit typing. Implementation and case studies are described.
Explicit Substitutions for Contextual Type Theory
Abstract
In this paper, we present an explicit substitution calculus which distinguishes between ordinary bound variables and meta-variables. Its typing discipline is derived from contextual modal type theory. We first present a dependently typed lambda calculus with explicit substitutions for ordinary variables and explicit meta-substitutions for meta-variables. We then present a weak head normalization procedure which performs both substitutions lazily and in a single pass, thereby combining substitution walks for the two different classes of variables. Finally, we describe a bidirectional type checking algorithm which uses weak head normalization, and we prove its soundness.
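The core mechanism of weak head normalization with explicit substitutions can be sketched for ordinary variables alone. This is a simplified, hypothetical illustration (the encoding and the `whnf` function are mine, and the paper's calculus additionally handles meta-variables and meta-substitutions): the substitution is kept suspended as an environment of values and is only pushed under a binder when a beta step forces it.

```python
# Hypothetical sketch: de Bruijn terms ("ix", i), ("lam", body),
# ("app", fun, arg). A value is either a closure ("clo", body, env) --
# a lambda under a suspended explicit substitution -- or a neutral
# ("ne", head, args) stuck on a free variable.

def whnf(env, t):
    """Reduce t under the explicit substitution env to weak head normal form."""
    tag = t[0]
    if tag == "ix":
        i = t[1]
        if i < len(env):
            return env[i]                  # look up the suspended substitution
        return ("ne", i - len(env), [])    # free variable: neutral head
    if tag == "lam":
        return ("clo", t[1], env)          # suspend the substitution (closure)
    if tag == "app":
        f = whnf(env, t[1])
        a = whnf(env, t[2])
        if f[0] == "clo":                  # beta: extend the substitution
            return whnf([a] + list(f[2]), f[1])
        return ("ne", f[1], f[2] + [a])    # stuck: record the argument
    raise ValueError(tag)
```

For instance, applying the identity `lam (ix 0)` to itself yields the identity closure, with the substitution extended in place of a separate substitution pass.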
ON IRRELEVANCE AND ALGORITHMIC EQUALITY IN PREDICATIVE TYPE THEORY
Abstract
Abstract. Dependently typed programs contain an excessive amount of static terms which are necessary to please the type checker but irrelevant for computation. To obtain reasonable performance of not only the compiled program but also the type checker, such static terms need to be erased as early as possible, preferably immediately after type checking. To this end, Pfenning’s type theory with irrelevant quantification, which models a distinction between static and dynamic code, is extended to universes and large eliminations. Normalization, consistency, and decidability are obtained via a universal Kripke model based on algorithmic equality.
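The "algorithmic equality" the abstract refers to can be caricatured as a structural comparison that simply skips irrelevant positions. This is a hypothetical sketch, not the paper's (typed, Kripke-model-justified) algorithm: the tagged-tuple AST and `alg_eq` are my own, terms are assumed already in normal form, and bound-variable names are assumed alpha-normalized.

```python
# Hypothetical sketch: ("var", name), ("lam", irrelevant?, name, body),
# ("app", irrelevant?, fun, arg). Two terms are algorithmically equal when
# they agree structurally everywhere except in irrelevant argument
# positions, which are ignored outright.

def alg_eq(t, u):
    """Structural equality that skips irrelevant arguments."""
    if t[0] != u[0]:
        return False
    if t[0] == "var":
        return t[1] == u[1]
    if t[0] == "lam":                      # compare flags and bodies
        return t[1] == u[1] and alg_eq(t[3], u[3])
    if t[0] == "app":
        if t[1] != u[1]:                   # relevance annotations must agree
            return False
        # irrelevant arguments are skipped: any two static terms count as equal
        return alg_eq(t[2], u[2]) and (t[1] or alg_eq(t[3], u[3]))
    raise ValueError(t[0])
```

So two applications of the same function to different proof terms compare equal when the argument position is irrelevant, but not when it is relevant.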