Results 1–10 of 25
Automating the Meta Theory of Deductive Systems
, 2000
"... not be interpreted as representing the o cial policies, either expressed or implied, of NSF or the U.S. Government. This thesis describes the design of a metalogical framework that supports the representation and veri cation of deductive systems, its implementation as an automated theorem prover, a ..."
Abstract

Cited by 80 (16 self)
not be interpreted as representing the official policies, either expressed or implied, of NSF or the U.S. Government. This thesis describes the design of a metalogical framework that supports the representation and verification of deductive systems, its implementation as an automated theorem prover, and experimental results related to the areas of programming languages, type theory, and logics. Design: The metalogical framework extends the logical framework LF [HHP93] by a metalogic M+2. This design is novel and unique since it allows higher-order encodings of deductive systems and induction principles to coexist. On the one hand, higher-order representation techniques lead to concise and direct encodings of programming languages and logic calculi. Inductive definitions, on the other hand, allow the formalization of properties about deductive systems, such as the proof that an operational semantics preserves types or the proof that a logic is consistent. M+2 is a proof calculus whose proof terms are recursive functions that may be ...
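The "higher-order representation techniques" mentioned in this abstract encode object-language binders as meta-language functions (higher-order abstract syntax), so capture-avoiding substitution is inherited from the meta-language for free. A loose Python sketch of that idea (my illustration only; LF itself is a dependently typed framework, not Python):

```python
# Higher-order abstract syntax: an object-level lambda stores a host-language
# function for its body, so substitution is just function application.

def Lam(body):            # body: a Python function Term -> Term
    return ("lam", body)

def App(f, a):
    return ("app", f, a)

def whnf(t):
    """Reduce to weak head normal form; beta-reduction needs no
    capture-avoiding substitution machinery of its own."""
    if t[0] == "app":
        f = whnf(t[1])
        if f[0] == "lam":
            return whnf(f[1](t[2]))   # substitute by calling the body
        return ("app", f, t[2])
    return t

identity = Lam(lambda x: x)           # \x. x
# whnf(App(identity, identity)) is again a "lam" node (the identity)
```

The price of this encoding, and a core motivation for the metalogic described above, is that induction over such function-valued syntax is delicate.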
Classical Logic and Computation
, 2000
"... This thesis contains a study of the proof theory of classical logic and addresses the problem of giving a computational interpretation to classical proofs. This interpretation aims to capture features of computation that go beyond what can be expressed in intuitionisticlogic. We introduce several ..."
Abstract

Cited by 59 (7 self)
This thesis contains a study of the proof theory of classical logic and addresses the problem of giving a computational interpretation to classical proofs. This interpretation aims to capture features of computation that go beyond what can be expressed in intuitionistic logic. We introduce several strongly normalising cut-elimination procedures for classical logic. Our procedures are less restrictive than previous strongly normalising procedures, while at the same time retaining the strong normalisation property, which various standard cut-elimination procedures lack. In order to apply proof techniques from term rewriting, including symmetric reducibility candidates and recursive path ordering, we develop term annotations for sequent proofs of classical logic. We then present a sequence-conclusion natural deduction calculus for classical logic and study the correspondence between cut-elimination and normalisation. In contrast to earlier work, which analysed this correspondence in various fragments of intuitionistic logic, we establish the correspondence in classical logic. Finally, we study applications of cut-elimination. In particular, we analyse several classical proofs with respect to their behaviour under cut-elimination. Because our cut-elimination procedures impose fewer constraints than previous procedures, we are able to show how a fragment of classical logic can be seen as a typing system for the simply-typed lambda calculus extended with an erratic choice operator. As a pleasing consequence, we can give a simple computational interpretation to Lafont's example.
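The "erratic choice operator" at the end of this abstract is a nondeterministic choice between two subterms. As a rough illustration (mine, not the thesis's calculus), erratic choice can be simulated by collecting every possible outcome, list-monad style:

```python
# Erratic choice: a computation may continue with either argument.
# Representing a nondeterministic value as the list of its possible
# outcomes makes choice simply list concatenation.

def amb(xs, ys):
    """Erratic choice between two nondeterministic values."""
    return xs + ys

def bind(xs, f):
    """Sequence a nondeterministic computation (list-monad bind)."""
    return [y for x in xs for y in f(x)]

# Choose 1 or 2, then double the chosen value: outcomes [2, 4].
doubled = bind(amb([1], [2]), lambda x: [2 * x])
```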
Metaprogramming with Built-in Type Equality (Extended Abstract)
, 2004
"... Tim Sheard sheard@cse.ogi.edu Emir Pasalic + pasalic@cse.ogi.edu ABSTRACT We report our experience with exploring a new point in the design space for formal reasoning systems: the development of the programming language##ngu .##209 is intended as both a practical programming language and ..."
Abstract

Cited by 18 (3 self)
Tim Sheard (sheard@cse.ogi.edu) and Emir Pasalic (pasalic@cse.ogi.edu). We report our experience with exploring a new point in the design space for formal reasoning systems: the development of the programming language Ωmega. Ωmega is intended as both a practical programming language and a logic. The main goal of Ωmega is to allow programmers to describe and reason about semantic properties of programs from within the programming language itself, mainly by using a powerful type system.
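Built-in type equality of this kind is what lets a typed interpreter be checked statically (much as GADTs do in Haskell): constructors carry type equalities, so ill-typed expression trees cannot even be built. Python cannot express that statically, but a sketch of the same interpreter shape, with a checker run before evaluation, conveys the idea (all names here are my own invention):

```python
# A small typed expression language. With built-in type equality the
# checker below is redundant: the host type system rules out ill-typed
# trees at compile time. Python can only check dynamically.

def typeof(e):
    tag = e[0]
    if tag == "int":
        return "Int"
    if tag == "bool":
        return "Bool"
    if tag == "add":
        assert typeof(e[1]) == "Int" and typeof(e[2]) == "Int"
        return "Int"
    if tag == "if":
        assert typeof(e[1]) == "Bool"
        t = typeof(e[2])
        assert typeof(e[3]) == t      # both branches share one type
        return t
    raise ValueError(tag)

def eval_expr(e):
    tag = e[0]
    if tag in ("int", "bool"):
        return e[1]
    if tag == "add":
        return eval_expr(e[1]) + eval_expr(e[2])
    if tag == "if":
        return eval_expr(e[2]) if eval_expr(e[1]) else eval_expr(e[3])

prog = ("if", ("bool", True), ("add", ("int", 1), ("int", 2)), ("int", 0))
# typeof(prog) == "Int"; eval_expr(prog) == 3
```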
Bigraphical Semantics of Higher-Order Mobile Embedded Resources with Local Names
 Proceedings of the Graph Transformation for Verification and Concurrency workshop (GTVC'05)
, 2006
"... Bigraphs have been introduced with the aim to provide a topographical metamodel for mobile, distributed agents that can manipulate their own linkages and nested locations, generalising both characteristics of the πcalculus and the Mobile Ambients calculus. We give the first bigraphical presentatio ..."
Abstract

Cited by 17 (10 self)
Bigraphs have been introduced with the aim to provide a topographical meta-model for mobile, distributed agents that can manipulate their own linkages and nested locations, generalising both characteristics of the π-calculus and the Mobile Ambients calculus. We give the first bigraphical presentation of a non-linear, higher-order process calculus with nested locations, non-linear active process mobility, and local names, the calculus of Higher-Order Mobile Embedded Resources (Homer). The presentation is based on Milner's recent presentation of the λ-calculus in local bigraphs. The combination of non-linear active process mobility and local names requires a new definition of parametric reaction rules and a representation of the location of names. We suggest localised bigraphs as a generalisation of local bigraphs in which links can be further localised. Key words: bigraphs, local names, non-linear process mobility
Conservative extensions of the λ-calculus for the computational interpretation of sequent calculus
, 2002
"... ..."
Strong normalisation for a Gentzen-like cut-elimination procedure
 In TLCA
, 2001
"... Abstract. In this paper we introduce a cutelimination procedure for classical logic, which is both strongly normalising and consisting of local proof transformations. Traditional cutelimination procedures, including the one by Gentzen, are formulated so that they only rewrite neighbouring inferenc ..."
Abstract

Cited by 6 (0 self)
Abstract. In this paper we introduce a cut-elimination procedure for classical logic which is strongly normalising and consists of local proof transformations. Traditional cut-elimination procedures, including the one by Gentzen, are formulated so that they only rewrite neighbouring inference rules; that is, they use local proof transformations. Unfortunately, such local proof transformations, if defined naïvely, break the strong normalisation property. Inspired by work of Bloo and Geuvers concerning the λx-calculus, we shall show that a simple trick allows us to preserve this property in our cut-elimination procedure. We shall establish this property using the recursive path ordering by Dershowitz.
Implicit and Explicit Aspects of Scope and Block Structure
"... Block structure is a fundamental mechanism for expressing the scope of variables in a computation. Scope can be expressed either explicitly with formal parameters or implicitly with free variables. We discuss the transformations between explicit and implicit scope. Making scope explicit is current p ..."
Abstract

Cited by 5 (2 self)
Block structure is a fundamental mechanism for expressing the scope of variables in a computation. Scope can be expressed either explicitly with formal parameters or implicitly with free variables. We discuss the transformations between explicit and implicit scope. Making scope explicit is current practice in functional programming: it is called "lambda-lifting." Our thesis addresses the transformations between explicit and implicit scope. We show lambda-dropping to be a useful transformation that can be applied to clarify the structure of programs and to increase the efficiency of recursive functions. In addition, we demonstrate that lambda-dropping is of practical use as a backend in a partial evaluator.

1 Preface

This document describes most of the work that was done in conjunction with the author's "speciale" (master's thesis). One of the primary subjects of this work is a program transformation called "lambda-dropping." It was first conceived by my thesis advisor, Olivier Dan...
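Lambda-lifting makes implicit scope explicit: each free variable of a local function becomes an extra formal parameter, so the function can move to the top level (lambda-dropping is the inverse). A small before/after sketch in Python (a hypothetical example, not taken from the thesis):

```python
# Before lifting: the inner function reads n implicitly, as a free
# variable captured from the enclosing block (implicit scope).
def sum_to(n):
    def go(acc, i):
        if i > n:                 # n is free in go
            return acc
        return go(acc + i, i + 1)
    return go(0, 0)

# After lambda-lifting: n is an explicit parameter, and go_lifted is a
# self-contained top-level function (explicit scope).
def go_lifted(n, acc, i):
    if i > n:
        return acc
    return go_lifted(n, acc + i, i + 1)

def sum_to_lifted(n):
    return go_lifted(n, 0, 0)

# Both compute 0 + 1 + ... + n, e.g. sum_to(10) == sum_to_lifted(10) == 55
```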
Strong normalisation of Herbelin's explicit substitution calculus with substitution propagation
"... . Herbelin presented (at CSL'94) a simple sequent calculus for minimal implicational logic, extensible to full rstorder intuitionistic logic, with a complete system of cutreduction rules which is both conuent and strongly normalising. Some of the cut rules may be regarded as rules to construc ..."
Abstract

Cited by 5 (0 self)
Herbelin presented (at CSL'94) a simple sequent calculus for minimal implicational logic, extensible to full first-order intuitionistic logic, with a complete system of cut-reduction rules which is both confluent and strongly normalising. Some of the cut rules may be regarded as rules to construct explicit substitutions. He observed that the addition of a cut permutation rule, for propagation of such substitutions, breaks the proof of strong normalisation; the implicit conjecture is that the rule may be added without breaking strong normalisation. We prove this conjecture, thus showing how to model beta-reduction in his calculus (extended with rules to allow cut permutations).

1 Introduction

Herbelin gave in [5] a calculus for minimal implicational logic, using a notation for proof terms that, in contrast to the usual lambda-calculus notation for natural deduction, brings head variables to the surface. It is thus a sequent calculus, with the nice feature that its cut-free terms are in ...
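An explicit substitution calculus makes substitution a term constructor with its own propagation rules; the cut permutation rule discussed in this abstract plays that propagation role. A generic Python sketch of substitution propagation (my illustration of explicit substitutions in general, not Herbelin's calculus; it ignores variable capture for brevity):

```python
# Terms: ("var", x), ("lam", x, body), ("app", f, a), and an explicit,
# suspended substitution ("sub", x, s, t), read as t[x := s].

def propagate(t):
    """Push explicit substitutions through a term until none remain.
    (A real implementation would rename bound variables to avoid capture.)"""
    if t[0] != "sub":
        return t
    _, x, s, body = t
    body = propagate(body)            # resolve inner substitutions first
    tag = body[0]
    if tag == "var":
        return s if body[1] == x else body
    if tag == "app":
        return ("app", propagate(("sub", x, s, body[1])),
                       propagate(("sub", x, s, body[2])))
    if tag == "lam":
        if body[1] == x:              # x is shadowed: stop propagating
            return body
        return ("lam", body[1], propagate(("sub", x, s, body[2])))

def beta(t):
    """Beta-reduction creates a suspended substitution instead of
    substituting eagerly."""
    if t[0] == "app" and t[1][0] == "lam":
        _, x, body = t[1]
        return ("sub", x, t[2], body)
    return t

# (\x. x y) z  -->  (x y)[x := z]  -->  z y
redex = ("app", ("lam", "x", ("app", ("var", "x"), ("var", "y"))), ("var", "z"))
```

Separating beta from propagation in this way is exactly what makes the normalisation question delicate: the propagation rules can interact with beta steps, which is why adding the permutation rule endangers the strong normalisation proof.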