Results 1 – 7 of 7
Strategic Computation and Deduction
, 2009
"... I'd like to conclude by emphasizing what a wonderful field this is to work in. Logical reasoning plays such a fundamental role in the spectrum of intellectual activities that advances in automating logic will inevitably have a profound impact in many intellectual disciplines. Of course, these things take ..."
Abstract

Cited by 4 (3 self)
I'd like to conclude by emphasizing what a wonderful field this is to work in. Logical reasoning plays such a fundamental role in the spectrum of intellectual activities that advances in automating logic will inevitably have a profound impact in many intellectual disciplines. Of course, these things take time. We tend to be impatient, but we need some historical perspective. The study of logic has a very long history, going back at least as far as Aristotle. During some of this time not very much progress was made. It's gratifying to realize how much has been accomplished in the less than fifty years since serious efforts to mechanize logic began.
A sequent calculus for type theory
 CSL 2006. LNCS
, 2006
"... Based on natural deduction, Pure Type Systems (PTS) can express a wide range of type theories. In order to express proof-search in such theories, we introduce the Pure Type Sequent Calculi (PTSC) by enriching a sequent calculus due to Herbelin, adapted to proof-search and strongly related to natural ..."
Abstract

Cited by 3 (0 self)
Based on natural deduction, Pure Type Systems (PTS) can express a wide range of type theories. In order to express proof-search in such theories, we introduce the Pure Type Sequent Calculi (PTSC) by enriching a sequent calculus due to Herbelin, adapted to proof-search and strongly related to natural deduction. PTSC are equipped with a normalisation procedure, adapted from Herbelin’s and defined by local rewrite rules as in cut-elimination, using explicit substitutions. It satisfies Subject Reduction and it is confluent. A PTSC is logically equivalent to its corresponding PTS, and the former is strongly normalising if and only if the latter is. We show how the conversion rules can be incorporated inside logical rules (as in syntax-directed rules for type checking), so that basic proof-search tactics in type theory are merely the root-first application of our inference rules.
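The "root-first application of inference rules" that the abstract describes can be made concrete with a toy prover. The sketch below is not the PTSC of the paper — it handles only the implication/conjunction fragment of propositional intuitionistic logic, and all names are illustrative — but it shows how a basic proof-search tactic is just the bottom-up application of sequent rules to the goal sequent:

```python
# Toy root-first proof search for intuitionistic sequents Gamma |- goal.
# Formulas: atoms are strings; ('->', a, b) and ('&', a, b) are compounds.
# A depth bound stands in for a real loop-checking strategy.

def prove(gamma, goal, depth=8):
    """Search for a derivation of the sequent gamma |- goal."""
    if depth == 0:
        return False
    if goal in gamma:                                   # axiom rule
        return True
    if isinstance(goal, tuple) and goal[0] == '->':     # right implication
        _, a, b = goal
        return prove(gamma | {a}, b, depth - 1)
    if isinstance(goal, tuple) and goal[0] == '&':      # right conjunction
        _, a, b = goal
        return prove(gamma, a, depth - 1) and prove(gamma, b, depth - 1)
    for h in list(gamma):                               # left implication
        if isinstance(h, tuple) and h[0] == '->':
            _, a, b = h
            if b not in gamma and prove(gamma - {h}, a, depth - 1):
                if prove((gamma - {h}) | {b}, goal, depth - 1):
                    return True
    return False

print(prove(frozenset(), ('->', 'p', ('->', 'q', 'p'))))  # True  (the K axiom)
print(prove(frozenset(), 'p'))                            # False (p is no theorem)
```

Each `if` branch corresponds to reading one inference rule bottom-up, which is the sense in which tactics coincide with root-first rule application.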
Explicit Substitutions and All That
, 2000
"... Explicit substitution calculi are extensions of the lambda-calculus where the substitution mechanism is internalized into the theory. This feature makes them suitable for implementation and theoretical study of logic-based tools such as strongly typed programming languages and proof assistant systems. In ..."
Abstract

Cited by 3 (3 self)
Explicit substitution calculi are extensions of the lambda-calculus where the substitution mechanism is internalized into the theory. This feature makes them suitable for implementation and theoretical study of logic-based tools such as strongly typed programming languages and proof assistant systems. In this paper we explore new developments in two of the most successful styles of explicit substitution calculi: the λσ- and λse-calculi.
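The core idea — substitution as a term constructor rewritten by local rules rather than a meta-level operation — can be sketched in a few lines. The toy calculus below is far simpler than the λσ- or λse-calculus (de Bruijn indices, one suspended-substitution node, and it assumes the substituted term carries no pending substitutions of its own); it only illustrates the internalization the abstract describes:

```python
# Terms in de Bruijn notation:
#   ('var', n) | ('lam', t) | ('app', t, u) | ('sub', t, n, u)
# where ('sub', t, n, u) suspends the substitution t[n := u] as a term.

def shift(t, d, cutoff=0):
    """Add d to every free de Bruijn index >= cutoff."""
    if t[0] == 'var':
        return ('var', t[1] + d) if t[1] >= cutoff else t
    if t[0] == 'lam':
        return ('lam', shift(t[1], d, cutoff + 1))
    return ('app', shift(t[1], d, cutoff), shift(t[2], d, cutoff))

def step(t):
    """One rewrite step: beta creates an explicit closure; the remaining
    rules push the suspended substitution inward, locally."""
    if t[0] == 'app' and t[1][0] == 'lam':              # beta
        return ('sub', t[1][1], 0, t[2])
    if t[0] == 'sub':
        body, n, u = t[1], t[2], t[3]
        if body[0] == 'var':
            if body[1] == n:
                return shift(u, n)                      # substitution hits
            return ('var', body[1] - 1) if body[1] > n else body
        if body[0] == 'app':                            # distribute over app
            return ('app', ('sub', body[1], n, u), ('sub', body[2], n, u))
        if body[0] == 'lam':                            # go under the binder
            return ('lam', ('sub', body[1], n + 1, u))
    return t

def normalize(t):
    t2 = step(t)
    if t2 != t:
        return normalize(t2)
    if t[0] == 'lam':
        return ('lam', normalize(t[1]))
    if t[0] == 'app':
        new = ('app', normalize(t[1]), normalize(t[2]))
        return normalize(new) if new != t else new
    return t

I = ('lam', ('var', 0))                   # identity, λ.0
K = ('lam', ('lam', ('var', 1)))          # K combinator, λ.λ.1
print(normalize(('app', ('app', K, ('var', 5)), ('var', 7))))  # ('var', 5)
```

Because every rule is a local rewrite on the term, implementations can interleave and delay substitution steps — the property that makes these calculi attractive for the tools mentioned in the abstract.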
More On Implicit Syntax
 In Automated Reasoning. First International Joint Conference (IJCAR'01)
"... Proof assistants based on type theories, such as Coq and Lego, allow users to omit subterms on input that can be inferred automatically. While those mechanisms are well known, ad-hoc algorithms are used to suppress subterms on output. As a result, terms might be printed identically although they differ ..."
Abstract

Cited by 2 (0 self)
Proof assistants based on type theories, such as Coq and Lego, allow users to omit subterms on input that can be inferred automatically. While those mechanisms are well known, ad-hoc algorithms are used to suppress subterms on output. As a result, terms might be printed identically although they differ in hidden parts. Such ambiguous representations may confuse users. Additionally, terms might be rejected by the type checker because the printer has erased too much type information. This paper addresses these problems by proposing effective erasure methods that guarantee successful term reconstruction, similar to the ones developed for the compression of proof terms in Proof-Carrying Code environments. Experiences with the implementation in Typelab proved them both efficient and practical.

1 Implicit Syntax

Type theories are powerful formal systems that capture both the notion of computation and deduction. Particularly the expressive theories, such as the Calculus of Constructions (CC) [CH88], which is investigated in this paper, are used for the development of mathematical and algorithmic theories, since proofs and specifications are representable in a very direct way using one uniform language. There is a price to pay for this expressiveness: abstractions have to be decorated with annotations, and type applications have to be written explicitly, because type abstraction and type application are just special cases of λ-abstraction and application. For example, to form a list one has to provide the element type as an additional argument to instantiate the polymorphic constructors cons and nil, as in (cons IN 1 (nil IN)). Also, one has to annotate the abstraction of n in λn:IN. n+1 with its type IN, although this type is determined by the abstraction body. These excessive ...
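The inference of omitted arguments that the abstract's (cons IN 1 (nil IN)) example relies on can be illustrated with a tiny elaborator. This is a hedged sketch, not Typelab's algorithm: the term encoding and the names `elaborate`/`infer_type` are invented for the illustration, and the inferable types are limited to two base types:

```python
# The user writes `cons 1 nil`; the elaborator inserts the hidden element
# type, producing the fully explicit form cons nat 1 (nil nat), analogous
# to (cons IN 1 (nil IN)) in the abstract.

def infer_type(v):
    """Infer the element type from an explicit argument (bool before int,
    since bool is a subclass of int in Python)."""
    return {bool: 'bool', int: 'nat'}[type(v)]

def elaborate(term):
    """Fill in the type argument the user omitted on input."""
    if term == ('nil',):
        raise TypeError('cannot infer the element type of a bare nil')
    _, hd, tl = term                       # term = ('cons', head, tail)
    elem = infer_type(hd)
    tl2 = ('nil', elem) if tl == ('nil',) else elaborate(tl)
    return ('cons', elem, hd, tl2)

print(elaborate(('cons', 1, ('nil',))))   # ('cons', 'nat', 1, ('nil', 'nat'))
```

The `TypeError` for a bare nil mirrors the real difficulty: an implicit argument can only be suppressed where the remaining term still determines it, which is exactly the guarantee the paper's erasure methods aim to provide for output.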
A sequent calculus for type theory
 CSL 2006. LNCS
, 2006
"... Based on natural deduction, Pure Type Systems (PTS) can express a wide range of type theories. In order to express proof-search in such theories, we introduce the Pure Type Sequent Calculi (PTSC) by enriching a sequent calculus due to Herbelin, adapted to proof-search and strongly related to ..."
Abstract

Cited by 2 (0 self)
Based on natural deduction, Pure Type Systems (PTS) can express a wide range of type theories. In order to express proof-search in such theories, we introduce the Pure Type Sequent Calculi (PTSC) by enriching a sequent calculus due to Herbelin, adapted to proof-search and strongly related to natural deduction. PTSC are equipped with a normalisation procedure, adapted from Herbelin’s and defined by local rewrite rules as in cut-elimination, using explicit substitutions. It satisfies Subject Reduction and it is confluent. A PTSC is logically equivalent to its corresponding PTS, and the former is strongly normalising if and only if the latter is. We show how the conversion rules can be incorporated inside logical rules (as in syntax-directed rules for type checking), so that basic proof-search tactics in type theory are merely the root-first application of our inference rules.
Unification via the λse-Style of Explicit Substitutions
, 2001
"... A unification method based on the λse-style of explicit substitution is proposed. This method, together with appropriate translations, provides a Higher Order Unification (HOU) procedure for the pure λ-calculus. Our method is influenced by the treatment introduced by Dowek, Hardin and Kirchner using the λσ-style ..."
Abstract

Cited by 2 (2 self)
A unification method based on the λse-style of explicit substitution is proposed. This method, together with appropriate translations, provides a Higher Order Unification (HOU) procedure for the pure λ-calculus. Our method is influenced by the treatment introduced by Dowek, Hardin and Kirchner using the λσ-style of explicit substitution. Correctness and completeness properties of the proposed λse-unification method are shown, and its advantages, inherited from the qualities of the λse-calculus, are pointed out. Our method needs only one sort of objects: terms. In contrast to the HOU approach based on the λσ-calculus, it avoids the use of substitution objects. This makes our method closer to the syntax of the λ-calculus. Furthermore, detection of redexes depends on the search for solutions of simple arithmetic constraints, which makes our method more operational than the one based on the λσ-style of explicit substitution. Keywords: higher order unification, explicit substitution, lambda-calculi.
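For contrast with the higher-order problem the abstract tackles, the first-order base case — plain syntactic unification, Robinson-style — fits in a short sketch. This is not the λse method; it is the standard algorithm that HOU procedures generalize, with an invented encoding (variables are strings starting with `?`, applications are tuples):

```python
# First-order syntactic unification with occurs check.
# A substitution is a dict from variables to terms.

def is_var(t):
    return isinstance(t, str) and t.startswith('?')

def walk(t, s):
    """Chase variable bindings in substitution s."""
    while is_var(t) and t in s:
        t = s[t]
    return t

def occurs(v, t, s):
    t = walk(t, s)
    if t == v:
        return True
    return isinstance(t, tuple) and any(occurs(v, x, s) for x in t)

def unify(a, b, s=None):
    """Return a most general unifier extending s, or None on failure."""
    s = dict(s or {})
    a, b = walk(a, s), walk(b, s)
    if a == b:
        return s
    if is_var(a):
        if occurs(a, b, s):
            return None                   # e.g. ?X =? f(?X) has no solution
        s[a] = b
        return s
    if is_var(b):
        return unify(b, a, s)
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            s = unify(x, y, s)
            if s is None:
                return None
        return s
    return None

# f(?X, g(a)) =? f(b, g(?Y))  gives  ?X = b, ?Y = a
print(unify(('f', '?X', ('g', 'a')), ('f', 'b', ('g', '?Y'))))
```

In the higher-order setting of the paper, terms contain binders, so this syntactic decomposition no longer suffices; that is where the λse-calculus and its arithmetic constraints on de Bruijn indices come in.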
Type checking in the presence of metavariables
"... In this paper we present a type checking algorithm for a dependently typed logical framework extended with metavariables. It is common for such frameworks to accept that unification creates substitutions that are not well typed [4, 6, 16], but we give a novel approach to the treatment of metavari ..."
Abstract
In this paper we present a type checking algorithm for a dependently typed logical framework extended with metavariables. It is common for such frameworks to accept that unification creates substitutions that are not well typed [4, 6, 16], but we give a novel approach to the treatment of metavariables where well-typedness of substitutions is guaranteed. To ensure type correctness the type checker creates an optimal well-typed approximation of the term being type checked. We use a restricted form of pattern unification, but we believe that the results carry over to other unification algorithms. We prove that the algorithm is sound and terminating. The proposed algorithm has been implemented with promising results.
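The general shape of "type checking in the presence of metavariables" — unknowns become metavariables, checking emits constraints, a solver instantiates them — can be sketched in miniature. This is only a simply typed illustration with invented names (the paper's setting is dependently typed, with pattern unification and well-typedness guarantees this sketch does not attempt), and the occurs check is omitted for brevity:

```python
import itertools

# Metavariables are ('?m', i); `check` returns a type while collecting
# equality constraints; a tiny unifier then solves for the metavariables.

_fresh = itertools.count()

def meta():
    return ('?m', next(_fresh))

def is_meta(t):
    return isinstance(t, tuple) and t[0] == '?m'

def check(env, term, constraints):
    """Return the type of `term`, collecting constraints on metavariables."""
    if term[0] == 'var':
        return env[term[1]]
    if term[0] == 'lam':                  # lam x. body  :  ?m -> T_body
        _, x, body = term
        a = meta()
        return ('->', a, check({**env, x: a}, body, constraints))
    _, f, arg = term                      # app: require  T_f == T_arg -> ?m
    tf = check(env, f, constraints)
    ta = check(env, arg, constraints)
    r = meta()
    constraints.append((tf, ('->', ta, r)))
    return r

def walk(t, s):
    while is_meta(t) and t in s:
        t = s[t]
    return t

def unify(a, b, s):
    a, b = walk(a, s), walk(b, s)
    if a == b:
        return
    if is_meta(a):
        s[a] = b
    elif is_meta(b):
        s[b] = a
    elif isinstance(a, tuple) and isinstance(b, tuple) and a[0] == b[0] == '->':
        unify(a[1], b[1], s)
        unify(a[2], b[2], s)
    else:
        raise TypeError(f'cannot unify {a} and {b}')

def resolve(t, s):
    """Read back a type with all solved metavariables instantiated."""
    t = walk(t, s)
    if isinstance(t, tuple) and t[0] == '->':
        return ('->', resolve(t[1], s), resolve(t[2], s))
    return t

# (lam x. x) n   with   n : nat
cs = []
ty = check({'n': 'nat'}, ('app', ('lam', 'x', ('var', 'x')), ('var', 'n')), cs)
s = {}
for lhs, rhs in cs:
    unify(lhs, rhs, s)
print(resolve(ty, s))                     # nat
```

The paper's contribution lives in what this sketch glosses over: in a dependent theory the solver must also keep every substitution well typed, which is why the checker there builds a well-typed approximation of the term rather than unifying blindly.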