Results 1–10 of 10
From formal proofs to mathematical proofs: A safe, incremental way for building in first-order decision procedures
In TCS 2008: 5th IFIP International Conference on Theoretical Computer Science, 2008
Cited by 11 (0 self)
(CIC) on which the proof assistant Coq is based: the Calculus of Congruent Inductive Constructions, which truly extends CIC by building in arbitrary first-order decision procedures: deduction remains the job of the CIC kernel, while computation is outsourced to dedicated first-order decision procedures that can be taken off the shelf, provided they deliver a proof certificate. The soundness of the whole system becomes an incremental property, following from the soundness of the certificate checkers and that of the kernel. A detailed example shows that the resulting style of proofs becomes closer to that of the working mathematician.
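The division of labor this abstract describes, an untrusted decision procedure that must emit a certificate and a small trusted checker whose soundness the kernel relies on, can be sketched in miniature. The theory chosen here (equality of '+'-expressions modulo associativity and commutativity), the multiset certificate, and all names below are illustrative assumptions, not Coq's or CoqMT's actual interface:

```python
# Toy sketch of a certificate-producing decision procedure outsourced
# from a kernel. Only the checker needs to be trusted.
from collections import Counter

def flatten(expr):
    """Collect the atoms of a nested '+'-expression,
    e.g. ('+', 'a', ('+', 'b', 'c')) -> ['a', 'b', 'c']."""
    if isinstance(expr, tuple) and expr[0] == '+':
        return flatten(expr[1]) + flatten(expr[2])
    return [expr]

def decide_ac_plus(lhs, rhs):
    """Untrusted decision procedure for equality modulo associativity
    and commutativity of '+'. On success, returns a certificate:
    the shared canonical (sorted) list of atoms."""
    cert = sorted(flatten(lhs))
    return cert if sorted(flatten(rhs)) == cert else None

def check_certificate(lhs, rhs, cert):
    """Trusted checker: the few lines system soundness depends on.
    Validates the certificate against both sides independently."""
    return Counter(flatten(lhs)) == Counter(cert) == Counter(flatten(rhs))

def kernel_accepts(lhs, rhs):
    """The kernel outsources the decision but only accepts the
    equation if the returned certificate checks."""
    cert = decide_ac_plus(lhs, rhs)
    return cert is not None and check_certificate(lhs, rhs, cert)
```

Because only the checker and kernel need to be trusted, swapping in a different, possibly buggy, decision procedure cannot compromise soundness: a bad certificate is simply rejected. This mirrors the incremental soundness property claimed above.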
Coq Modulo Theory
2010
Cited by 5 (1 self)
Coq Modulo Theory (CoqMT) is an extension of the Coq proof assistant incorporating, in its computational mechanism, validity entailment for user-defined first-order equational theories. Such a mechanism strictly enriches the system (more terms are typable), eases the use of dependent types and provides more automation during the development of proofs. CoqMT improves over the Calculus of Congruent Inductive Constructions by getting rid of various restrictions and by simplifying the type-checking algorithm and the integration of first-order decision procedures. We present here CoqMT, and outline its meta-theoretical study. We also give a brief description of our CoqMT implementation.
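A standard motivating example for such theory-aware conversion (our illustration, not taken from the paper) is types indexed by arithmetic expressions. In ordinary intensional type theory, the two vector types below are provably but not definitionally equal, so the user must insert an explicit cast; a CoqMT-style conversion modulo arithmetic would accept `v` directly. Sketched here in Lean 4 syntax:

```lean
-- A length-indexed list (illustrative definition).
inductive Vec (α : Type) : Nat → Type where
  | nil  : Vec α 0
  | cons : α → Vec α n → Vec α (n + 1)

-- Without conversion modulo arithmetic, `Vec α (n + m)` and
-- `Vec α (m + n)` are only propositionally equal, so a cast
-- (rewriting along `Nat.add_comm`) is needed:
example (n m : Nat) (v : Vec α (n + m)) : Vec α (m + n) :=
  Nat.add_comm n m ▸ v
```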
Typed Applicative Structures and Normalization by Evaluation for System Fω
Cited by 4 (0 self)
We present a normalization-by-evaluation (NbE) algorithm for System Fω with βη-equality, the simplest impredicative type theory with computation on the type level. Values are kept abstract and requirements on values are kept to a minimum, allowing many different implementations of the algorithm. The algorithm is verified through a general model construction using typed applicative structures, called type and object structures. Both soundness and completeness of NbE are conceived as instances of a single fundamental theorem.
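The eval-then-reify structure of NbE, with values kept abstract behind a small interface, can be sketched for the untyped fragment; the paper's algorithm additionally runs at the type level of System Fω and is formally verified. All names and representation choices below are our illustrative assumptions:

```python
# Sketch of normalization by evaluation for de Bruijn-indexed lambda
# terms: evaluate syntax into semantic values, then reify ("read back")
# a beta-normal form. Unknown (free) variables become neutral values.
from dataclasses import dataclass
from typing import Callable, Union

@dataclass(frozen=True)
class Var:          # de Bruijn index
    idx: int

@dataclass(frozen=True)
class Lam:
    body: "Term"

@dataclass(frozen=True)
class App:
    fn: "Term"
    arg: "Term"

Term = Union[Var, Lam, App]

@dataclass
class VLam:         # semantic function: values stay abstract
    fn: Callable

@dataclass
class NVar:         # neutral: variable, identified by binding level
    level: int

@dataclass
class NApp:         # neutral: blocked application
    fn: object
    arg: object

def evaluate(env, t):
    """Interpret a term in an environment of values."""
    if isinstance(t, Var):
        return env[t.idx]
    if isinstance(t, Lam):
        return VLam(lambda v: evaluate([v] + env, t.body))
    return apply_value(evaluate(env, t.fn), evaluate(env, t.arg))

def apply_value(f, v):
    return f.fn(v) if isinstance(f, VLam) else NApp(f, v)

def reify(depth, v):
    """Read a beta-normal term back; `depth` counts binders passed."""
    if isinstance(v, VLam):
        return Lam(reify(depth + 1, v.fn(NVar(depth))))
    if isinstance(v, NVar):
        return Var(depth - v.level - 1)   # level -> de Bruijn index
    return App(reify(depth, v.fn), reify(depth, v.arg))

def nbe(t):
    return reify(0, evaluate([], t))
```

For example, `nbe` reduces `(λx. x) (λx. x)` to `λx. x`, and it also normalizes under binders, which a plain call-by-value interpreter does not.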
Strategic Computation and Deduction
2009
Cited by 4 (3 self)
I'd like to conclude by emphasizing what a wonderful field this is to work in. Logical reasoning plays such a fundamental role in the spectrum of intellectual activities that advances in automating logic will inevitably have a profound impact on many intellectual disciplines. Of course, these things take time. We tend to be impatient, but we need some historical perspective. The study of logic has a very long history, going back at least as far as Aristotle. During some of this time not very much progress was made. It's gratifying to realize how much has been accomplished in the less than fifty years since serious efforts to mechanize logic began.
Towards Rewriting in Coq
Cited by 3 (0 self)
Equational reasoning in Coq is not straightforward. For a few years now there has been an ongoing research effort towards adding rewriting to Coq. However, there are many research problems along the way. In this paper we give a coherent view of rewriting in Coq, describing what has already been done and what remains to be done. We discuss such issues as strong normalization, confluence, logical consistency, completeness, modularity and extraction.
High-Level Theories
2008
Cited by 3 (2 self)
We introduce high-level theories in analogy with high-level programming languages. The basic point is that even though one can define many theories via simple, low-level axiomatizations, that is neither an effective nor a comfortable way to work with such theories. We present an approach which is closer to what users of mathematics employ, while still being based on formal structures.
Weak βη-normalization and normalization by evaluation for System F
In LPAR’08, volume 5330 of LNAI, 2008
Cited by 2 (2 self)
A general version of the fundamental theorem for System F is presented which can be instantiated to obtain proofs of weak β- and βη-normalization and of normalization by evaluation.
Type Structures and Normalization by Evaluation for System Fω
We present the first verified normalization-by-evaluation algorithm for System Fω, the simplest impredicative type theory with computation on the type level. Types appear in three shapes: as syntactical types, as type values which direct the reification process, and as semantical types, i.e., sets of total values. The three shapes are captured by the new concept of a type structure, and the fundamental theorem now states that an induced structure is a type substructure. This work is an attempt at an algebraic treatment of type theory based on typed applicative structures rather than categories.
VeriML: A dependently-typed, user-extensible and language-centric approach to proof assistants
2013
Software certification is a promising approach to producing programs which are virtually free of bugs. It requires the construction of a formal proof which establishes that the code in question will behave according to its specification – a higher-level description of its functionality. The construction of such formal proofs is carried out in tools called proof assistants. Advances in the current state-of-the-art proof assistants have enabled the certification of a number of complex and realistic systems software. Despite such success stories, large-scale proof development is an arcane art that requires significant manual effort and is extremely time-consuming. The widely accepted best practice for limiting this effort is to develop domain-specific automation procedures to handle all but the most essential steps of proofs. Yet this practice is rarely followed, or itself requires comparable development effort. This is due to a profound architectural shortcoming of existing proof assistants: developing automation procedures is currently overly complicated and error-prone. It involves the use of an amalgam of extension languages, each with a different programming model and a set of limitations, and with significant interfacing problems between them. This thesis posits that this situation can be significantly improved by designing a proof assistant with extensibility as the central focus. Towards that effect, I have designed a novel programming language called VeriML.