Results 1–10 of 22
Dependently Typed Functional Programs and their Proofs
, 1999
Abstract

Cited by 70 (13 self)
Research in dependent type theories [ML71a] has, in the past, concentrated on its use in the presentation of theorems and theorem-proving. This thesis is concerned mainly with the exploitation of the computational aspects of type theory for programming, in a context where the properties of programs may readily be specified and established. In particular, it develops technology for programming with dependent inductive families of datatypes and proving those programs correct. It demonstrates the considerable advantage to be gained by indexing data structures with pertinent characteristic information whose soundness is ensured by type-checking, rather than human effort. Type theory traditionally presents safe and terminating computation on inductive datatypes by means of elimination rules which serve as induction principles and, via their associated reduction behaviour, recursion operators [Dyb91]. In the programming language arena, these appear somewhat cumbersome and give rise to unappealing code, complicated by the inevitable interaction between case analysis on dependent types and equational reasoning on their indices, which must appear explicitly in the terms. Thierry Coquand's proposal [Coq92] to equip type theory directly with the kind of …
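As a minimal illustration of the thesis's theme — indexing a datatype with characteristic information so that the type checker, not the programmer, enforces soundness — here is a length-indexed vector, sketched in Lean 4 (the names `Vec` and `head` are ours, not the thesis's):

```lean
-- A vector indexed by its length: the index is maintained by the
-- constructors, so it is checked by the type system, not asserted.
inductive Vec (α : Type) : Nat → Type
  | nil  : Vec α 0
  | cons : {n : Nat} → α → Vec α n → Vec α (n + 1)

-- `head` is total: the index `n + 1` rules out the empty vector,
-- so no run-time check (and no error case) is needed.
def Vec.head {α : Type} {n : Nat} : Vec α (n + 1) → α
  | .cons a _ => a
```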
Inductive Data Type Systems
 THEORETICAL COMPUTER SCIENCE
, 1997
Abstract

Cited by 44 (10 self)
In a previous work (“Abstract Data Type Systems”, TCS 173(2), 1997), the last two authors presented a combined language made of a (strongly normalizing) algebraic rewrite system and a typed λ-calculus enriched by pattern-matching definitions following a certain format, called the “General Schema”, which generalizes the usual recursor definitions for natural numbers and similar “basic inductive types”. This combined language was shown to be strongly normalizing. The purpose of this paper is to reformulate and extend the General Schema in order to make it easily extensible, to capture a more general class of inductive types, called “strictly positive”, and to ease the strong normalization proof of the resulting system. This result provides a computation model for the combination of an algebraic specification language based on abstract data types and of a strongly typed functional language with strictly positive inductive types.
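The contrast between raw recursor definitions and schema-checked pattern-matching definitions is visible even at the type of natural numbers; a Lean 4 sketch (our names, not the paper's):

```lean
-- Addition via the recursor Nat.rec: terminating by construction,
-- but the shape of the recursion is fixed by the eliminator.
def addByRec (m : Nat) : Nat → Nat :=
  Nat.rec (motive := fun _ => Nat) m (fun _ ih => ih + 1)

-- The same function by pattern matching; a format like the General
-- Schema accepts it because the recursive call is on a subterm.
def addByMatch (m : Nat) : Nat → Nat
  | 0     => m
  | n + 1 => addByMatch m n + 1
```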
Elimination with a Motive
 Types for Proofs and Programs (Proceedings of the International Workshop, TYPES’00), volume 2277 of LNCS
, 2002
Abstract

Cited by 37 (12 self)
I present a tactic, BasicElim, for Type Theory based proof systems to apply elimination rules in a refinement setting. Applicable rules are parametric in their conclusion, expressing the leverage hypotheses x⃗ yield on any Φ x⃗ we choose. Φ represents the motive for an elimination: BasicElim's job is to construct a Φ suited to the goal at hand. If these x⃗ inhabit an instance of Φ's domain, I adopt a technique standard in `folklore', generalizing the x⃗ and expressing the restriction by equation. A novel notion of = readily permits dependent equations, and a second tactic, Unify, simplifies the equational hypotheses thus appearing in subgoals. Given such technology, it becomes effective to express properties of datatypes, relations and functions in this style. A small extension couples BasicElim with rewriting, allowing complex techniques to be packaged in a single rule.
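The role of the motive is visible in Lean 4 whenever an elimination rule is applied with the motive made explicit (our example, not the paper's tactic):

```lean
-- Proving 0 + n = n by handing Nat.rec the motive Φ explicitly:
-- Φ k is `0 + k = k`, exactly the generalization the tactic must find.
example (n : Nat) : 0 + n = n :=
  Nat.rec (motive := fun k => 0 + k = k)
    rfl                                  -- Φ 0:  0 + 0 = 0
    (fun k ih => congrArg (· + 1) ih)    -- Φ k → Φ (k + 1)
    n
```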
Building decision procedures in the calculus of inductive constructions
 of Lecture Notes in Computer Science
, 2007
Abstract

Cited by 11 (1 self)
It is commonly agreed that the success of future proof assistants will rely on their ability to incorporate computations within deduction in order to mimic the mathematician when replacing the proof of a proposition P by the proof of an equivalent proposition P′ obtained from P thanks to possibly complex calculations. In this paper, we investigate a new version of the calculus of inductive constructions which incorporates arbitrary decision procedures into deduction via the conversion rule of the calculus. The novelty of the problem in the context of the calculus of inductive constructions lies in the fact that the computation mechanism varies along proof-checking: goals are sent to the decision procedure together with the set of user hypotheses available from the current context. Our main result shows that this extension of the calculus of constructions does not compromise its main properties: confluence, subject reduction, strong normalization and consistency are all preserved.
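For comparison (our example, not the paper's calculus), Lean 4's `decide` tactic likewise discharges a goal by running a decision procedure — though through an explicit proof term checked by the kernel, rather than through the conversion rule itself:

```lean
-- The decidability instance for Nat equality computes the answer;
-- the kernel merely checks the resulting evidence.
example : 2 ^ 10 = 1024 := by decide
```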
From formal proofs to mathematical proofs: A safe, incremental way for building in first-order decision procedures
 In TCS 2008: 5th IFIP International Conference on Theoretical Computer Science
, 2008
Abstract

Cited by 11 (0 self)
We present an extension of the Calculus of Inductive Constructions (CIC), on which the proof assistant Coq is based: the Calculus of Congruent Inductive Constructions, which truly extends CIC by building in arbitrary first-order decision procedures: deduction is still in charge of the CIC kernel, while computation is outsourced to dedicated first-order decision procedures that can be taken from the shelves provided they deliver a proof certificate. The soundness of the whole system becomes an incremental property following from the soundness of the certificate checkers and that of the kernel. A detailed example shows that the resulting style of proofs becomes closer to that of the working mathematician.
Semi-continuous sized types and termination
 In Zoltán Ésik, editor, Computer Science Logic, 20th International Workshop, CSL 2006, 15th Annual Conference of the EACSL
Abstract

Cited by 10 (5 self)
Some type-based approaches to termination use sized types: an ordinal bound for the size of a data structure is stored in its type. A recursive function over a sized type is accepted if it is visible in the type system that recursive calls occur just at a smaller size. This approach is only sound if the type of the recursive function is admissible, i.e., depends on the size index in a certain way. To explore the space of admissible functions in the presence of higher-kinded data types and impredicative polymorphism, a semantics is developed where sized types are interpreted as functions from ordinals into sets of strongly normalizing terms. It is shown that upper semi-continuity of such functions is a sufficient semantic criterion for admissibility. To provide a syntactical criterion, a calculus for semi-continuous functions is developed.
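A rough analogue of a sized type can be written with an explicit size bound on the index (a Lean 4 sketch with hypothetical names; real sized-type systems keep the index implicit and admit ordinal bounds):

```lean
-- Trees bounded in height by the index: `node` strictly raises it.
inductive STree (α : Type) : Nat → Type
  | leaf : {n : Nat} → STree α n
  | node : {n : Nat} → α → STree α n → STree α n → STree α (n + 1)

-- A bound can always be weakened, mirroring the subtyping
-- STree α n ≤ STree α (n + 1) that sized-type systems build in.
def STree.weaken {α : Type} {n : Nat} : STree α n → STree α (n + 1)
  | .leaf       => .leaf
  | .node a l r => .node a l.weaken r.weaken
```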
A Unifying Approach to Recursive and Corecursive Definitions
 In [5]
, 2002
Abstract

Cited by 9 (1 self)
In type theory based logical frameworks, recursive and corecursive definitions are subject to syntactic restrictions that ensure their termination and productivity. These restrictions, however, greatly decrease the expressive power of the language. In this work we propose a general approach for systematically defining fixed points for a broad class of well-given recursive definitions. This approach unifies those based on well-founded orders with those based on complete metrics and contractive functions, thus allowing for mixed recursive/corecursive definitions.
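Lean 4's well-founded recursion illustrates the order-based half of the picture (our example; the paper's contribution is to treat metric-based corecursion with the same machinery):

```lean
-- `halves` is not structurally recursive on `n`, but a decreasing
-- measure justifies it: n - 2 < n whenever the guard n < 2 fails.
def halves (n : Nat) : Nat :=
  if h : n < 2 then 0 else halves (n - 2) + 1
termination_by n
decreasing_by omega
```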
Generalized Iteration and Coiteration for HigherOrder Nested Datatypes
 PROC. OF FOSSACS 2003
, 2003
Abstract

Cited by 8 (5 self)
We solve the problem of extending Bird and Paterson's generalized folds for nested datatypes and its dual to inductive and coinductive constructors of arbitrarily high ranks by appropriately generalizing Mendler-style (co)iteration. As is characteristic of Mendler-style schemes of disciplined (co)recursion, the schemes we propose do not rest on notions like positivity or monotonicity of a constructor and facilitate programming in a natural and elegant style close to programming with the customary letrec construct, where the typings of the schemes, however, guarantee termination. For rank 2, a smoothened version of Bird and Paterson's generalized folds and its dual are achieved; for rank 1, the schemes instantiate to Mendler's original (re)formulation of iteration and coiteration. Several examples demonstrate the power of the approach. Strong normalization of our proposed extension of system Fω of higher-order parametric polymorphism is proven by a reduction-preserving embedding into pure Fω.
Constructor subtyping
, 1999
Abstract

Cited by 7 (3 self)
Constructor subtyping is a form of subtyping in which an inductive type σ is viewed as a subtype of another inductive type τ if τ has more constructors than σ. As suggested in [5, 12], its (potential) uses include proof assistants and functional programming languages. In this paper, we introduce and study the properties of a simply typed λ-calculus with record types and datatypes, which supports record subtyping and constructor subtyping. In the first part of the paper, we show that the calculus is confluent and strongly normalizing. In the second part of the paper, we show that the calculus admits a well-behaved theory of canonical inhabitants, provided one adopts expansive extensionality rules, including η-expansion, surjective pairing, and a suitable expansion rule for datatypes. Finally, in the third part of the paper, we extend our calculus with unbounded recursion and show that confluence is preserved.
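Lean 4 has no constructor subtyping, so the inclusion must be written as an explicit coercion; the sketch below (hypothetical types, our names) shows the embedding that constructor subtyping would make implicit:

```lean
-- `Weekday` has strictly more constructors than `Weekend`;
-- under constructor subtyping one would have Weekend ≤ Weekday.
inductive Weekend | sat | sun
inductive Weekday | sat | sun | mon | tue | wed | thu | fri

-- Without the subtyping, the evident constructor-preserving
-- embedding has to be written (and maintained) by hand.
def Weekend.toWeekday : Weekend → Weekday
  | .sat => .sat
  | .sun => .sun
```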
Implementing a normalizer using sized heterogeneous types
 In Workshop on Mathematically Structured Functional Programming, MSFP
, 2006
Abstract

Cited by 7 (1 self)
In the simply-typed lambda-calculus, a hereditary substitution replaces a free variable in a normal form r by another normal form s of type a, removing freshly created redexes on the fly. It can be defined by lexicographic induction on a and r, thus giving rise to a structurally recursive normalizer for the simply-typed lambda-calculus. We generalize this scheme to simultaneous substitutions, preserving its simple termination argument. We further implement hereditary simultaneous substitutions in a functional programming language with sized heterogeneous inductive types, Fω̂, arriving at an interpreter whose termination can be tracked by the type system of its host programming language.
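The lexicographic measure behind hereditary substitution can be stated in a few lines (a Lean 4 sketch of the measure only, not the paper's normalizer):

```lean
-- Simple types, and their size: when hereditary substitution collapses
-- a redex (λx. r) s at type a ⇒ b, the continuation substitutes at the
-- strictly smaller types a and b, so the pair (size of the type, the
-- term being substituted into) decreases lexicographically.
inductive Ty | base | arrow (a b : Ty)

def Ty.size : Ty → Nat
  | .base      => 1
  | .arrow a b => a.size + b.size + 1
```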