Results 1–10 of 77
Inductive Data Type Systems
 THEORETICAL COMPUTER SCIENCE
, 1997
Abstract

Cited by 47 (10 self)
In a previous work (“Abstract Data Type Systems”, TCS 173(2), 1997), the last two authors presented a combined language made of a (strongly normalizing) algebraic rewrite system and a typed λ-calculus enriched by pattern-matching definitions following a certain format, called the “General Schema”, which generalizes the usual recursor definitions for natural numbers and similar “basic inductive types”. This combined language was shown to be strongly normalizing. The purpose of this paper is to reformulate and extend the General Schema in order to make it easily extensible, to capture a more general class of inductive types, called “strictly positive”, and to ease the strong normalization proof of the resulting system. This result provides a computation model for the combination of an algebraic specification language based on abstract data types and of a strongly typed functional language with strictly positive inductive types.
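As an illustrative sketch only (our example, not the paper's formal system): a strictly positive inductive type is one whose recursive occurrences appear only in positive (covariant) positions, and the structural recursions accepted by formats like the General Schema make every recursive call on a strict subterm.

```haskell
-- Hypothetical example: finitely branching trees.  The recursive
-- occurrence of Tree sits under the covariant list functor, so the
-- type is strictly positive.
data Tree = Leaf Int | Node [Tree]

-- Structural recursion: each recursive call is on a strict subterm
-- (a child of the node), so the definition is terminating.
sumTree :: Tree -> Int
sumTree (Leaf n)  = n
sumTree (Node ts) = sum (map sumTree ts)

main :: IO ()
main = print (sumTree (Node [Leaf 1, Node [Leaf 2, Leaf 3]]))
```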
Elimination with a Motive
 Types for Proofs and Programs (Proceedings of the International Workshop, TYPES’00), volume 2277 of LNCS
, 2002
Abstract

Cited by 38 (12 self)
I present a tactic, BasicElim, for Type Theory-based proof systems to apply elimination rules in a refinement setting. Applicable rules are parametric in their conclusion, expressing the leverage hypotheses x⃗ yield on any Φ x⃗ we choose. Φ represents the motive for an elimination: BasicElim's job is to construct a Φ suited to the goal at hand. If these x⃗ inhabit an instance of Φ's domain, I adopt a technique standard in ‘folklore’, generalizing the x⃗ and expressing the restriction by equation. A novel notion of equality readily permits dependent equations, and a second tactic, Unify, simplifies the equational hypotheses thus appearing in subgoals. Given such technology, it becomes effective to express properties of datatypes, relations and functions in this style. A small extension couples BasicElim with rewriting, allowing complex techniques to be packaged in a single rule.
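For orientation (our notation, not the paper's): for the natural numbers, the eliminator is parametric in its conclusion, the motive Φ; constructing a Φ suited to the current goal is exactly what such a tactic automates.

```latex
\mathsf{natElim} \;:\; \forall \Phi : \mathbb{N} \to \mathsf{Type}.\;
  \Phi\,0 \;\to\;
  \bigl(\forall n : \mathbb{N}.\; \Phi\,n \to \Phi\,(\mathsf{S}\,n)\bigr) \;\to\;
  \forall n : \mathbb{N}.\; \Phi\,n
```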
Structural Recursive Definitions in Type Theory
 Automata, Languages and Programming, 25th International Colloquium, ICALP’98
, 1998
Abstract

Cited by 37 (0 self)
We introduce an extension of the Calculus of Constructions with inductive and coinductive types that preserves strong normalisation for a lazy computation relation. This extension considerably enlarges the expressiveness of the language, enabling a direct translation of recursive programs, while keeping a relatively simple collection of typing rules.

1 Introduction

The last twenty-five years have seen an increasing development of different proof environments based on type theory. Several type theories have been proposed as a foundation of such proof environments [15, 6, 16], trying to find an accurate compromise between two criteria. On the one hand, we search for extensions of type theory that preserve the conceptual simplicity of type theory (a few primitive constructions, a small number of typing rules) and metatheoretical properties ensuring its soundness and a direct mechanisation (strong normalisation, decidability of type-checking, etc.). On the other hand, we would like to pro...
Coinductive big-step operational semantics
, 2006
Abstract

Cited by 36 (5 self)
This paper illustrates the use of coinductive definitions and proofs in big-step operational semantics, enabling the latter to describe diverging evaluations in addition to terminating evaluations. We show applications to proofs of type soundness and to proofs of semantic preservation for compilers.
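As a minimal sketch of the idea, under our own toy names rather than the paper's development: a coinductive type of evaluation outcomes can be an infinite object, so a diverging run is itself a legitimate (infinite) derivation, and we can still observe finite prefixes of it.

```haskell
-- Hypothetical toy language: `Omega` loops forever.
data Term = Lit Int | Omega

-- Read coinductively: a Trace may be infinite.
data Trace = Done Int | Step Trace

eval :: Term -> Trace
eval (Lit n) = Done n
eval Omega   = Step (eval Omega)   -- productive: one Step per unfolding

-- Observe at most k steps of a (possibly infinite) trace.
takeSteps :: Int -> Trace -> String
takeSteps _ (Done n) = "done " ++ show n
takeSteps 0 _        = "still running"
takeSteps k (Step t) = takeSteps (k - 1) t

main :: IO ()
main = do
  putStrLn (takeSteps 5 (eval (Lit 42)))  -- terminating evaluation
  putStrLn (takeSteps 5 (eval Omega))     -- diverging evaluation
```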
General recursion via coinductive types
 Logical Methods in Computer Science
Vol. 1 (2:1) 2005, pp. 1–28
Induction and coinduction in sequent calculus
 Postproceedings of TYPES 2003, number 3085 in LNCS
, 2003
Abstract

Cited by 23 (8 self)
Proof search has been used to specify a wide range of computation systems. In order to build a framework for reasoning about such specifications, we make use of a sequent calculus involving induction and coinduction. These proof principles are based on a proof-theoretic (rather than set-theoretic) notion of definition [13, 20, 25, 51]. Definitions are akin to (stratified) logic programs, where the left and right rules for defined atoms allow one to view theories as “closed” or defining fixed points. The use of definitions makes it possible to reason intensionally about syntax, in particular enforcing free equality via unification. We add, in a consistent way, rules for pre- and post-fixed points, thus allowing the user to reason inductively and coinductively about properties of computational systems making full use of higher-order abstract syntax. Consistency is guaranteed via cut-elimination, where we give the first, to our knowledge, cut-elimination procedure in the presence of general inductive and coinductive definitions.
Filters on coinductive streams, an application to Eratosthenes’ sieve
 Typed Lambda Calculi and Applications, 7th International Conference, TLCA 2005
, 2005
Abstract

Cited by 22 (5 self)
Our objective is to describe a formal proof of correctness for the following Haskell [13] program in a type theory-based proof verification system, such as the Coq system [10, 1].

sieve (p:rest) = p : sieve [r | r <- rest, r `rem` p /= 0]
primes = sieve [2..]

This program is a functional implementation of Eratosthenes’ sieve, which consists in removing all multiples of previously found primes from the sequence of natural numbers. We want to prove that the expression primes is the stream containing all the prime numbers in increasing order. This work relies on coinductive types [5, 11, 12] because the program manipulates infinite lists, also known as streams. It first uses the infinite list of natural numbers larger than 2, then the infinite list of numbers larger than 3 and containing no multiples of 2, then the infinite list of numbers larger than 4 and containing no multiples of prime numbers smaller than 4, and so on. This example was initially proposed as a challenge by G. Kahn and used as an illustration of a program and its proof of correctness in a
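The program from the abstract runs as-is; since the input stream is infinite, the missing empty-list case of sieve is never reached, and any finite prefix of the coinductive stream primes can be observed:

```haskell
-- Eratosthenes' sieve over the infinite stream [2..], as quoted in
-- the abstract.  `sieve` is partial on finite lists, but its argument
-- here is always infinite.
sieve :: [Integer] -> [Integer]
sieve (p:rest) = p : sieve [r | r <- rest, r `rem` p /= 0]

primes :: [Integer]
primes = sieve [2..]

main :: IO ()
main = print (take 8 primes)   -- observe a finite prefix of the stream
```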
A Fixedpoint Approach to (Co)Inductive and (Co)Datatype Definitions
, 1997
Abstract

Cited by 22 (3 self)
This paper presents a fixedpoint approach to inductive definitions. Instead of using a syntactic test such as "strictly positive," the approach lets definitions involve any operators that have been proved monotone. It is conceptually simple, which has allowed the easy implementation of mutual recursion and iterated definitions. It also handles coinductive definitions: simply replace the least fixedpoint by a greatest fixedpoint. The method
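A small sketch of the underlying idea, with names of our own choosing rather than the paper's: an inductive definition is the least fixed point of a monotone operator, which on a finite domain can be reached by iterating the operator from the empty set (Kleene iteration).

```haskell
import Data.List (nub, sort)

-- Hypothetical monotone operator defining "even numbers up to 10":
-- 0 is in the set, and n+2 is in the set whenever n is.
step :: [Int] -> [Int]
step s = sort (nub ([0] ++ [n + 2 | n <- s, n + 2 <= 10]))

-- Iterate a monotone operator from a starting point until it is stable;
-- starting from the bottom element, this reaches the least fixed point.
lfp :: Eq a => (a -> a) -> a -> a
lfp f x = let x' = f x in if x' == x then x else lfp f x'

main :: IO ()
main = print (lfp step [])
```

Replacing the least fixed point by the greatest one is what switches the reading from inductive to coinductive.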
Ensuring Streams Flow
 Proc. 6th AMAST
, 1997
Abstract

Cited by 17 (1 self)
It is our aim to develop an elementary strong functional programming (ESFP) system. To be useful, ESFP should include structures such as streams which can be computationally unwound infinitely often. We describe a syntactic analysis to ensure that infinitely proceeding structures, which we shall term codata, are productive. This analysis is an extension of the check for guardedness that has been used with definitions over coinductive types in Martin-Löf's type theory and in the calculus of constructions. Our analysis is presented as a form of abstract interpretation that allows a wider syntactic class of corecursive definitions to be recognised as productive than in previous work. Thus programmers will have fewer restrictions on their use of infinite streams within a strongly normalizing functional language.

1 Introduction

We aim to develop an Elementary Strong Functional Programming (ESFP) system. That is, we wish to exhibit a language that has the strong normalization (every progr...
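To make the guardedness check concrete, here is an illustrative sketch of our own (not the paper's formal analysis): the definitions below are productive because every corecursive call appears directly under the stream constructor, whereas a stream filter is not syntactically guarded, since it may search arbitrarily long before producing a head; admitting more such definitions is what the paper's abstract interpretation is for.

```haskell
-- An explicitly coinductive stream type (always infinite).
data Stream a = Cons a (Stream a)

-- Productive: the corecursive call is guarded by `Cons`.
nats :: Stream Integer
nats = go 0 where go n = Cons n (go (n + 1))

-- Also guarded: one constructor is emitted per corecursive call.
smap :: (a -> b) -> Stream a -> Stream b
smap f (Cons x xs) = Cons (f x) (smap f xs)

-- Observe a finite prefix of an infinite stream.
takeS :: Int -> Stream a -> [a]
takeS 0 _           = []
takeS n (Cons x xs) = x : takeS (n - 1) xs

main :: IO ()
main = print (takeS 5 (smap (* 2) nats))
```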