Results 1–10 of 18
Wellfounded Recursion with Copatterns: A Unified Approach to Termination and Productivity, 2013
Cited by 14 (2 self)
In this paper, we study strong normalization of a core language based on System Fω which supports programming with finite and infinite structures. Building on our prior work, finite data such as finite lists and trees are defined via constructors and manipulated via pattern matching, while infinite data such as streams and infinite trees are defined by observations and synthesized via copattern matching. In this work, we take a type-based approach to strong normalization by tracking size information about finite and infinite data in the type. This guarantees compositionality. More importantly, the duality of patterns and copatterns provides a unifying semantic concept which allows us for the first time to elegantly and uniformly support both well-founded induction and coinduction by mere rewriting. The strong normalization proof is structured around Girard’s reducibility candidates. As such, our system allows for non-determinism and does not rely on coverage. Since System Fω is general enough that it can be the target of compilation for the Calculus of Constructions, this work is a significant step towards representing observation-centric infinite data in proof assistants such as Coq and Agda.
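The pattern/copattern duality the abstract describes can be illustrated with a small Haskell sketch. This is an assumption-laden approximation: the paper works in System Fω with explicit copatterns, while here Haskell record selectors merely stand in for observations.

```haskell
-- A stream is defined by its observations (head and tail), not by how it
-- is built; the record selectors play the role of the paper's observations.
data Stream a = Stream { shead :: a, stail :: Stream a }

-- Copattern-style definition of the stream n, n+1, n+2, ...
-- In copattern notation this would read:
--   shead (from n) = n
--   stail (from n) = from (n + 1)
from :: Integer -> Stream Integer
from n = Stream { shead = n, stail = from (n + 1) }

-- Finite data (lists) is consumed by ordinary pattern matching; infinite
-- data is only ever inspected through finitely many observations.
takeS :: Int -> Stream a -> [a]
takeS k s
  | k <= 0    = []
  | otherwise = shead s : takeS (k - 1) (stail s)
```

Note that `from` performs no computation until an observation is applied, which is exactly the behaviour the observation-centric view is after.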
Witnessing (Co)datatypes
Cited by 4 (4 self)
Datatypes and codatatypes are very useful for specifying and reasoning about (possibly infinite) computational processes. The interactive theorem prover Isabelle/HOL has been extended with a definitional package that supports both. Here we describe a complete procedure for deriving nonemptiness witnesses in the general mutually recursive, nested case—nonemptiness being a proviso for introducing new types in higher-order logic. The nonemptiness problem also provides an illuminating case study that shows the package in action, tracing its journey from abstract category theory to hands-on functionality.
Type-based productivity of stream definitions in the calculus of constructions. In LICS ’13, 2013
Cited by 3 (0 self)
Productivity of corecursive definitions is an essential property in proof assistants since it ensures logical consistency and decidability of type checking. Type-based mechanisms for ensuring productivity use types annotated with size information to track the number of elements produced in corecursive definitions. In this paper, we propose an extension of the Calculus of Constructions—the theory underlying the Coq proof assistant—with a type-based criterion for ensuring productivity of stream definitions. We prove strong normalization and logical consistency. Furthermore, we define an algorithm for inferring size annotations in types. These results can be easily extended to handle general coinductive types.
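As a rough illustration of the guardedness property that such size annotations track: Haskell has no sized types (and the paper's setting is the Calculus of Constructions, not Haskell), so this sketch only shows the shape of definitions a type-based productivity criterion accepts.

```haskell
-- An ordinary coinductive stream.
data Stream a = Cons a (Stream a)

takeS :: Int -> Stream a -> [a]
takeS k (Cons x xs)
  | k <= 0    = []
  | otherwise = x : takeS (k - 1) xs

-- Productive: the corecursive call appears directly under Cons, so each
-- observation forces only one unfolding. A sized type would record that
-- 'ones' inhabits the stream type at every size.
ones :: Stream Integer
ones = Cons 1 ones

-- mapS consumes one input element per output element produced, so mapping
-- over a productive stream stays productive; size annotations make this
-- reasoning compositional rather than a whole-program check.
mapS :: (a -> b) -> Stream a -> Stream b
mapS f (Cons x xs) = Cons (f x) (mapS f xs)
```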
Unnesting of Copatterns
Cited by 2 (1 self)
Inductive data such as finite lists and trees can elegantly be defined by constructors, which allow programmers to analyze and manipulate finite data via pattern matching. Dually, coinductive data such as streams can be defined by observations such as head and tail, and programmers can synthesize infinite data via copattern matching. This leads to a symmetric language where finite and infinite data can be nested. In this paper, we compile nested pattern and copattern matching into a core language which only supports simple non-nested (co)pattern matching. This core language may serve as an intermediate language of a compiler. We show that this translation is conservative, i.e., the multi-step reduction relation in both languages coincides for terms of the original language. Furthermore, we show that the translation preserves strong normalisation: a term of the original language is strongly normalising in one language if and only if it is so in the other.
Foundational Extensible Corecursion, 2014
Cited by 1 (1 self)
This paper presents a theoretical framework for defining corecursive functions safely in a total setting, based on corecursion up-to and relational parametricity. The end product is a general corecursor that allows corecursive (and even recursive) calls under well-behaved operations, including constructors. Corecursive functions that are well behaved can be registered as such, thereby increasing the corecursor’s expressiveness. To the extensible corecursor corresponds an equally flexible coinduction principle. The metatheory is formalized in the Isabelle proof assistant and forms the core of a prototype tool. The approach is foundational: the corecursor is derived from first principles, without requiring new axioms or extensions of the logic. This ensures that no inconsistencies can be introduced by omissions in a termination or productivity check.
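A standard example of a corecursive call under a well-behaved operation, sketched in Haskell (the paper's setting is Isabelle/HOL, where registering such an operation as well behaved is the step the extensible corecursor framework provides):

```haskell
data Stream a = Cons a (Stream a)

tailS :: Stream a -> Stream a
tailS (Cons _ xs) = xs

takeS :: Int -> Stream a -> [a]
takeS k (Cons x xs)
  | k <= 0    = []
  | otherwise = x : takeS (k - 1) xs

-- zipWithS needs only one element of each input to produce one element
-- of output, which is what makes it "well behaved" for corecursion.
zipWithS :: (a -> b -> c) -> Stream a -> Stream b -> Stream c
zipWithS f (Cons x xs) (Cons y ys) = Cons (f x y) (zipWithS f xs ys)

-- The corecursive occurrences of fib sit under zipWithS rather than
-- directly under a constructor; an extensible corecursor accepts this
-- once zipWithS has been registered as well behaved.
fib :: Stream Integer
fib = Cons 0 (Cons 1 (zipWithS (+) fib (tailS fib)))
```

Haskell's laziness happily evaluates `fib`; the point of the paper is to justify such definitions in a total logic without new axioms.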
How to reason informally coinductively
In our article [1] we introduced the representation of final coalgebras, which correspond to non-wellfounded data structures, as defined by their elimination rules rather than by their introduction rules. When determined by their introduction rules, elements of final coalgebras are given by possibly non-wellfounded applications of the constructors. For instance, a stream is given by an infinite application of the cons operation, cons n1 (cons n2 (cons n3 · · ·)). The increasing stream starting with n is then given as inc n = cons n (inc (n + 1)), which reduces to inc n = cons n (cons (n + 1) (cons (n + 2) ( · · ·))). The problem is that this results in non-normalisation and properly infinite terms.

When determined by their elimination rules, an element of a coalgebra is instead given by the result of applying destructors (eliminators) to it. For instance, a stream is given by applying the operations head: Stream → N and tail: Stream → Stream to it. As an example we have head (inc n) = n and tail (inc n) = inc (n + 1). The problem of non-normalisation disappears under certain restrictions; for instance, inc n is in normal form, and unfolding its infinite nature requires repeated applications of tail.

Coalgebras are given as weakly final or as final coalgebras for a functor F. For instance, the set of streams is given as a final coalgebra for the functor F: Set → Set, where Set is the category of sets, with object part F(X) = N × X. This means that there exists a function Stream → F(Stream) (which is just 〈head, tail〉), and for any other coalgebra f: X → F(X) there exists a unique g: X → Stream such that 〈head, tail〉 ∘ g = F(g) ∘ f.
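The inc example can be rendered directly as a Haskell sketch, with record selectors playing the role of the eliminators head and tail:

```haskell
-- A stream of naturals given purely by its eliminators hd and tl.
data Stream = Stream { hd :: Integer, tl :: Stream }

-- inc n is a normal form; its infinite nature unfolds only under repeated
-- applications of tl, matching head (inc n) = n and tail (inc n) = inc (n + 1)
-- in the text above.
inc :: Integer -> Stream
inc n = Stream { hd = n, tl = inc (n + 1) }
```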
How to Reason Coinductively Informally, 2015
We start by giving an overview of the theory of indexed inductively and coinductively defined sets. We consider the theory of strictly positive indexed inductive definitions in a set-theoretic setting. We show the equivalence between the definition as an indexed initial algebra, the definition via an induction principle, and the set-theoretic definition of indexed inductive definitions. We review as well the equivalence of unique iteration, unique primitive recursion, and induction.

Then we review the theory of indexed coinductively defined sets, or final coalgebras. We construct indexed coinductively defined sets set-theoretically, and show the equivalence between the category-theoretic definition, the principle of unique coiteration, of unique corecursion, and of iteration together with bisimulation as equality. Bisimulation will be defined as an indexed coinductively defined set; therefore proofs of bisimulation can be carried out corecursively. This fact, together with bisimulation implying equality, can be considered as the coinduction principle for the underlying coinductively defined set.

Finally we introduce various schemata for reasoning about coinductively defined sets in an informal way: the schemata of corecursion, of indexed corecursion, of coinduction, and of corecursion for coinductively defined relations. This allows one to reason about coinductively defined sets similarly to how one reasons about inductively defined sets using schemata of induction. We obtain the notion of a coinduction hypothesis, which is the dual of an induction hypothesis.
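The bisimulation-as-equality idea can only be approximated to finite depth in executable Haskell; the following sketch (not the paper's set-theoretic development) shows the shape of the relation: heads agree and tails are again related.

```haskell
data Stream a = Cons a (Stream a)

hd :: Stream a -> a
hd (Cons x _) = x

tl :: Stream a -> Stream a
tl (Cons _ xs) = xs

-- Finite-depth approximation of bisimilarity: heads agree and the tails
-- are again bisimilar, checked to depth k. The coinduction principle says
-- that full (unbounded) bisimilarity implies equality.
bisimTo :: Eq a => Int -> Stream a -> Stream a -> Bool
bisimTo k s t
  | k <= 0    = True
  | otherwise = hd s == hd t && bisimTo (k - 1) (tl s) (tl t)

-- Two syntactically different definitions of the same stream.
ones :: Stream Integer
ones = Cons 1 ones

onesToo :: Stream Integer
onesToo = Cons 1 (Cons 1 onesToo)
```

A coinductive proof would exhibit the relation {(ones, onesToo)} closed under head and tail, rather than checking any particular depth.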
A model of guarded recursion with clock synchronisation, 2015
Witnessing (Co)datatypes
Datatypes and codatatypes are useful for specifying and reasoning about (possibly infinite) computational processes. The interactive theorem prover Isabelle/HOL has recently been extended with a definitional package that supports both. Here we describe a complete procedure for deriving nonemptiness witnesses in the general mutually recursive, nested case—nonemptiness being a proviso for introducing new types in higher-order logic. The nonemptiness problem also provides an illuminating case study that shows the package in action, tracing its journey from abstract category theory to hands-on functionality.