Results 1–10 of 49
Fast and Loose Reasoning is Morally Correct
2006
"... Functional programmers often reason about programs as if they were written in a total language, expecting the results to carry over to nontotal (partial) languages. We justify such reasoning. ..."
Abstract

Cited by 25 (0 self)
 Add to MetaCart
Functional programmers often reason about programs as if they were written in a total language, expecting the results to carry over to nontotal (partial) languages. We justify such reasoning.
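The style of reasoning the abstract describes can be illustrated with a small Haskell sketch (the function names `sumOfSums` and `sumFused` are ours, not the paper's): a law proved by simple equational reasoning, as in a total language, which then continues to hold in Haskell on total, finite inputs.

```haskell
-- A law provable by simple equational ("total") reasoning:
--   sum (map f xs) + sum (map g xs) = sum (map (\x -> f x + g x) xs)
-- The paper justifies transporting such proofs to a partial language like
-- Haskell, where they hold at least on total, finite inputs.
-- (Function names are ours, for illustration.)
sumOfSums :: (Int -> Int) -> (Int -> Int) -> [Int] -> Int
sumOfSums f g xs = sum (map f xs) + sum (map g xs)

sumFused :: (Int -> Int) -> (Int -> Int) -> [Int] -> Int
sumFused f g xs = sum (map (\x -> f x + g x) xs)
```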
Observational Equality, Now!
A submission to PLPV 2007
"... This paper has something new and positive to say about propositional equality in programming and proof systems based on the CurryHoward correspondence between propositions and types. We have found a way to present a propositional equality type • which is substitutive, allowing us to reason by repla ..."
Abstract

Cited by 23 (8 self)
 Add to MetaCart
This paper has something new and positive to say about propositional equality in programming and proof systems based on the Curry-Howard correspondence between propositions and types. We have found a way to present a propositional equality type
• which is substitutive, allowing us to reason by replacing equal for equal in propositions;
• which reflects the observable behaviour of values rather than their construction: in particular, we have extensionality—functions are equal if they take equal inputs to equal outputs;
• which retains strong normalisation, decidable typechecking and canonicity—the property that closed normal forms inhabiting datatypes have canonical constructors;
• which allows inductive data structures to be expressed in terms of a standard characterisation of well-founded trees;
• which is presented syntactically—you can implement it directly, and we are doing so—this approach stands at the core of Epigram 2;
• which you can play with now: we have simulated our system by a shallow embedding in Agda 2, shipping as part of the standard examples package for that system [20].
Until now, it has always been necessary to sacrifice some of these aspects. The closest attempt in the literature is Altenkirch’s construction of a setoid model for a system with canonicity and extensionality on top of an intensional type theory with proof-irrelevant propositions [4]. Our new proposal simplifies Altenkirch’s construction by adopting McBride’s heterogeneous approach to equality [18].
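For contrast with the observational equality the paper constructs, here is the standard intensional propositional equality sketched as a Haskell GADT. It is substitutive, like the paper's type, but it identifies values by construction rather than observation (names are ours):

```haskell
{-# LANGUAGE GADTs #-}

-- Intensional propositional equality as a Haskell GADT: the only proof is
-- Refl. It is substitutive, but it identifies values by construction, not
-- by observation: exactly the limitation the paper lifts.
data Equal a b where
  Refl :: Equal a a

-- Substitutivity: equals may replace equals under any context f.
subst :: Equal a b -> f a -> f b
subst Refl x = x

-- Symmetry and transitivity follow by pattern matching.
sym :: Equal a b -> Equal b a
sym Refl = Refl

trans :: Equal a b -> Equal b c -> Equal a c
trans Refl Refl = Refl
```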
A Universe of Binding and Computation
"... We construct a logical framework supporting datatypes that mix binding and computation, implemented as a universe in the dependently typed programming language Agda 2. We represent binding pronominally, using wellscoped de Bruijn indices, so that types can be used to reason about the scoping of var ..."
Abstract

Cited by 17 (5 self)
 Add to MetaCart
We construct a logical framework supporting datatypes that mix binding and computation, implemented as a universe in the dependently typed programming language Agda 2. We represent binding pronominally, using well-scoped de Bruijn indices, so that types can be used to reason about the scoping of variables. We equip our universe with datatype-generic implementations of weakening, substitution, exchange, contraction, and subordination-based strengthening, so that programmers need not reimplement these operations for each individual language they define. In our mixed, pronominal setting, weakening and substitution hold only under some conditions on types, but we show that these conditions can be discharged automatically in many cases. Finally, we program a variety of standard difficult test cases from the literature, such as normalisation by evaluation for the untyped λ-calculus, demonstrating that we can express detailed invariants about variable usage in a program’s type while still writing clean and clear code.
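The universe itself lives in Agda 2, but the well-scoped de Bruijn representation can be sketched in Haskell with the familiar nested-datatype encoding, where the type parameter tracks the scope and `weaken` is the weakening operation the abstract mentions (a rough analogue, not the paper's construction):

```haskell
-- Untyped λ-terms, well-scoped: the type parameter v is the set of free
-- variables, so an out-of-scope variable is a type error.
-- (A rough Haskell analogue of the paper's Agda universe.)
data Term v
  = Var v
  | App (Term v) (Term v)
  | Lam (Term (Maybe v))   -- Nothing is the bound variable, Just x a free one
  deriving (Show, Eq)

instance Functor Term where
  fmap f (Var x)   = Var (f x)
  fmap f (App s t) = App (fmap f s) (fmap f t)
  fmap f (Lam b)   = Lam (fmap (fmap f) b)

-- Weakening: any term over scope v is a term over the enlarged scope Maybe v.
weaken :: Term v -> Term (Maybe v)
weaken = fmap Just
```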
Symmetric Lenses
"... Lenses—bidirectional transformations between pairs of connected structures—have been extensively studied and are beginning to find their way into industrial practice. However, some aspects of their foundations remain poorly understood. In particular, most previous work has focused on the special cas ..."
Abstract

Cited by 15 (1 self)
 Add to MetaCart
Lenses—bidirectional transformations between pairs of connected structures—have been extensively studied and are beginning to find their way into industrial practice. However, some aspects of their foundations remain poorly understood. In particular, most previous work has focused on the special case of asymmetric lenses, where one of the structures is taken as primary and the other is thought of as a projection, or view. A few studies have considered symmetric variants, where each structure contains information not present in the other, but these all lack the basic operation of composition. Moreover, while many domain-specific languages based on lenses have been designed, lenses have not been thoroughly studied from a more fundamental algebraic perspective. We offer two contributions to the theory of lenses. First, we present a new symmetric formulation, based on complements, an old idea from the database literature. This formulation generalizes the familiar structure of asymmetric lenses, and it admits a good notion of composition. Second, we explore the algebraic structure of the space of symmetric lenses. We present generalizations of a number of known constructions on asymmetric lenses and settle some long-standing questions about their properties—in particular, we prove the existence of (symmetric monoidal) tensor products and sums and the non-existence of full categorical products or sums in the category of symmetric lenses. We then show how the methods of universal algebra can be applied to build iterator lenses for structured data such as lists and trees, yielding lenses for operations like mapping, filtering, and concatenation from first principles. Finally, we investigate an even more general technique for constructing mapping combinators, based on the theory of containers.
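As background for the symmetric formulation, here is a minimal Haskell sketch of the asymmetric case the abstract mentions, including the composition operation that earlier symmetric proposals lacked (the names `Lens`, `compose`, and `fstL` are ours):

```haskell
-- Asymmetric lens: s is the primary structure, a the view.
-- (Background sketch; the paper's contribution is the symmetric case.)
data Lens s a = Lens { get :: s -> a, put :: a -> s -> s }

-- Composition, the basic operation earlier symmetric variants lacked.
compose :: Lens s a -> Lens a b -> Lens s b
compose outer inner = Lens
  { get = get inner . get outer
  , put = \b s -> put outer (put inner b (get outer s)) s
  }

-- Example: focus on the first component of a pair.
fstL :: Lens (a, b) a
fstL = Lens { get = fst, put = \a (_, b) -> (a, b) }
```

Composing `fstL` with itself focuses on the innermost first component of a nested pair.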
Clowns to the left of me, jokers to the right (pearl): dissecting data structures
Proceedings of the 35th ACM SIGPLAN-SIGACT Symposium on Principles of Programming Languages, 2008
"... This paper introduces a small but useful generalisation to the ‘derivative ’ operation on datatypes underlying Huet’s notion of ‘zipper ’ (Huet 1997; McBride 2001; Abbott et al. 2005b), giving a concrete representation to onehole contexts in data which is undergoing transformation. This operator, ‘ ..."
Abstract

Cited by 9 (0 self)
 Add to MetaCart
This paper introduces a small but useful generalisation to the ‘derivative’ operation on datatypes underlying Huet’s notion of ‘zipper’ (Huet 1997; McBride 2001; Abbott et al. 2005b), giving a concrete representation to one-hole contexts in data which is undergoing transformation. This operator, ‘dissection’, turns a container-like functor into a bifunctor representing a one-hole context in which elements to the left of the hole are distinguished in type from elements to its right. I present dissection here as a generic program, albeit for polynomial functors only. The notion is certainly applicable more widely, but here I prefer to concentrate on its diverse applications. For a start, map-like operations over the functor and fold-like operations over the recursive data structure it induces can be expressed by tail recursion alone. Further, the derivative is readily recovered from the dissection. Indeed, it is the dissection structure which delivers Huet’s operations for navigating zippers. The original motivation for dissection was to define ‘division’, capturing the notion of leftmost hole, canonically distinguishing values with no elements from those with at least one. Division gives rise to an isomorphism corresponding to the remainder theorem in algebra. By way of a larger example, division and dissection are exploited to give a relatively efficient generic algorithm for abstracting all occurrences of one term from another in a first-order syntax. The source code for the paper is available online and compiles with recent extensions to the Glasgow Haskell Compiler.
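The simplest instance of the "tail recursion alone" claim is map over lists, where the dissection context is just the processed prefix (clowns, already of the output type) and the unprocessed suffix (jokers, still of the input type). A hypothetical sketch, with names of our own choosing:

```haskell
-- For lists, a one-hole dissection context separates processed elements
-- (clowns, type b) on the left from unprocessed ones (jokers, type a) on
-- the right. (Illustrative sketch; the paper treats polynomial functors
-- generically.)
type Dissection a b = ([b], [a])

-- map expressed by tail recursion alone, walking the dissection rightwards.
tmap :: (a -> b) -> [a] -> [b]
tmap f = go []
  where
    go clowns []           = reverse clowns
    go clowns (joker : js) = go (f joker : clowns) js
```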
Constructing strictly positive families
The Australasian Theory Symposium (CATS 2007)
"... We present an inductive definition of a universe containing codes for strictly positive families (SPFs) such as vectors or simply typed lambda terms. This construction extends the usual definition of inductive strictly positive types as given in previous joint work with McBride. We relate this to In ..."
Abstract

Cited by 9 (3 self)
 Add to MetaCart
We present an inductive definition of a universe containing codes for strictly positive families (SPFs), such as vectors or simply typed lambda terms. This construction extends the usual definition of inductive strictly positive types given in previous joint work with McBride. We relate this to Indexed Containers, which were recently proposed in joint work with Ghani, Hancock and McBride. We demonstrate by example how dependent types can be encoded in this universe and give examples of generic programs.
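A much smaller cousin of such a universe, for plain (non-indexed) polynomial types, can be sketched in Haskell with GADTs: codes, their decoding, and generic programs defined once over every code. This omits the families/indexing that are the paper's point, and all names are ours:

```haskell
{-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

-- Codes for a tiny universe of (non-indexed) strictly positive types:
-- unit, the parameter, sums and products.
data Code = U | Par | Sum Code Code | Prod Code Code

-- Decoding: the elements of the type a code denotes, over a parameter a.
data El (c :: Code) a where
  Unit :: El 'U a
  P    :: a -> El 'Par a
  L    :: El c a -> El ('Sum c d) a
  R    :: El d a -> El ('Sum c d) a
  (:*) :: El c a -> El d a -> El ('Prod c d) a

-- One generic program covering every code: map over the parameter.
gmap :: (a -> b) -> El c a -> El c b
gmap _ Unit     = Unit
gmap f (P x)    = P (f x)
gmap f (L x)    = L (gmap f x)
gmap f (R y)    = R (gmap f y)
gmap f (x :* y) = gmap f x :* gmap f y

-- Another: collect all occurrences of the parameter.
params :: El c a -> [a]
params Unit     = []
params (P x)    = [x]
params (L x)    = params x
params (R y)    = params y
params (x :* y) = params x ++ params y
```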
Continuous functions on final coalgebras
2007
"... In a previous paper we have given a representation of continuous functions on streams, both discretevalued functions, and functions between streams. the topology on streams is the ‘Baire ’ topology induced by taking as a basic neighbourhood the set of streams that share a given finite prefix. We ga ..."
Abstract

Cited by 9 (1 self)
 Add to MetaCart
In a previous paper we have given a representation of continuous functions on streams, both discrete-valued functions and functions between streams. The topology on streams is the ‘Baire’ topology induced by taking as a basic neighbourhood the set of streams that share a given finite prefix. We also gave a combinator on the representations of stream-processing functions that reflects composition. Streams are the simplest example of a non-trivial final coalgebra, playing in the coalgebraic realm the same role as the natural numbers do in the algebraic realm. Here we extend our previous results to cover the case of final coalgebras for a broad class of functors generalising (× A). The functors we deal with are those that arise from countable signatures of finite-place untyped operators. These have many applications. The topology we put on the final coalgebra for such a functor is that induced by taking for basic neighbourhoods the sets of infinite objects which share a common prefix, according to the usual definition of the final coalgebra as the limit of a certain inverse chain starting at the terminal object.
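A concrete first-order representation of continuous stream functions in the spirit described, with a composition combinator acting on representations, can be sketched in Haskell (here run against finite prefixes of the input stream; all names are ours, and this covers only the stream case, not the paper's generalisation to arbitrary final coalgebras):

```haskell
-- Stream processors: Get consumes one input, Put emits one output.
-- Continuity is the fact that each output depends on only finitely
-- many inputs.
data SP a b = Get (a -> SP a b) | Put b (SP a b)

-- Run a processor against (a finite prefix of) an input stream.
run :: SP a b -> [a] -> [b]
run (Put b sp) as       = b : run sp as
run (Get k)    (a : as) = run (k a) as
run (Get _)    []       = []

-- The combinator reflecting composition on representations.
(>>>) :: SP a b -> SP b c -> SP a c
sp       >>> Put c t = Put c (sp >>> t)
Put b sp >>> Get k   = sp >>> k b
Get j    >>> Get k   = Get (\a -> j a >>> Get k)

-- Example: double every element of the input stream.
doubler :: SP Int Int
doubler = Get (\n -> Put (2 * n) doubler)
```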
Foundational, Compositional (Co)datatypes for Higher-Order Logic: Category Theory Applied to Theorem Proving
"... Higherorder logic (HOL) forms the basis of several popular interactive theorem provers. These follow the definitional approach, reducing highlevel specifications to logical primitives. This also applies to the support for datatype definitions. However, the internal datatype construction used in H ..."
Abstract

Cited by 8 (4 self)
 Add to MetaCart
Higher-order logic (HOL) forms the basis of several popular interactive theorem provers. These follow the definitional approach, reducing high-level specifications to logical primitives. This also applies to the support for datatype definitions. However, the internal datatype construction used in HOL4, HOL Light, and Isabelle/HOL is fundamentally non-compositional, limiting its efficiency and flexibility, and it does not cater for codatatypes. We present a fully modular framework for constructing (co)datatypes in HOL, with support for mixed mutual and nested (co)recursion. Mixed (co)recursion enables type definitions involving both datatypes and codatatypes, such as the type of finitely branching trees of possibly infinite depth. Our framework draws heavily from category theory. The key notion is that of a rich type constructor—a functor satisfying specific properties preserved by interesting categorical operations. Our ideas are formalized in Isabelle and implemented as a new definitional package, answering a long-standing user request.
∂ for Data: Differentiating Data Structures
"... This paper and our conference paper (Abbott, Altenkirch, Ghani, and McBride, 2003b) explain and analyse the notion of the derivative of a data structure as the type of its onehole contexts based on the central observation made by McBride (2001). To make the idea precise we need a generic notion of ..."
Abstract

Cited by 7 (1 self)
 Add to MetaCart
This paper and our conference paper (Abbott, Altenkirch, Ghani, and McBride, 2003b) explain and analyse the notion of the derivative of a data structure as the type of its one-hole contexts, based on the central observation made by McBride (2001). To make the idea precise we need a generic notion of a data type, which leads to the notion of a container, introduced in (Abbott, Altenkirch, and Ghani, 2003a) and investigated extensively in (Abbott, 2003). Using containers we can provide a notion of linear map, which is the concept missing from McBride’s first analysis. We verify the usual laws of differential calculus, including the chain rule, and establish laws for initial algebras and terminal coalgebras.
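For the list functor, the derivative-as-one-hole-context idea can be made concrete in a few lines of Haskell (a hypothetical sketch with names of our own choosing; the paper works with arbitrary containers):

```haskell
-- Derivative of the list functor: a one-hole context in a list is the
-- (reversed) prefix before the hole together with the suffix after it.
type DList a = ([a], [a])

-- Plug an element back into the hole.
plug :: DList a -> a -> [a]
plug (pre, post) x = reverse pre ++ x : post

-- Every decomposition of a list into a one-hole context and a focused
-- element, in left-to-right order.
holes :: [a] -> [(DList a, a)]
holes = go []
  where
    go _   []       = []
    go pre (x : xs) = ((pre, xs), x) : go (x : pre) xs
```

Plugging each context with its focused element recovers the original list, the defining property of one-hole contexts.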
Generic programming with dependent types
Spring School on Datatype-Generic Programming, 2006
"... In these lecture notes we give an overview of recent research on the relationship ..."
Abstract

Cited by 4 (0 self)
 Add to MetaCart
In these lecture notes we give an overview of recent research on the relationship …