Results 1–10 of 13
Deliverables: A Categorical Approach to Program Development in Type Theory
, 1992
Cited by 24 (1 self)
This thesis considers the problem of program correctness within a rich theory of dependent types, the Extended Calculus of Constructions (ECC). This system contains a powerful programming language of higher-order primitive recursion and higher-order intuitionistic logic. It is supported by Pollack's versatile LEGO implementation, which I use extensively to develop the mathematical constructions studied here. I systematically investigate Burstall's notion of deliverable, that is, a program paired with a proof of correctness. This approach separates the concerns of programming and logic, since I want a simple program extraction mechanism. The Σ-types of the calculus enable us to achieve this. There are many similarities with the subset interpretation of Martin-Löf type theory. I show that deliverables have a rich categorical structure, so that correctness proofs may be decomposed in a principled way. The categorical combinators which I define in the system package up much logical bo...
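The notion of deliverable summarized above can be sketched in a modern dependently typed setting. The fragment below (Lean 4, with hypothetical names; this is an illustration of the idea, not the thesis's LEGO development) pairs a program with its correctness proof via a Σ-like subtype, so that program extraction is simply a projection.

```lean
-- A deliverable (hypothetical encoding): a program f : A → B packaged with
-- a proof that it meets a specification S relating inputs to outputs.
def Deliverable (A B : Type) (S : A → B → Prop) : Type :=
  { f : A → B // ∀ a, S a (f a) }

-- Example: doubling, delivered with a proof that its output is always even.
def double : Deliverable Nat Nat (fun _ b => ∃ k, b = 2 * k) :=
  ⟨fun a => 2 * a, fun a => ⟨a, rfl⟩⟩

-- Extraction is the first projection; the proof component is discarded.
#eval double.val 21  -- 42
```

Because the proof lives in `Prop`, it carries no computational content, which is how this style keeps the extraction mechanism simple.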
A Reification Calculus for Model-Oriented Software Specification
 FORMAL ASPECTS OF COMPUTING
, 1990
Cited by 23 (11 self)
This paper presents a transformational approach to the derivation of implementations from model-oriented specifications of abstract data types. The purpose of this research is to reduce the number of formal proofs required in model refinement, which hinder software development. It is shown to be applicable to the transformation of models written in Meta-IV (the specification language of VDM) towards their refinement into, for example, Pascal or relational DBMSs. The approach includes the automatic synthesis of retrieve functions between models, and of datatype invariants. The underlying algebraic semantics is the so-called final semantics “à la Wand”: a specification “is” a model (heterogeneous algebra) which is the final object (up to isomorphism) in the category of all its implementations. The transformational calculus approached in this paper follows from exploring the properties of finite, recursively defined sets. This work extends the well-known strategy of program transformation to model transformation, adding to previous work on a transformational style for operation decomposition in Meta-IV. The model calculus is also useful for improving model-oriented specifications.
An Axiomatic Approach to Binary Logical Relations with Applications to Data Refinement
 Proc. TACS'97, Springer LNCS 1281
, 1997
Cited by 18 (1 self)
We introduce an axiomatic approach to logical relations and data refinement. We consider a programming language and the monad on the category of small categories generated by it. We identify abstract data types for the language with sketches for the associated monad, and define an axiomatic notion of "relation" between models of such a sketch in a semantic category. We then prove three results: (i) such models lift to the whole language together with the sketch; (ii) any such relation satisfies a soundness condition; and (iii) such relations compose. We do this both for equality of data representations and for an ordered version. Finally, we compare our formulation of data refinement with that of Hoare.
Encoding, Decoding, and Data Refinement
 FORMAL ASPECTS OF COMPUTING
, 1999
Cited by 7 (4 self)
Data refinement is the systematic replacement of a data structure with another one in program development. Data refinement between program statements can on an abstract level be described as a commutativity property where the abstraction relationship between the data structures involved is represented by an abstract statement (a decoding). We generalise the traditional notion of data refinement by defining an encoding operator that describes the least (most abstract) data refinement with respect to a given abstraction. We investigate the categorical and algebraic properties of encoding and describe a number of special cases, which include traditional notions of data refinement. The dual operator of encoding is decoding, which we investigate and give an intuitive interpretation to. Finally we show a number of applications of encoding and decoding.
Compositional Characterization of Observable Program Properties
Cited by 5 (0 self)
In this paper we model both program behaviours and abstractions between them as lax functors, which generalize abstract interpretations by exploiting the natural ordering of program properties. This generalization provides a framework in which correctness (safety) and completeness of abstract interpretations arise naturally from this order. Furthermore, it supports modular and stepwise refinement: given a program behaviour, its characterization, which is a "best" correct and complete denotational semantics for it, can be determined in a compositional way.
Data refinement, call by value, and higher order programs
, 2003
Cited by 4 (2 self)
Using 2-categorical laws of algorithmic refinement, we show soundness of data refinement for stored programs and hence for higher-order procedures with value/result parameters. The refinement laws hold in a model that slightly generalizes the standard predicate transformer semantics for the usual imperative programming constructs, including prescriptions.
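The predicate-transformer semantics mentioned in this abstract can be illustrated as follows (a minimal sketch under simplified assumptions, not the paper's generalized model): a statement maps each postcondition to its weakest precondition, and refinement is pointwise entailment between transformers.

```lean
-- Predicates over a state space σ, and statements as predicate transformers
-- mapping a postcondition to the corresponding weakest precondition.
def Pred (σ : Type) := σ → Prop
def PTrans (σ : Type) := Pred σ → Pred σ

-- s ⊑ t: every precondition that s guarantees, t guarantees as well.
def refines {σ : Type} (s t : PTrans σ) : Prop :=
  ∀ q x, s q x → t q x

-- Example transformers: an assignment by a state function, and skip.
def assign {σ : Type} (f : σ → σ) : PTrans σ := fun q x => q (f x)
def skip {σ : Type} : PTrans σ := fun q x => q x
```

Every transformer trivially refines itself, and `skip` is the identity on postconditions, which makes it the unit for sequential composition in this view.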
Blaming the Client: On Data Refinement in the Presence of Pointers
 TO APPEAR IN FORMAL ASPECTS OF COMPUTING
Cited by 4 (0 self)
Data refinement is a common approach to reasoning about programs, based on establishing that a concrete program indeed satisfies all the required properties imposed by an intended abstract pattern. Reasoning about programs in this setting becomes complex when use of pointers is assumed; moreover, a well-known method for proving data refinement, namely the forward simulation method, becomes unsound in the presence of pointers. The reason for unsoundness is the failure of the “lifting theorem” for simulations: that a simulation between abstract and concrete modules can be lifted to all client programs. The result is that simulation does not imply that a concrete module can replace an abstract one in all contexts. Our diagnosis of this problem is that unsoundness is due to interference from the client programs. Rather than blame a module for the unsoundness of lifting simulations, our analysis places the blame on the client programs which cause the interference: when interference is not present, soundness is recovered. Technically, we present a novel instrumented semantics which is capable of detecting interference between a module and its client. Using special simulation relations, namely growing relations, and interpreting the simulation method under the instrumented semantics, we obtain a lifting theorem. We then show situations under which simulation does indeed imply refinement.
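The forward simulation method this abstract refers to can be stated abstractly as follows (a sketch under simplified assumptions, with operations modelled as relations on states and no pointer aliasing; the names are hypothetical): every concrete step from a related state must be matched by an abstract step that re-establishes the relation.

```lean
-- Operations as relations on states: a Step relates pre-state to post-state.
def Step (σ : Type) := σ → σ → Prop

-- Forward simulation: R relates concrete states to abstract states, and every
-- concrete step from an R-related state is matched by an abstract step that
-- re-establishes R. The "lifting theorem" discussed above asks that this
-- property extend from single operations to all client programs.
def FwdSim {γ α : Type} (R : γ → α → Prop) (c : Step γ) (a : Step α) : Prop :=
  ∀ g g' x, c g g' → R g x → ∃ x', a x x' ∧ R g' x'
```

With pointers, a client can observe or disturb the module's representation directly, which is exactly why this per-operation condition fails to lift to all contexts.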
Axiomatics for Data Refinement in Call By Value Programming Languages
Cited by 2 (0 self)
We give a systematic category-theoretic axiomatics for modelling data refinement in call-by-value programming languages. Our leading examples of call-by-value languages are extensions of the computational λ-calculus, such as FPC and languages for modelling nondeterminism, and extensions of the first-order fragment of the computational λ-calculus, such as a CPS language. We give a category-theoretic account of the basic setting, then show how to model contexts, then arbitrary type and term constructors, then signatures, and finally data refinement. This extends and clarifies Kinoshita and Power's work on lax logical relations for call-by-value languages.
A Type-Theoretic Analysis of Modular Specifications
, 1996
Cited by 2 (1 self)
We study the problem of representing a modular specification language in a type-theory-based theorem prover. Our goals are: to provide mechanical support for reasoning about specifications and about the specification language itself; to clarify the semantics of the specification language by formalising them fully; and to augment the specification language with a programming language in a setting where both are part of the same formal environment, allowing us to define a formal implementation relationship between the two. Previous work on similar issues has given rise to a dichotomy between “shallow” and “deep” embedding styles when representing one language within another. We show that the expressiveness of type theory, and the high degree of reflection that it permits, allow us to develop embedding techniques which lie between the “shallow” and “deep” extremes. We consider various possible embedding strategies and then choose one of them to explore more fully. As our object of study we choose a fragment of the Z specification language, which we encode in the type theory UTT, as implemented in the LEGO proof-checker. We use the encoding to study some of the operations on schemas provided by Z. One of our main concerns is whether it is possible to reason about Z specifications at the level of these operations. We prove some theorems about Z showing that, within certain constraints, this kind of reasoning is indeed possible. We then show how these meta-theorems can be used to carry out formal reasoning about Z specifications. For this we make use of an example taken from the Z Reference Manual (ZRM). Finally, we exploit the fact that type theory provides a programming language as well as a logic to define a notion of implementation for Z specifications. We illustrate this by encoding some example programs taken from the ZRM.