Results 1 – 10 of 10
State-dependent representation independence
In Proceedings of the 36th ACM SIGPLAN-SIGACT Symposium on Principles of Programming Languages, 2009
Cited by 63 (19 self)
Abstract:
Mitchell’s notion of representation independence is a particularly useful application of Reynolds’ relational parametricity: two different implementations of an abstract data type can be shown contextually equivalent so long as there exists a relation between their type representations that is preserved by their operations. There have been a number of methods proposed for proving representation independence in various pure extensions of System F (where data abstraction is achieved through existential typing), as well as in Algol- or Java-like languages (where data abstraction is achieved through the use of local mutable state). However, none of these approaches addresses the interaction of existential type abstraction and local state. In particular, none allows one to prove representation independence results for generative ADTs, i.e., ADTs that both maintain some local state and define abstract types whose internal …
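The flavour of such a representation-independence argument can be sketched in Haskell with existential typing. The counter ADT below, its two implementations, and the relation "n is related to replicate n ()" are our own illustrative example, not one from the paper:

```haskell
{-# LANGUAGE ExistentialQuantification #-}

-- An abstract counter: the representation type s is hidden existentially,
-- so clients only ever see Int observations.
data Counter = forall s. Counter
  s          -- initial state
  (s -> s)   -- tick: advance the counter
  (s -> Int) -- observe the current count

-- Implementation 1: the hidden state is an Int.
intCounter :: Counter
intCounter = Counter (0 :: Int) (+ 1) id

-- Implementation 2: the hidden state is a list of units.
-- The simulation relation R relates n to replicate n (); R is
-- preserved by each operation, so clients cannot tell the two apart.
listCounter :: Counter
listCounter = Counter ([] :: [()]) (() :) length

-- A typical client: only Ints escape the abstraction boundary.
ticks :: Int -> Counter -> Int
ticks n (Counter z t obs) = obs (iterate t z !! n)
```

Under relational parametricity, preservation of R by the packaged operations is exactly what justifies the contextual equivalence of `intCounter` and `listCounter`; the generative, state-holding ADTs treated in the paper go beyond this purely functional sketch.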
A Generic Operational Metatheory for Algebraic Effects
Cited by 7 (0 self)
Abstract:
We provide a syntactic analysis of contextual preorder and equivalence for a polymorphic programming language with effects. Our approach applies uniformly to arbitrary algebraic effects, and thus incorporates, as instances: errors, input/output, global state, nondeterminism, probabilistic choice, and combinations thereof. Our approach is to extend Plotkin and Power’s structural operational semantics for algebraic effects (FoSSaCS 2001) with a primitive “basic preorder” on ground-type computation trees. The basic preorder is used to derive notions of contextual preorder and equivalence on program terms. Under mild assumptions on this relation, we prove fundamental properties of contextual preorder (hence equivalence) including extensionality properties, a characterisation via applicative contexts, and machinery for reasoning about polymorphism using relational parametricity.
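As a concrete and much simplified rendering of the setup, computation trees over two algebraic effects, binary nondeterministic choice and an error, form a small monad in Haskell; the `may` function below is one possible ingredient of a "basic preorder", collecting may-convergence results. All names here are our own illustration:

```haskell
-- Computation trees over two algebraic effect operations:
-- binary nondeterministic choice (Or) and an error (Err).
data Tree a = Ret a | Err | Or (Tree a) (Tree a)
  deriving (Eq, Show)

instance Functor Tree where
  fmap f (Ret a)  = Ret (f a)
  fmap _ Err      = Err
  fmap f (Or l r) = Or (fmap f l) (fmap f r)

instance Applicative Tree where
  pure = Ret
  tf <*> ta = bind tf (\f -> fmap f ta)

instance Monad Tree where
  (>>=) = bind

-- Sequencing substitutes computations for leaves; the effect
-- operations commute with it, which is what makes them algebraic.
bind :: Tree a -> (a -> Tree b) -> Tree b
bind (Ret a) k  = k a
bind Err _      = Err
bind (Or l r) k = Or (bind l k) (bind r k)

-- One candidate observation on ground-type trees: the multiset of
-- results a computation may deliver.
may :: Tree a -> [a]
may (Ret a)  = [a]
may Err      = []
may (Or l r) = may l ++ may r
```

A basic preorder such as "every result the left tree may deliver, the right may deliver too" then induces a contextual preorder on program terms, in the spirit of the paper's construction.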
Proving correctness via free theorems: The case of the destroy/build rule
In Partial Evaluation and Program Manipulation, Proceedings, 2008
Cited by 6 (4 self)
Abstract:
Free theorems feature prominently in the field of program transformation for pure functional languages such as Haskell. However, somewhat disappointingly, the semantic properties of so-based transformations are often established only very superficially. This paper is intended as a case study showing how to use the existing theoretical foundations and formal methods to improve the situation. To that end, we investigate the correctness issue for a new transformation rule in the short cut fusion family. This destroy/build rule provides a certain reconciliation between the competing foldr/build and destroy/unfoldr approaches to eliminating intermediate lists. Our emphasis is on systematically and rigorously developing the rule’s correctness proof, while paying attention to semantic aspects like potential non-termination and mixed strict/non-strict evaluation.
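For orientation, the foldr/build side of the short cut fusion family can be sketched in a few lines of Haskell; the rule states foldr c n (build g) = g c n, eliminating the intermediate list. The `upTo` producer is our own toy example, not from the paper:

```haskell
{-# LANGUAGE RankNTypes #-}

-- build abstracts a list producer over the list constructors.
build :: (forall b. (a -> b -> b) -> b -> b) -> [a]
build g = g (:) []

-- A producer written in build form.
upTo :: Int -> [Int]
upTo n = build (\cons nil ->
  let go i = if i > n then nil else cons i (go (i + 1))
  in  go 1)

-- Before fusion: an intermediate list is built and then consumed.
lhs :: Int -> Int
lhs n = foldr (+) 0 (upTo n)

-- After the foldr/build rule, foldr c n (build g) = g c n, the
-- producer's constructors are replaced by (+) and 0 directly.
rhs :: Int -> Int
rhs n = let go i = if i > n then 0 else i + go (i + 1) in go 1
```

The destroy/build rule studied in the paper plays the analogous role where the consumer is written with destroy rather than foldr; proving it correct is harder precisely because of the strictness subtleties the abstract mentions.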
A family of syntactic logical relations for the semantics of Haskell-like languages
Information and Computation, 2009
Cited by 5 (3 self)
Abstract:
Logical relations are a fundamental and powerful tool for reasoning about programs in languages with parametric polymorphism. Logical relations suitable for reasoning about observational behavior in polymorphic calculi supporting various programming language features have been introduced in recent years. Unfortunately, the calculi studied are typically idealized, and the results obtained for them offer only partial insight into the impact of such features on observational behavior in implemented languages. In this paper we show how to bring reasoning via logical relations closer to bear on real languages by deriving results that are more pertinent to an intermediate language for the (mostly) lazy functional language Haskell, such as GHC Core. To provide a more fine-grained analysis of program behavior than is possible by reasoning about program equivalence alone, we work with an abstract notion of relating observational behavior of computations, which has among its specializations both observational equivalence and observational approximation. We take selective strictness into account, and we consider the impact of different kinds of …
Parametricity for Haskell with Imprecise Error Semantics
Cited by 4 (4 self)
Abstract:
Types play an important role both in reasoning about Haskell and for its implementation. For example, the Glasgow Haskell Compiler performs certain fusion transformations that are intended to improve program efficiency and whose semantic justification is derived from polymorphic function types. At the same time, GHC adopts a scheme of error raising, propagation, and handling which is nondeterministic in the sense that there is some freedom as to which of a number of potential failure events hidden somewhere in a program is actually triggered. Implemented for good pragmatic reasons, this scheme complicates the meaning of programs and thus necessitates extra care when reasoning about them. In particular, since every erroneous value now represents a whole set of potential (but not arbitrary) failure causes, and since the associated propagation rules are askew to standard notions of program flow and value dependence, some standard laws suddenly fail to hold. This includes laws derived from polymorphic types, popularized as free theorems and at the base of the mentioned kind of fusion. We study this interaction between type-based reasoning and imprecise errors by revising and extending the foundational notion of relational parametricity, as well as further material required to make it applicable. More generally, we believe that our development and proofs help pave the way for incorporating further extensions and semantic features that deviate from the “naive” setting in which reasoning about Haskell programs often takes place.
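A standard instance of such a law, our own example rather than the paper's: any parametric f :: [a] -> [a] satisfies the free theorem map g . f = f . map g. With imprecise errors the two sides may raise different errors, which is exactly the kind of discrepancy the paper's refined parametricity accounts for:

```haskell
-- Any f :: [a] -> [a] defined parametrically satisfies
--   map g . f = f . map g      (its free theorem)
f :: [a] -> [a]
f = take 2 . reverse

-- The law holds on error-free inputs; with error values inside the
-- list, the two sides may differ in *which* error is raised.
freeThm :: Bool
freeThm = map (* 10) (f [1 .. 5 :: Int]) == f (map (* 10) [1 .. 5])
```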
Free Theorems for Functional Logic Programs
Submitted to PLPV’10, 2010
Cited by 3 (3 self)
Abstract:
Type-based reasoning is popular in functional programming. In particular, parametric polymorphism constrains functions in such a way that statements about their behavior can be derived without consulting function definitions. Is the same possible in a strongly, and polymorphically, typed functional logic language? This is the question we study in this paper. Logical features like nondeterminism and free variables cause interesting effects, which we examine based on examples and address by identifying appropriate conditions that guarantee standard free theorems or inequational versions thereof to hold.
Contextual Equivalence in Lambda Calculi extended with letrec and with a Parametric Polymorphic Type System
2009
Cited by 1 (1 self)
Abstract:
This paper describes a method to treat contextual equivalence in polymorphically typed lambda calculi, and also how to transfer equivalences from the untyped versions of lambda calculi to their typed variants, where our specific calculus has letrec, recursive types, and is nondeterministic. The addition of a type label to every subexpression is all that is needed, together with some natural constraints for the consistency of the type labels and well-scopedness of expressions. One result is that an elementary but typed notion of program transformation is obtained, and that untyped contextual equivalences also hold in the typed calculus as long as the expressions are well-typed. In order to have a nice interaction between reduction and typing, some reduction rules have to be accompanied by a type modification that generalizes or instantiates types.
Reasoning about Selective Strictness: Operational Equivalence, Heaps and Call-by-Need Evaluation, New Inductive Principles
2009
Cited by 1 (0 self)
Abstract:
Many predominantly lazy languages now incorporate strictness-enforcing primitives, for example a strict let or sequential composition seq. Reasons for doing this include gains in time or space efficiency, or control of parallel evaluation. This thesis studies how to prove equivalences between programs in languages with selective strictness. Specifically, we use a restricted core lazy functional language with a selective strictness operator seq whose operational semantics is a variant of one considered by van Eekelen and de Mol, which itself was derived from Launchbury’s natural semantics for lazy evaluation. The main research contributions are as follows: we establish some of the first ever equivalences between programs with selective strictness. We do this by manipulating operational semantics derivations, in …
A Trajectory-based Strict Semantics for Program Slicing
Abstract:
We define a program semantics that is preserved by dependence-based slicing algorithms. It is a natural extension, to non-terminating programs, of the semantics introduced by Weiser (which only considered terminating ones) and, as such, is an accurate characterisation of the semantic relationship between a program and the slice produced by these algorithms. Unlike other approaches, apart from Weiser’s original one, it is based on strict standard semantics, which models the ‘normal’ execution of programs on a von Neumann machine and thus has the advantage of being intuitive. This is essential since one of the main applications of slicing is program comprehension. Although our semantics handles non-termination, it is defined wholly in terms of finite trajectories, without having to resort to complex, counterintuitive, non-standard models of computation. As well as being simpler, unlike other approaches to this problem, our semantics is substitutive. Substitutivity is an important property because it greatly enhances the ability to reason about correctness of meaning-preserving …
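To make "dependence-based" concrete, here is a toy slicer for straight-line assignments, far simpler than the paper's setting and entirely our own illustration: it keeps exactly the assignments on which the criterion variable transitively depends.

```haskell
-- A statement assigns a target variable using some source variables.
type Stmt = (String, [String])

-- Backward slice: scan the program in reverse, keeping an assignment
-- iff its target is currently needed, and adding its sources to the
-- needed set.
slice :: String -> [Stmt] -> [Stmt]
slice v prog = go [v] (reverse prog)
  where
    go _ [] = []
    go needed ((x, uses) : rest)
      | x `elem` needed = go (uses ++ needed) rest ++ [(x, uses)]
      | otherwise       = go needed rest
```

For example, slicing on "b" in a program that also assigns an unrelated "c" drops the assignment to "c". The paper's contribution is a trajectory semantics that makes precise in what sense such a slice agrees with the original program, including on non-terminating executions.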
Reasoning about Selective Strictness: Operational Equivalence, Heaps and Call-by-Need Evaluation, New Inductive Principles
2009
Abstract:
This thesis studies how to prove equivalences between programs in languages with selective strictness. Specifically, we use a restricted core lazy functional language with a selective strictness operator seq. We establish some of the first ever equivalences between lazy programs with selective strictness by manipulating operational semantics derivations. Our operational semantics is similar to that used by van Eekelen and de Mol, though we introduce a ‘garbage-collecting’ rule for (let) which turns out to cause expressiveness restrictions. For example, arguably reasonable lazy programs such as let y = λz.z in λx.y do not reduce in our operational semantics. We prove some properties of seq, including associativity, idempotence, and left-commutativity. The proofs use our three notions of program equivalence defined …
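The seq laws mentioned can be spot-checked in GHC. The probe below is our own and approximates divergence by a raised error; it is not the thesis's proof method:

```haskell
import Control.Exception (SomeException, evaluate, try)

-- On total values each seq law holds trivially, since x `seq` y = y
-- whenever x is defined; the interesting cases involve divergence,
-- which we approximate here by an error value.
bot :: Int
bot = error "bottom"

-- Evaluate to weak head normal form; report whether that diverged
-- (i.e. raised an error).
diverges :: Int -> IO Bool
diverges x = do
  r <- try (evaluate x) :: IO (Either SomeException Int)
  pure (either (const True) (const False) r)

-- Left-commutativity:  x `seq` (y `seq` z)  =  y `seq` (x `seq` z)
-- With x = bot, both sides should diverge, consistent with the law.
leftCommProbe :: IO (Bool, Bool)
leftCommProbe = do
  l <- diverges (bot `seq` (1 `seq` 2))
  r <- diverges (1 `seq` (bot `seq` 2))
  pure (l, r)
```

Idempotence (x `seq` (x `seq` y) = x `seq` y) can be probed the same way; the thesis proves such laws properly, via its notions of operational equivalence rather than by testing.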