Results 1–10 of 19
From reduction-based to reduction-free normalization
Proceedings of the Fourth International Workshop on Reduction Strategies in Rewriting and Programming (WRS'04), 2004
Abstract

Cited by 28 (13 self)
We document an operational method to construct reduction-free normalization functions. Starting from a reduction-based normalization function from a reduction semantics, i.e., the iteration of a one-step reduction function, we successively subject it to refocusing (i.e., deforestation of the intermediate successive terms in the reduction sequence), equational simplification, refunctionalization (i.e., the converse of defunctionalization), and direct-style transformation (i.e., the converse of the CPS transformation), ending with a reduction-free normalization function of the kind usually crafted by hand. We treat in detail four simple examples: calculating arithmetic expressions, recognizing Dyck words, normalizing lambda-terms with explicit substitutions and call/cc, and flattening binary trees. The overall method builds on previous work by the author and his students on a syntactic correspondence between reduction semantics and abstract machines and on a functional correspondence between evaluators and abstract machines. The measure of success of these two correspondences is that each of the interderived semantic artifacts (i.e., man-made constructs) could plausibly have been written by hand, as is the actual case for several ones derived here.
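The contrast between the two starting and ending points of the derivation can be illustrated on the first of the abstract's examples, calculating arithmetic expressions. The following is a minimal sketch under our own names (not the paper's code): a reduction-based normalizer that iterates a one-step reduction function, and a reduction-free evaluator that builds no intermediate terms.

```haskell
-- A minimal sketch (ours, not the paper's code) contrasting the two kinds
-- of normalization functions on arithmetic expressions over addition.
data Expr = Lit Int | Add Expr Expr deriving (Eq, Show)

-- Reduction-based: perform one reduction step, if any redex remains.
step :: Expr -> Maybe Expr
step (Lit _)               = Nothing
step (Add (Lit m) (Lit n)) = Just (Lit (m + n))
step (Add e1 e2)           = case step e1 of
                               Just e1' -> Just (Add e1' e2)
                               Nothing  -> fmap (Add e1) (step e2)

-- ...and iterate it to a normal form, materializing every intermediate term.
normalizeBased :: Expr -> Expr
normalizeBased e = maybe e normalizeBased (step e)

-- Reduction-free: a direct evaluator that builds no intermediate terms.
normalizeFree :: Expr -> Expr
normalizeFree = Lit . eval
  where eval (Lit n)     = n
        eval (Add e1 e2) = eval e1 + eval e2
```

Both functions compute the same normal forms; the point of the paper's method is that the second can be derived from the first rather than crafted by hand.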
Defunctionalized interpreters for programming languages
, 2008
Abstract

Cited by 26 (4 self)
This document illustrates how functional implementations of formal semantics (structural operational semantics, reduction semantics, small-step and big-step abstract machines, natural semantics, and denotational semantics) can be transformed into each other. These transformations were foreshadowed by Reynolds in "Definitional Interpreters for Higher-Order Programming Languages" for functional implementations of denotational semantics, natural semantics, and big-step abstract machines using closure conversion, CPS transformation, and defunctionalization. Over the last few years, the author and his students have further observed that machines are related using fusion by fixed-point promotion and that functional implementations of reduction semantics and of small-step abstract machines are related using refocusing and transition compression.
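The flavor of one such transformation, defunctionalization, can be conveyed with a small sketch (our names, not the paper's): the continuation of a CPS evaluator is replaced by a first-order data type, and an `apply` function interprets the constructors that stand for the former lambdas.

```haskell
-- A sketch (ours) of defunctionalizing the continuation of a CPS evaluator
-- for addition; the result is a big-step abstract machine.
data Expr = Lit Int | Add Expr Expr

-- Higher-order: the continuation is a function.
evalCPS :: Expr -> (Int -> Int) -> Int
evalCPS (Lit n)     k = k n
evalCPS (Add e1 e2) k = evalCPS e1 (\m -> evalCPS e2 (\n -> k (m + n)))

-- Defunctionalized: each lambda above becomes a constructor,
-- and 'apply' dispatches on them.
data Cont = Halt | AddR Expr Cont | AddL Int Cont

evalD :: Expr -> Cont -> Int
evalD (Lit n)     k = apply k n
evalD (Add e1 e2) k = evalD e1 (AddR e2 k)

apply :: Cont -> Int -> Int
apply Halt        n = n
apply (AddR e2 k) m = evalD e2 (AddL m k)
apply (AddL m k)  n = apply k (m + n)
```

The `Cont` data type is recognizably an evaluation stack, which is why defunctionalized interpreters and abstract machines coincide so closely.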
The arrow calculus
, 2008
Abstract

Cited by 20 (5 self)
Abstract. We introduce the arrow calculus, a metalanguage for manipulating Hughes’s arrows with close relations both to Moggi’s metalanguage for monads and to Paterson’s arrow notation. Arrows are classically defined by extending lambda calculus with three constructs satisfying nine (somewhat idiosyncratic) laws. In contrast, the arrow calculus adds four constructs satisfying five laws. Two of the constructs are arrow abstraction and application (satisfying beta and eta laws) and two correspond to unit and bind for monads (satisfying left unit, right unit, and associativity laws). The five laws were previously known to be sound; we show that they are also complete, and hence that the five laws may replace the nine. We give a translation from classic arrows into the arrow calculus to complement Paterson’s desugaring and show that the two translations form an equational correspondence in the sense of Sabry and Felleisen. We are also the first to publish formal type rules (which are unusual in that they require two contexts), which greatly aided our understanding of arrows. The first fruit of our new calculus is to reveal some redundancies in the classic formulation: the nine classic arrow laws can be reduced to eight, and the three additional classic arrow laws for arrows with apply can be reduced to two. The calculus has also been used to clarify the relationship between idioms, arrows and monads and as the inspiration for a categorical semantics of arrows.
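For concreteness, the "three constructs" of the classic formulation are `arr`, `(>>>)`, and `first`, as in Haskell's `Control.Arrow`. The following minimal illustration (our names) instantiates them at the ordinary function arrow `(->)`; the arrow calculus replaces this point-free style with abstraction/application and unit/bind constructs.

```haskell
import Control.Arrow

-- The three classic arrow constructs, instantiated at (->):
-- 'arr' lifts a function, '(>>>)' composes, 'first' acts on a pair's
-- first component. 'pipeline' and 'addPair' are our own names.
addPair :: (Int, Int) -> Int
addPair = uncurry (+)

-- Double the first component, then sum the pair.
pipeline :: (Int, Int) -> Int
pipeline = first (arr (* 2)) >>> arr addPair
```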
Defunctionalized interpreters for call-by-need evaluation
 In Functional and Logic Programming
, 2010
Abstract

Cited by 16 (1 self)
Abstract. Starting from the standard call-by-need reduction for the λ-calculus that is common to Ariola, Felleisen, Maraist, Odersky, and Wadler, we interderive a series of hygienic semantic artifacts: a reduction-free stateless abstract machine, a continuation-passing evaluation function, and what appears to be the first heapless natural semantics for call-by-need evaluation. Furthermore we observe that a data structure and a judgment in this natural semantics are in defunctionalized form. The refunctionalized counterpart of this evaluation function is an extended direct semantics in the sense of Cartwright and Felleisen. Overall, the semantic artifacts presented here are simpler than many other such artifacts that have been independently worked out, and which require ingenuity, skill, and independent soundness proofs on a case-by-case basis. They are also simpler to interderive because the interderivational tools (e.g., refocusing and defunctionalization) already exist.
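The operational essence of call-by-need that these artifacts capture is memoized delay: a heap cell holds a suspended computation that is forced at most once, after which the cell is overwritten with its value. A sketch of just that mechanism (our code, not one of the paper's artifacts), made explicit with `IORef`:

```haskell
import Data.IORef

-- A sketch (ours) of a memoized thunk: forced at most once, then shared.
data Thunk a = Delayed (IO a) | Forced a

force :: IORef (Thunk a) -> IO a
force ref = do
  t <- readIORef ref
  case t of
    Forced v    -> return v                      -- already memoized
    Delayed act -> do v <- act
                      writeIORef ref (Forced v)  -- memoize for later uses
                      return v
```

Repeated calls to `force` on the same cell run the delayed computation only the first time, which is exactly the sharing that distinguishes call-by-need from call-by-name.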
Inductive Reasoning About Effectful Data Types
 In Proceedings of the ACM SIGPLAN International Conference on Functional Programming
, 2007
Abstract

Cited by 7 (1 self)
We present a pair of reasoning principles, definition and proof by rigid induction, which can be seen as proper generalizations of lazy-datatype induction to monadic effects other than partiality. We further show how these principles can be integrated into logical-relations arguments, and obtain as a particular instance a general and principled proof that the success-stream and failure-continuation models of backtracking are equivalent. As another application, we present a monadic model of general search trees, not necessarily traversed depth-first. The results are applicable to both lazy and eager languages, and we emphasize this by presenting most examples in both Haskell and SML.
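The two models of backtracking named in the abstract can be stated briefly (our formulation, not the paper's development): in the success-stream model a computation is its list of successes; in the failure-continuation model it takes a success continuation, which also receives the remaining failure continuation, and a failure continuation.

```haskell
{-# LANGUAGE RankNTypes #-}

-- A sketch (ours) of the two backtracking models whose equivalence the
-- paper establishes.

-- Failure-continuation model: the success continuation gets the value
-- and the rest of the search; 'r' is the answer type.
newtype FC a = FC (forall r. (a -> r -> r) -> r -> r)

-- Each model embeds into the other.
toStream :: FC a -> [a]
toStream (FC m) = m (:) []

fromStream :: [a] -> FC a
fromStream xs = FC (\succeed failk -> foldr succeed failk xs)
```

The paper's point is that proving such model equivalences in general, in the presence of effects, requires the rigid-induction principles it introduces.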
A walk in the semantic park
 In PEPM'11
, 2011
Abstract

Cited by 5 (0 self)
To celebrate the 20th anniversary of PEPM, we are inviting you to a walk in the semantic park and to interderive reduction-based and reduction-free negational normalization functions.
Towards Compatible and Interderivable Semantic Specifications for the Scheme Programming Language, Part I: Denotational Semantics, Natural Semantics, and Abstract Machines
, 2008
Abstract

Cited by 3 (2 self)
We derive two big-step abstract machines, a natural semantics, and the valuation function of a denotational semantics based on the small-step abstract machine for Core Scheme presented by Clinger at PLDI'98. Starting from a functional implementation of this small-step abstract machine, (1) we fuse its transition function with its driver loop, obtaining the functional implementation of a big-step abstract machine; (2) we adjust this big-step abstract machine so that it is in defunctionalized form, obtaining the functional implementation of a second big-step abstract machine; (3) we refunctionalize this adjusted abstract machine, obtaining the functional implementation of a natural semantics in continuation style; and (4) we closure-unconvert this natural semantics, obtaining a compositional continuation-passing evaluation function which we identify as the functional implementation of a denotational semantics in continuation style. We then compare this valuation function with that of Clinger's original denotational semantics of Scheme.
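Step (1), fusing the transition function with the driver loop, can be sketched on a toy machine (ours, not Clinger's Core Scheme machine): the small-step version returns a configuration after each transition and is iterated by a loop, while the fused big-step version recurses directly.

```haskell
-- A toy machine (ours) that sums n, n-1, ..., 1 into an accumulator.
data Config = Run Int Int | Done Int   -- counter and accumulator

-- Small-step: one transition at a time...
stepM :: Config -> Config
stepM (Run 0 acc) = Done acc
stepM (Run n acc) = Run (n - 1) (acc + n)
stepM (Done acc)  = Done acc

-- ...iterated by a driver loop.
drive :: Config -> Int
drive (Done acc) = acc
drive c          = drive (stepM c)

-- Big-step: driver and transition fused into one recursive function.
bigStep :: Config -> Int
bigStep (Run 0 acc) = acc
bigStep (Run n acc) = bigStep (Run (n - 1) (acc + n))
bigStep (Done acc)  = acc
```

The fused function never materializes intermediate configurations, which is what makes the subsequent refunctionalization and closure-unconversion steps applicable.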
A Systematic Approach to Delimited Control with Multiple Prompts
Abstract

Cited by 2 (1 self)
Abstract. We formalize delimited control with multiple prompts, in the style of Parigot’s λμ-calculus, through a series of incremental extensions by starting with the pure λ-calculus. Each language inherits the semantics and reduction theory of its parent, giving a systematic way to describe each level of control.