Results 11 – 20 of 100
The essence of eta-expansion in partial evaluation
 LISP AND SYMBOLIC COMPUTATION
, 1995
Abstract

Cited by 34 (11 self)
Selective eta-expansion is a powerful "binding-time improvement", i.e., a source-program modification that makes a partial evaluator yield better results. But like most binding-time improvements, the exact problem it solves and the reason why have not been formalized and are only understood by few. In this paper, we describe the problem and the effect of eta-redexes in terms of monovariant binding-time propagation: eta-redexes preserve the static data flow of a source program by interfacing static higher-order values in dynamic contexts and dynamic higher-order values in static contexts. They contribute to two distinct binding-time improvements. We present two extensions of Gomard's monovariant binding-time analysis for the pure λ-calculus. Our extensions annotate and eta-expand terms. The first one eta-expands static higher-order values in dynamic contexts. The second also eta-expands dynamic higher-order values in static contexts. As a significant application, we show that our first binding-time analysis suffices to reformulate the traditional formulation of a CPS transformation into a modern one-pass CPS transformer. This binding-time improvement is known, but it is still left unexplained in contemporary literature, e.g., about "CPS-based" partial evaluation. We also outline the counterpart of eta-expansion for partially static data structures.
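The role of an eta-redex can be glimpsed in a toy two-level setting (a sketch only, not Gomard's formal analysis; the code-as-strings representation and the `residualize` helper are illustrative assumptions): a static higher-order value cannot appear in dynamic (generated) code directly, but its eta-expansion can, obtained by applying it to a fresh dynamic variable.

```python
import itertools

# Hypothetical two-level sketch: dynamic code is represented as strings,
# static higher-order values as Python functions over code fragments.
fresh_names = (f"x{i}" for i in itertools.count())

def residualize(f):
    # Eta-expand the static function f so it can flow into dynamic code:
    # apply f to a fresh dynamic variable and wrap the result in a lambda.
    x = next(fresh_names)
    return f"(lambda {x}: {f(x)})"

# A static "add one" that builds code when applied to a code fragment:
add1 = lambda e: f"({e} + 1)"
print(residualize(add1))  # (lambda x0: (x0 + 1))
```

The residualized text denotes the same function as `add1`, which is the sense in which the eta-redex preserves static data flow across a dynamic context.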
Screamer: A Portable Efficient Implementation of Nondeterministic Common Lisp
 University of Pennsylvania, Institute for
, 1993
Abstract

Cited by 33 (5 self)
Nondeterministic Lisp is a simple extension of Lisp which provides automatic backtracking. Nondeterminism allows concise description of many search tasks which form the basis of much AI research. This paper discusses Screamer, an efficient implementation of nondeterministic Lisp as a fully portable extension of Common Lisp. In this paper we present the basic nondeterministic Lisp constructs, motivate the utility of the language via numerous short examples, and discuss the compilation techniques. Supported in part by an AT&T Bell Laboratories Ph.D. scholarship to the author, by a Presidential Young Investigator Award to Professor Robert C. Berwick under National Science Foundation Grant DCR 85552543, by a grant from the Siemens Corporation, and by the Kapor Family Foundation. Also supported in part by ARO grant DAAL 0389C0031, by DARPA grant N0001490J1863, by NSF grant IRI 90 16592, and by Ben Franklin grant 91S.3078C1. † Supported in part by the Advanced Resea...
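The flavor of nondeterministic choice with automatic backtracking can be approximated in plain Python by exhaustive search (an illustrative sketch only; Screamer itself compiles nondeterministic constructs into efficient, fully portable Common Lisp rather than enumerating like this):

```python
from itertools import product

def amb_solve(domains, constraint):
    # Try every combination of choices; a failing constraint "backtracks"
    # to the next combination, and the first success is returned.
    for choice in product(*domains):
        if constraint(*choice):
            return choice
    return None  # every branch failed

# Find a Pythagorean triple by "nondeterministic" choice over 1..20:
ns = range(1, 21)
print(amb_solve([ns, ns, ns], lambda a, b, c: a < b and a*a + b*b == c*c))
# (3, 4, 5)
```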
A rational deconstruction of Landin’s SECD machine
 Implementation and Application of Functional Languages, 16th International Workshop, IFL’04, number 3474 in Lecture Notes in Computer Science
, 2004
Abstract

Cited by 33 (20 self)
Abstract. Landin’s SECD machine was the first abstract machine for applicative expressions, i.e., functional programs. Landin’s J operator was the first control operator for functional languages, and was specified by an extension of the SECD machine. We present a family of evaluation functions corresponding to this extension of the SECD machine, using a series of elementary transformations (transformation into continuation-passing style (CPS) and defunctionalization, chiefly) and their left inverses (transformation into direct style and refunctionalization). To this end, we modernize the SECD machine into a bisimilar one that operates in lockstep with the original one but that (1) does not use a data stack and (2) uses the caller-save rather than the callee-save convention for environments. We also identify that the dump component of the SECD machine is managed in a callee-save way. The caller-save counterpart of the modernized SECD machine precisely corresponds to Thielecke’s double-barrelled continuations and to Felleisen’s encoding of J in terms of call/cc. We then variously characterize the J operator in terms of CPS and in terms of delimited-control operators in the CPS hierarchy. As a byproduct, we also present several reduction semantics for applicative expressions.
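For readers unfamiliar with the SECD machine, a minimal sketch in Python runs as follows (a toy reconstruction, not Landin's or the paper's formulation: it keeps the data stack the paper removes, uses a dict for the environment, and tags terms with tuples):

```python
def run_secd(program):
    # S = value stack, E = environment, C = control list, D = dump.
    s, e, c, d = [], {}, [program], []
    while True:
        if not c:
            if not d:
                return s[-1]
            # Function return: pop a dump frame and push the result
            # on the caller's stack (the dump is managed callee-save).
            s0, e0, c0 = d.pop()
            s0.append(s.pop())
            s, e, c = s0, e0, c0
            continue
        t = c.pop(0)
        if t[0] == "lit":
            s.append(t[1])
        elif t[0] == "var":
            s.append(e[t[1]])
        elif t[0] == "lam":                 # build a closure
            s.append(("clo", t[1], t[2], e))
        elif t[0] == "app":                 # evaluate operator, then operand
            c = [t[1], t[2], ("ap",)] + c
        else:                               # ("ap",): apply the closure on top
            arg, (_, x, body, env) = s.pop(), s.pop()
            d.append((s, e, c))
            s, e, c = [], {**env, x: arg}, [body]

# ((lambda x. x) 42) evaluates to 42:
identity = ("lam", "x", ("var", "x"))
print(run_secd(("app", identity, ("lit", 42))))  # 42
```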
The Impact of the Lambda Calculus in Logic and Computer Science
 BULLETIN OF SYMBOLIC LOGIC
, 1997
Abstract

Cited by 28 (1 self)
One of the most important contributions of A. Church to logic is his invention of the lambda calculus. We present the genesis of this theory and its two major areas of application: the representation of computations and the resulting functional programming languages on the one hand and the representation of reasoning and the resulting systems of computer mathematics on the other hand.
Syntactic Accidents in Program Analysis: On the Impact of the CPS Transformation
 Journal of Functional Programming
, 2000
Abstract

Cited by 28 (9 self)
Our results formalize and confirm a folklore theorem about traditional binding-time analysis, namely that CPS has a positive effect on binding times. What may be more surprising is that the benefit does not arise from a standard refinement of program analysis, as, for instance, duplicating continuations.
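The CPS transformation in question can be illustrated on a small hand-written example (a sketch only, not the output of any particular transformer or the paper's analysis): every intermediate result is named and passed to an explicit continuation.

```python
def fib(n):
    # Direct style: intermediate results flow implicitly.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

def fib_cps(n, k):
    # Continuation-passing style: each intermediate result is handed
    # to a continuation k, making the evaluation order explicit.
    if n < 2:
        return k(n)
    return fib_cps(n - 1,
                   lambda v1: fib_cps(n - 2,
                                      lambda v2: k(v1 + v2)))

print(fib(10), fib_cps(10, lambda v: v))  # 55 55
```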
From reduction-based to reduction-free normalization
 Proceedings of the Fourth International Workshop on Reduction Strategies in Rewriting and Programming (WRS'04)
, 2004
Abstract

Cited by 28 (13 self)
We document an operational method to construct reduction-free normalization functions. Starting from a reduction-based normalization function from a reduction semantics, i.e., the iteration of a one-step reduction function, we successively subject it to refocusing (i.e., deforestation of the intermediate successive terms in the reduction sequence), equational simplification, refunctionalization (i.e., the converse of defunctionalization), and direct-style transformation (i.e., the converse of the CPS transformation), ending with a reduction-free normalization function of the kind usually crafted by hand. We treat in detail four simple examples: calculating arithmetic expressions, recognizing Dyck words, normalizing lambda-terms with explicit substitutions and call/cc, and flattening binary trees. The overall method builds on previous work by the author and his students on a syntactic correspondence between reduction semantics and abstract machines and on a functional correspondence between evaluators and abstract machines. The measure of success of these two correspondences is that each of the interderived semantic artifacts (i.e., man-made constructs) could plausibly have been written by hand, as is the actual case for several ones derived here.
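The first of the four examples, calculating arithmetic expressions, can be sketched as follows (a Python paraphrase under assumed tuple syntax for expressions, not the paper's development): the reduction-based normalizer iterates a one-step contraction, while the reduction-free one is a compositional evaluator computing the same normal form directly.

```python
def step(t):
    # One-step leftmost-innermost reduction of ("+"|"*", left, right) trees.
    op, l, r = t
    if isinstance(l, tuple):
        return (op, step(l), r)
    if isinstance(r, tuple):
        return (op, l, step(r))
    return l + r if op == "+" else l * r   # contract a redex of literals

def normalize_by_reduction(t):
    # Reduction-based: iterate the one-step function until a value appears.
    while not isinstance(t, int):
        t = step(t)
    return t

def normalize_by_evaluation(t):
    # Reduction-free: evaluate compositionally, no intermediate terms built.
    if isinstance(t, int):
        return t
    op, l, r = t
    l, r = normalize_by_evaluation(l), normalize_by_evaluation(r)
    return l + r if op == "+" else l * r

e = ("+", ("*", 2, 3), ("+", 4, 1))
print(normalize_by_reduction(e), normalize_by_evaluation(e))  # 11 11
```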
Thunks and the λ-calculus
 IN THE JOURNAL OF FUNCTIONAL PROGRAMMING. RS-97-6, OLIVIER DANVY AND ULRIK
, 1997
Abstract

Cited by 27 (10 self)
Plotkin, in his seminal article Call-by-name, call-by-value and the λ-calculus, formalized evaluation strategies and simulations using operational semantics and continuations. In particular, he showed how call-by-name evaluation could be simulated under call-by-value evaluation and vice versa. Since Algol 60, however, call-by-name is both implemented and simulated with thunks rather than with continuations. We recast
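A thunk-based simulation of call-by-name under a call-by-value host can be sketched in Python (an illustrative sketch under the usual convention that a thunk is a nullary function, not the article's formal simulation):

```python
def force(thunk):
    # A thunk suspends a computation; forcing it triggers evaluation.
    return thunk()

def cbn_if(condition, then_thunk, else_thunk):
    # Both branches arrive suspended, so only the taken branch is ever
    # evaluated, as under call-by-name.
    return force(then_thunk) if condition else force(else_thunk)

def diverge():
    return diverge()  # loops forever if evaluated under call-by-value

# The diverging branch is never forced, so this terminates:
print(cbn_if(True, lambda: 42, lambda: diverge()))  # 42
```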
Defunctionalized interpreters for programming languages
, 2008
Abstract

Cited by 26 (4 self)
This document illustrates how functional implementations of formal semantics (structural operational semantics, reduction semantics, small-step and big-step abstract machines, natural semantics, and denotational semantics) can be transformed into each other. These transformations were foreshadowed by Reynolds in Definitional Interpreters for Higher-Order Programming Languages for functional implementations of denotational semantics, natural semantics, and big-step abstract machines using closure conversion, CPS transformation, and defunctionalization. Over the last few years, the author and his students have further observed that machines are related using fusion by fixed-point promotion and that functional implementations of reduction semantics and of small-step abstract machines are related using refocusing and transition
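Defunctionalization, one of the transformations named above, can be sketched by turning the continuations of a CPS factorial into first-order data together with an apply function (a toy instance, not one of the document's interpreters):

```python
def fact_cps(n, k):
    # Higher-order version: the continuation k is a function.
    return k(1) if n == 0 else fact_cps(n - 1, lambda v: k(n * v))

def fact_def(n, k):
    # Defunctionalized version: continuations become tagged tuples,
    # one constructor per lambda abstraction that fact_cps could build.
    return apply_cont(k, 1) if n == 0 else fact_def(n - 1, ("mul", n, k))

def apply_cont(k, v):
    # The apply function dispatches on the continuation's constructor.
    if k == ("id",):
        return v
    _, n, rest = k
    return apply_cont(rest, n * v)

print(fact_cps(5, lambda v: v), fact_def(5, ("id",)))  # 120 120
```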
The Occurrence of Continuation Parameters in CPS Terms
, 1995
Abstract

Cited by 25 (18 self)
We prove an occurrence property about formal parameters of continuations in Continuation-Passing Style (CPS) terms that have been automatically produced by CPS transformation of pure, call-by-value terms. Essentially, parameters of continuations obey a stack-like discipline. This property was introduced, but not formally proven, in an earlier work on the Direct-Style transformation (the inverse of the CPS transformation). The proof has been implemented in Elf, a constraint logic programming language based on the logical framework LF. In fact, it was the implementation that inspired the proof. Thus this note also presents a case study of machine-assisted proof discovery. All the programs are available at ftp.daimi.aau.dk:pub/danvy/Programs/danvypfenningElf93.tar.gz and ftp.cs.cmu.edu:user/fp/papers/cpsocc95.tar.gz. Most of the research reported here was carried out while the first author visited Carnegie Mellon University in the Spring of 1993. Current address: Olivier Danvy, Ny Munkeg...
An operational foundation for delimited continuations in the CPS hierarchy
 Logical Methods in Computer Science
, 2005