Results 1 - 7 of 7
Reasoning about Programs in Continuation-Passing Style
 In Lisp and Symbolic Computation
"... Plotkin's v calculus for callbyvalue programs is weaker than the fij calculus for the same programs in continuationpassing style (CPS). To identify the callby value axioms that correspond to fij on CPS terms, we define a new CPS transformation and an inverse mapping, both of which are interes ..."
Abstract

Cited by 161 (13 self)
Plotkin's λv-calculus for call-by-value programs is weaker than the βη-calculus for the same programs in continuation-passing style (CPS). To identify the call-by-value axioms that correspond to βη on CPS terms, we define a new CPS transformation and an inverse mapping, both of which are interesting in their own right. Using the new CPS transformation, we determine the precise language of CPS terms closed under βη-transformations, as well as the call-by-value axioms that correspond to the so-called administrative βη-reductions on CPS terms. Using the inverse mapping, we map the remaining β and η equalities on CPS terms to axioms on call-by-value terms. On the pure (constant-free) set of terms, the resulting set of axioms is equivalent to Moggi's computational λ-calculus. If the call-by-value language includes the control operators abort and call-with-current-continuation, the axioms are equivalent to an extension of Felleisen et al.'s λv-C calculus and to the equational subtheory of Talcott's logic IOCC.
Contents
1 Compiling with and without Continuations 4
2 λ-Calculi and Semantics 7
3 The Origins and Practice of CPS 10
3.1 The Original Encoding 10
3.2 The Universe of CPS Terms 11
4 A Compacting CPS Transformation 13
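As an illustration of the naive call-by-value CPS transformation the abstract refers to, here is a minimal sketch in Python over a toy term language. The tuple encoding, the `gensym` helper, and the `show` printer are hypothetical scaffolding, not the paper's machinery:

```python
import itertools

_fresh = itertools.count()

def gensym(prefix):
    """Generate a fresh variable name (hypothetical helper)."""
    return f"{prefix}{next(_fresh)}"

def cps(term):
    """Plotkin-style call-by-value CPS transformation over a toy term
    language: variables are strings, ('lam', x, body) is abstraction,
    ('app', f, a) is application."""
    if isinstance(term, str):                 # [[x]] = \k. k x
        k = gensym("k")
        return ('lam', k, ('app', k, term))
    if term[0] == 'lam':                      # [[\x.M]] = \k. k (\x. [[M]])
        _, x, body = term
        k = gensym("k")
        return ('lam', k, ('app', k, ('lam', x, cps(body))))
    if term[0] == 'app':                      # [[M N]] = \k. [[M]] (\m. [[N]] (\n. m n k))
        _, f, a = term
        k, m, n = gensym("k"), gensym("m"), gensym("n")
        return ('lam', k,
                ('app', cps(f),
                 ('lam', m,
                  ('app', cps(a),
                   ('lam', n, ('app', ('app', m, n), k))))))
    raise ValueError(f"not a term: {term!r}")

def show(t):
    """Render a term, writing lambda as a backslash."""
    if isinstance(t, str):
        return t
    if t[0] == 'lam':
        return f"(\\{t[1]}. {show(t[2])})"
    return f"({show(t[1])} {show(t[2])})"

print(show(cps(('app', ('lam', 'x', 'x'), 'y'))))
```

The many abstractions introduced purely by the translation are the "administrative" redexes whose call-by-value counterparts the paper identifies.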
Smartest Recompilation
 In ACM Symp. on Principles of Programming Languages
, 1993
"... To separately compile a program module in traditional staticallytyped languages, one has to manually write down an import interface which explicitly specifies all the external symbols referenced in the module. Whenever the definitions of these external symbols are changed, the module has to be reco ..."
Abstract

Cited by 61 (3 self)
To separately compile a program module in traditional statically typed languages, one has to manually write down an import interface which explicitly specifies all the external symbols referenced in the module. Whenever the definitions of these external symbols are changed, the module has to be recompiled. In this paper, we present an algorithm which can automatically infer the "minimum" import interface for any module in languages based on the Damas-Milner type discipline (e.g., ML). By "minimum", we mean that the interface specifies a set of assumptions (for external symbols) that are just enough to make the module typecheck and compile. By compiling each module using its "minimum" import interface, we get a separate compilation method that can achieve the following optimal property: A compilation unit never needs to be recompiled unless its own implementation changes.
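The idea of inferring "just enough" assumptions about external symbols can be sketched, in a much weaker untyped form, with Python's `ast` module: collect the names a module uses but never binds. This recovers only the symbol set, whereas the paper infers principal Damas-Milner types for each such symbol; the function name is hypothetical.

```python
import ast
import builtins

def free_names(source):
    """Return names a module references but does not bind -- an untyped
    analogue of the 'minimum' import interface: the external symbols
    whose definitions the module depends on."""
    tree = ast.parse(source)
    bound, used = set(), set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Name):
            # Assignments and deletions bind; loads are uses.
            (bound if isinstance(node.ctx, (ast.Store, ast.Del)) else used).add(node.id)
        elif isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef, ast.ClassDef)):
            bound.add(node.name)
        elif isinstance(node, ast.arg):
            bound.add(node.arg)
        elif isinstance(node, ast.alias):
            bound.add((node.asname or node.name).split('.')[0])
    return sorted(used - bound - set(dir(builtins)))

print(free_names("def area(r):\n    return pi * r * r\n"))  # pi is external
```

Recompilation would then be triggered only when the inferred interface for some used symbol changes, not on every edit to the exporting module.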
An Implementation of Standard ML Modules
 In ACM Conf. on Lisp and Functional Programming
, 1988
"... Standard ML includes a set of module constructs that support programming in the large. These constructs extend ML's basic polymorphic type system by introducing the dependent types of Martin Lo"f's Intuitionistic Type Theory. This paper discusses the problems involved in implementing Standard ML's m ..."
Abstract

Cited by 29 (3 self)
Standard ML includes a set of module constructs that support programming in the large. These constructs extend ML's basic polymorphic type system by introducing the dependent types of Martin-Löf's Intuitionistic Type Theory. This paper discusses the problems involved in implementing Standard ML's modules and describes a practical, efficient solution to these problems. The representations and algorithms of this implementation were inspired by a detailed formal semantics of Standard ML developed by Milner, Tofte, and Harper. The implementation is part of a new Standard ML compiler that is written in Standard ML using the module system. March 11, An Implementation of Standard ML Modules David MacQueen AT&T Bell Laboratories Murray Hill, NJ 07974 1. Introduction An important part of the revision of ML that led to the Standard ML language was the inclusion of module facilities for the support of "programming in the large." The design of these facilities went through several versions [...
The Formal Relationship Between Direct and Continuation-Passing Style Optimizing Compilers: A Synthesis of Two Paradigms
, 1994
"... Compilers for higherorder programming languages like Scheme, ML, and Lisp can be broadly characterized as either "direct compilers" or "continuationpassing style (CPS) compilers", depending on their main intermediate representation. Our central result is a precise correspondence between the two co ..."
Abstract

Cited by 15 (0 self)
Compilers for higher-order programming languages like Scheme, ML, and Lisp can be broadly characterized as either "direct compilers" or "continuation-passing style (CPS) compilers", depending on their main intermediate representation. Our central result is a precise correspondence between the two compilation strategies. Starting from
An Open-Ended Data Representation Model for EuLisp
 In LFP '88, ACM Symposium on Lisp and Functional Programming
, 1988
"... The goal of this paper is to describe an openended type system for Lisp with explicit and full control of bitlevel data representations. This description uses a reflective architecture based on a metatype facility. This lowlevel formalism solves the problem of an harmonious design of a class taxo ..."
Abstract

Cited by 8 (4 self)
The goal of this paper is to describe an open-ended type system for Lisp with explicit and full control of bit-level data representations. This description uses a reflective architecture based on a meta-type facility. This low-level formalism solves the problem of a harmonious design of a class taxonomy inside a type system. A prototype for this framework has been written in LeLisp and is used to build the integrated type and object systems of the EuLisp proposal. Introduction Since the first circular definition of Lisp [8], self-descriptions have become more and more precise [11, 15], until they are now sufficient to serve as valuable implementation tools [2, 7, 14, 12]. But while control structures, variable scope, and allocation extent are well-studied aspects, the choice of data structures presents much difficulty. Accurate and powerful tools are still needed to (i) design bit-level representations for all Lisp primitive data types (cons cells, arrays, ...) or more hidden embedded data structures (stackf...
Specification Framework for Data Aggregate Languages
, 1990
"... The representation of data aggregates is fundamentally made of concatenation and/or repetition of smaller representations. These structurations may be arbitrarily composed to form complex aggregates of primitive representations such as characters, integers or pointers. Knowing its structuration allo ..."
Abstract
The representation of data aggregates is fundamentally made of concatenation and/or repetition of smaller representations. These structurations may be arbitrarily composed to form complex aggregates of primitive representations such as characters, integers, or pointers. Knowing its structuration allows the interpretation of a sequence of contiguous bits as an instance of a type represented by the given structuration. This paper presents a formalization of data representations; it identifies and analyses the features which describe primitive representations as well as the representation structurers which allow one to compose data representations. These features can be very naturally expressed in terms of objects, classes, and methods. Our model makes structurations fully explicit: some computations can be performed on them. We will show how to derive a general data inspector and a garbage collector from them. We also present an extension of the subclass concept based on concatenation of represent...
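The concatenation and repetition structurers described above can be sketched in Python. All class and method names here are hypothetical illustrations of the idea, not the paper's design; the `decode` methods play the role of the derived general data inspector, computed from the explicit structuration:

```python
import struct

class Prim:
    """A primitive representation: a fixed-size bit-level layout,
    described here by a struct format string."""
    def __init__(self, fmt):
        self.fmt, self.size = fmt, struct.calcsize(fmt)
    def decode(self, buf, off=0):
        return struct.unpack_from(self.fmt, buf, off)[0]

class Concat:
    """Concatenation structurer: named fields laid out contiguously."""
    def __init__(self, **fields):
        self.fields = fields
        self.size = sum(r.size for r in fields.values())
    def decode(self, buf, off=0):
        out = {}
        for name, rep in self.fields.items():
            out[name] = rep.decode(buf, off)
            off += rep.size
        return out

class Repeat:
    """Repetition structurer: n contiguous copies of one representation."""
    def __init__(self, rep, n):
        self.rep, self.n, self.size = rep, n, rep.size * n
    def decode(self, buf, off=0):
        return [self.rep.decode(buf, off + i * self.rep.size)
                for i in range(self.n)]

# A 2-D point is the concatenation of two 32-bit integers; a triangle is
# three repeated points. Because structurations are explicit objects,
# sizes and decoders are computed from them rather than hard-coded.
point = Concat(x=Prim('<i'), y=Prim('<i'))
triangle = Repeat(point, 3)
print(triangle.size, triangle.decode(struct.pack('<6i', 0, 0, 4, 0, 0, 3)))
```

A garbage collector could be derived the same way: a structuration that records which fields are pointers tells the collector exactly which offsets to trace.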
The Essence of Compiling with Continuations
"... In order to simplify the compilation process, many compilers for higherorder languages use the continuationpassing style (CPS) transformation in a first phase to generate an intermediate representation of the source program. The salient aspect of this intermediate form is that all procedures take a ..."
Abstract
In order to simplify the compilation process, many compilers for higher-order languages use the continuation-passing style (CPS) transformation in a first phase to generate an intermediate representation of the source program. The salient aspect of this intermediate form is that all procedures take an argument that represents the rest of the computation (the "continuation"). Since the naive CPS transformation considerably increases the size of programs, CPS compilers perform reductions to produce a more compact intermediate representation. Although often implemented as a part of the CPS transformation, this step is conceptually a second phase. Finally, code generators for typical CPS compilers treat continuations specially in order to optimize the interpretation of continuation parameters. A thorough analysis of the abstract machine for CPS terms shows that the actions of the code generator invert the naive CPS translation step. Put differently, the combined effect of the three phases is equivalent to a source-to-source transformation that simulates the compaction phase. Thus, fully developed CPS compilers do not need to employ the CPS transformation but can achieve the same results with a simple source-level transformation. 1 Compiling with Continuations A number of prominent compilers for applicative higher-order programming languages use the language of continuation-passing style (CPS) terms as their intermediate representation for programs [2, 14, 18, 19]. This strategy apparently offers two major advantages. First, Plotkin [16] showed that the value calculus based on
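The compaction phase the abstract describes can also be folded into the transformation itself. Below is a minimal one-pass CPS sketch in Python, in the style of Danvy and Filinski's higher-order formulation rather than this paper's own algorithm: continuations are represented as Python functions, so administrative redexes are contracted at transformation time instead of by a separate reduction pass. The term encoding and helper names are hypothetical.

```python
import itertools

_fresh = itertools.count()

def gensym(prefix):
    """Fresh variable names (hypothetical helper)."""
    return f"{prefix}{next(_fresh)}"

def cps1(term, k):
    """One-pass call-by-value CPS transformation. `k` is a *meta-level*
    continuation (a Python function), so no administrative lambdas are
    built for it. Terms: strings are variables, ('lam', x, body) is
    abstraction, ('app', f, a) is application."""
    if isinstance(term, str):
        return k(term)
    if term[0] == 'lam':
        _, x, body = term
        c = gensym('k')
        # Translated functions take their continuation c as a second,
        # curried argument and send their result to it.
        return k(('lam', x, ('lam', c, cps1(body, lambda v: ('app', c, v)))))
    if term[0] == 'app':
        _, f, a = term
        v = gensym('v')
        # Only here is the meta-continuation reified into object syntax.
        return cps1(f, lambda m:
                    cps1(a, lambda n:
                         ('app', ('app', m, n), ('lam', v, k(v)))))
    raise ValueError(f"not a term: {term!r}")

def cps_program(term):
    """Transform a whole program, finishing with a top-level 'halt'."""
    return cps1(term, lambda v: ('app', 'halt', v))

print(cps_program(('app', 'f', 'a')))
```

For `f a` this yields the compact term `(f a) (\v. halt v)` directly, where the naive translation would first build several administrative redexes around it.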