Results 1 - 10 of 294
The Revised Report on the Syntactic Theories of Sequential Control and State
- THEORETICAL COMPUTER SCIENCE, 1992
"... The syntactic theories of control and state are conservative extensions of the v -calculus for equational reasoning about imperative programming facilities in higher-order languages. Unlike the simple v -calculus, the extended theories are mixtures of equivalence relations and compatible congruen ..."
Abstract
-
Cited by 292 (36 self)
- Add to MetaCart
The syntactic theories of control and state are conservative extensions of the λv-calculus for equational reasoning about imperative programming facilities in higher-order languages. Unlike the simple λv-calculus, the extended theories are mixtures of equivalence relations and compatible congruence relations on the term language, which significantly complicates the reasoning process. In this paper we develop fully compatible equational theories of the same imperative higher-order programming languages. The new theories subsume the original calculi of control and state and satisfy the usual Church-Rosser and Standardization Theorems. With the new calculi, equational reasoning about imperative programs becomes as simple as reasoning about functional programs.
A New Deconstructive Logic: Linear Logic
1995
"... The main concern of this paper is the design of a noetherian and confluent normalization for LK 2 (that is, classical second order predicate logic presented as a sequent calculus). The method we present is powerful: since it allows us to recover as fragments formalisms as seemingly different a ..."
Abstract
-
Cited by 127 (11 self)
- Add to MetaCart
The main concern of this paper is the design of a noetherian and confluent normalization for LK2 (that is, classical second order predicate logic presented as a sequent calculus). The method we present is powerful: since it allows us to recover as fragments formalisms as seemingly different as Girard's LC and Parigot's λμ, FD ([9, 11, 27, 31]), delineates other viable systems as well, and gives means to extend the Krivine/Leivant paradigm of `programming-with-proofs' ([22, 23]) to classical logic; it is painless: since we reduce strong normalization and confluence to the same properties for linear logic (for non-additive proof nets, to be precise) using appropriate embeddings (so-called decorations); it is unifying: it organizes known solutions in a simple pattern that makes apparent the how and why of their making. A comparison of our method to that of embedding LK into LJ (intuitionistic sequent calculus) brings to the fore the latter's defects for these `deconstructi...
The Design and Implementation of Typed Scheme
2008
"... When scripts in untyped languages grow into large programs, maintaining them becomes difficult. A lack of types in typical scripting languages means that programmers must (re)discover critical pieces of design information every time they wish to change a program. This analysis step both slows down t ..."
Abstract
-
Cited by 96 (16 self)
- Add to MetaCart
When scripts in untyped languages grow into large programs, maintaining them becomes difficult. A lack of types in typical scripting languages means that programmers must (re)discover critical pieces of design information every time they wish to change a program. This analysis step both slows down the maintenance process and may even introduce mistakes due to the violation of undiscovered invariants. This paper presents Typed Scheme, an explicitly typed extension of an untyped scripting language. Its type system is based on the novel notion of occurrence typing, which we formalize and mechanically prove sound. The implementation of Typed Scheme additionally borrows elements from a range of approaches, including recursive types, true unions and subtyping, plus polymorphism combined with a modicum of local inference. Initial experiments with the implementation suggest that Typed Scheme naturally accommodates the programming style of the underlying scripting language, at least for the first few thousand lines of ported code.
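A minimal sketch of the idea behind occurrence typing, rendered here in Haskell with an invented sum type NumOrStr standing in for Typed Scheme's true, untagged unions; pattern matching plays the role that a predicate test such as number? plays in Typed Scheme, narrowing what is known about the value in each branch:

    -- Invented illustration: a "union" of numbers and strings as a tagged sum.
    data NumOrStr = N Double | S String

    describe :: NumOrStr -> String
    describe v = case v of
      N n -> "a number, incremented: " ++ show (n + 1)   -- v known numeric here
      S s -> "a string of length " ++ show (length s)    -- v known textual here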
A Curry-Howard foundation for functional computation with control
- In Proceedings of ACM SIGPLAN-SIGACT Symposium on Principles of Programming Languages, 1997
"... We introduce the type theory ¯ v , a call-by-value variant of Parigot's ¯-calculus, as a Curry-Howard representation theory of classical propositional proofs. The associated rewrite system is Church-Rosser and strongly normalizing, and definitional equality of the type theory is consistent, com ..."
Abstract
-
Cited by 93 (3 self)
- Add to MetaCart
(Show Context)
We introduce the type theory λμv, a call-by-value variant of Parigot's λμ-calculus, as a Curry-Howard representation theory of classical propositional proofs. The associated rewrite system is Church-Rosser and strongly normalizing, and definitional equality of the type theory is consistent, compatible with cut, congruent and decidable. The attendant call-by-value programming language μPCFv is obtained from λμv by augmenting it by basic arithmetic, conditionals and fixpoints. We study the behavioural properties of μPCFv and show that, though simple, it is a very general language for functional computation with control: it can express all the main control constructs such as exceptions and first-class continuations. Proof-theoretically the dual λμv-constructs of naming and μ-abstraction witness the introduction and elimination rules of absurdity respectively. Computationally they give succinct expression to a kind of generic (forward) "jump" operator, which may be regarded as a unif...
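The "generic jump" reading can be sketched with Haskell's continuation monad, where callCC captures the current continuation and an application of it abandons the rest of the block; the safeDiv example below is invented for illustration and works in Haskell rather than in the paper's λμ-based μPCFv:

    import Control.Monad (when)
    import Control.Monad.Cont (Cont, callCC, runCont)

    -- An exception-like early exit built from a captured continuation.
    safeDiv :: Int -> Int -> Cont r (Either String Int)
    safeDiv x y = callCC $ \throw -> do
      when (y == 0) (throw (Left "division by zero"))  -- jump out of the block
      return (Right (x `div` y))

    main :: IO ()
    main = print (runCont (safeDiv 7 0) id)   -- Left "division by zero"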
A Generic Account of Continuation-Passing Styles
- Proceedings of the Twenty-first Annual ACM Symposium on Principles of Programming Languages, 1994
"... We unify previous work on the continuation-passing style (CPS) transformations in a generic framework based on Moggi's computational meta-language. This framework is used to obtain CPS transformations for a variety of evaluation strategies and to characterize the corresponding administrative re ..."
Abstract
-
Cited by 91 (35 self)
- Add to MetaCart
(Show Context)
We unify previous work on the continuation-passing style (CPS) transformations in a generic framework based on Moggi's computational meta-language. This framework is used to obtain CPS transformations for a variety of evaluation strategies and to characterize the corresponding administrative reductions and inverse transformations. We establish generic formal connections between operational semantics and equational theories. Formal properties of transformations for specific evaluation orders follow as corollaries. Essentially, we factor transformations through Moggi's computational meta-language. Mapping λ-terms into the meta-language captures computational properties (e.g., partiality, strictness) and evaluation order explicitly in both the term and the type structure of the meta-language. The CPS transformation is then obtained by applying a generic transformation from terms and types in the meta-language to CPS terms and types, based on a typed term representation of the continuation ...
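A minimal Haskell sketch of factoring through a computational meta-language, assuming an invented toy expression type Expr: terms are first interpreted into an arbitrary monad (the meta-language step), and instantiating that monad with Cont then yields a CPS-style evaluator. This mirrors the idea only; it is not the paper's transformation on terms and types:

    import Control.Monad.Cont (Cont, runCont)

    -- A toy object language, invented for the illustration.
    data Expr = Lit Int | Add Expr Expr | If0 Expr Expr Expr

    -- Step 1: interpret into an arbitrary monad, playing the role of
    -- Moggi's computational meta-language; evaluation order is made
    -- explicit by the monadic sequencing.
    evalM :: Monad m => Expr -> m Int
    evalM (Lit n)     = return n
    evalM (Add a b)   = do x <- evalM a; y <- evalM b; return (x + y)
    evalM (If0 c t e) = do x <- evalM c; if x == 0 then evalM t else evalM e

    -- Step 2: choosing the continuation monad yields a CPS evaluator;
    -- the administrative plumbing is handled by the monad instance.
    evalCPS :: Expr -> (Int -> r) -> r
    evalCPS e = runCont (evalM e)

    -- e.g. evalCPS (Add (Lit 1) (Lit 2)) show  ==  "3"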
Representing control: a study of the CPS transformation
1992
"... This paper investigates the transformation of v -terms into continuation-passing style (CPS). We show that by appropriate j-expansion of Fischer and Plotkin's two-pass equational specification of the CPS transform, we can obtain a static and context-free separation of the result terms into ..."
Abstract
-
Cited by 90 (8 self)
- Add to MetaCart
This paper investigates the transformation of λv-terms into continuation-passing style (CPS). We show that by appropriate η-expansion of Fischer and Plotkin's two-pass equational specification of the CPS transform, we can obtain a static and context-free separation of the result terms into "essential" and "administrative" constructs. Interpreting the former as syntax builders and the latter as directly executable code, we obtain a simple and efficient one-pass transformation algorithm, easily extended to conditional expressions, recursive definitions, and similar constructs. This new transformation algorithm leads to a simpler proof of Plotkin's simulation and indifference results. Further we show how CPS-based control operators similar to but more general than Scheme's call/cc can be naturally accommodated by the new transformation algorithm. To demonstrate the expressive power of these operators, we use them to present an equivalent but even more concise formulation of t...
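The one-pass idea, representing the administrative continuation as a function of the meta-language so that administrative redexes are contracted during transformation, can be sketched as follows in Haskell. The Term type, the integer name supply, and the naming scheme are invented for the illustration (generated names are assumed not to clash with source variables), and the sketch covers only the pure λ-calculus core, not the conditional, recursive, or control-operator extensions discussed in the paper:

    -- Minimal lambda terms (constructors invented for the sketch).
    data Term = Var String | Lam String Term | App Term Term
      deriving Show

    -- One-pass call-by-value CPS transform in the spirit described above:
    -- the "administrative" continuation is a Haskell function, so no
    -- administrative redexes appear in the output; the "essential"
    -- continuations are built as object-level lambdas. An Int supply
    -- provides fresh names.
    cps :: Term -> Int -> (Term -> Int -> Term) -> Term
    cps (Var x)     n k = k (Var x) n
    cps (Lam x e)   n k =
      let kv = "k" ++ show n
      in  k (Lam x (Lam kv (cps e (n + 1) (\v _ -> App (Var kv) v)))) (n + 1)
    cps (App e1 e2) n k =
      cps e1 n (\v1 n1 ->
        cps e2 n1 (\v2 n2 ->
          let a = "a" ++ show n2
          in  App (App v1 v2) (Lam a (k (Var a) (n2 + 1)))))

    -- Whole programs receive an initial continuation.
    cpsProgram :: Term -> Term
    cpsProgram e = Lam "halt" (cps e 0 (\v _ -> App (Var "halt") v))

For instance, cpsProgram (Lam "x" (Var "x")) yields the term λhalt. halt (λx. λk0. k0 x), with no administrative redexes left to contract.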
Explicit Polymorphism and CPS Conversion
- IN TWENTIETH ACM SYMPOSIUM ON PRINCIPLES OF PROGRAMMING LANGUAGES, 1992
"... We study the typing properties of CPS conversion for an extension of F ! with control operators. Two classes of evaluation strategies are considered, each with call-by-name and call-by-value variants. Under the "standard" strategies, constructor abstractions are values, and constructor app ..."
Abstract
-
Cited by 60 (9 self)
- Add to MetaCart
We study the typing properties of CPS conversion for an extension of Fω with control operators. Two classes of evaluation strategies are considered, each with call-by-name and call-by-value variants. Under the "standard" strategies, constructor abstractions are values, and constructor applications can lead to non-trivial control effects. In contrast, the "ML-like" strategies evaluate beneath constructor abstractions, reflecting the usual interpretation of programs in languages based on implicit polymorphism. Three continuation-passing style sub-languages are considered, one on which the standard strategies coincide, one on which the ML-like strategies coincide, and one on which all the strategies coincide. Compositional, type-preserving CPS transformation algorithms are given for the standard strategies, resulting in terms on which all evaluation strategies coincide. This has as a corollary the soundness and termination of well-typed programs under the standard evaluation strategies. A similar result is obtained for the ML-like call-by-name strategy. In contrast, such results are obtained for the call-by-value ML-like strategy only for a restricted sub-language in which constructor abstractions are limited to values.
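A hedged sketch of the shape of a call-by-value CPS type translation, written in Haskell with invented names and an abstract answer type r; under the "standard" reading described above, a polymorphic source type is translated so that each instantiation is again supplied with a continuation. This compresses away the careful distinction drawn in the paper between the standard and ML-like strategies:

    {-# LANGUAGE RankNTypes #-}

    -- Continuations over an abstract answer type r.
    type K r a = a -> r

    -- CPS image of a source function type A -> B: the translated value
    -- takes an argument and a continuation for the result.
    newtype CpsFun r a b = CpsFun (a -> K r b -> r)

    -- When constructor abstractions are values, each instantiation of a
    -- polymorphic type is itself given a continuation.
    newtype CpsAll r f = CpsAll (forall t. K r (f t) -> r)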
Explaining Crossover and Superiority as Left-to-Right Evaluation
- LINGUISTICS AND PHILOSOPHY, 2006
"... We present a general theory of scope and binding in which both crossover and superiority violations are ruled out by one key assumption: that natural language expressions are normally evaluated (processed) from left to right. Our theory is an extension of Shan’s (2002) account of multiple-wh questi ..."
Abstract
-
Cited by 59 (14 self)
- Add to MetaCart
We present a general theory of scope and binding in which both crossover and superiority violations are ruled out by one key assumption: that natural language expressions are normally evaluated (processed) from left to right. Our theory is an extension of Shan’s (2002) account of multiple-wh questions, combining continuations (Barker, 2002) and dynamic type-shifting. Like other continuation-based analyses, but unlike most other treatments of crossover or superiority, our analysis is directly compositional (in the sense of, e.g., Jacobson, 1999). In particular, it does not postulate a level of Logical Form or any other representation distinct from surface syntax. One advantage of using continuations is that they are the standard tool for modeling order of evaluation in programming languages. This provides us with a natural and independently motivated characterization of what it means to evaluate expressions from left to right. We give a combinatory categorial grammar that models the syntax and the semantics of quantifier scope and wh-question formation. It allows quantificational binding but not crossover, in-situ wh but not superiority violations. In addition, the analysis automatically accounts for a variety of sentence types involving binding in the presence of pied piping, including reconstruction cases such as "Whose criticism of his_i mother did each person_i resent?"
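The point that continuations model order of evaluation can be made concrete with a small Haskell sketch in the continuation monad, using an invented two-entity domain and relation; sequencing the quantifiers left to right gives the leftmost one widest scope, which illustrates the general mechanism rather than the paper's combinatory categorial grammar:

    import Control.Monad.Cont (Cont, cont, runCont)

    type Entity = String

    domain :: [Entity]
    domain = ["Alice", "Bob"]

    -- Quantifiers as computations over an answer type Bool.
    everyone, someone :: Cont Bool Entity
    everyone = cont (\k -> all k domain)
    someone  = cont (\k -> any k domain)

    likes :: Entity -> Entity -> Bool
    likes x y = (x, y) `elem` [("Alice", "Bob"), ("Bob", "Bob")]

    -- Left-to-right sequencing gives the leftmost quantifier widest scope:
    -- "everyone likes someone" ~ for every x there is some y with likes x y.
    sentence :: Bool
    sentence = runCont (do { x <- everyone; y <- someone; return (likes x y) }) id

With the toy facts above, sentence evaluates to True under the "for every x there is some y" reading that left-to-right sequencing enforces.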
Typed lambda-calculus in classical Zermelo-Fraenkel set theory
- ARCHIVE FOR MATHEMATICAL LOGIC, 2001
"... In this paper, we develop a system of typed lambda-calculus for the Zermelo-Fraenkel set theory, in the framework of classical logic. The first, and the simplest system of typed lambda-calculus is the system of simple types, which uses the intuitionistic propositional calculus, with the only connect ..."
Abstract
-
Cited by 47 (12 self)
- Add to MetaCart
In this paper, we develop a system of typed lambda-calculus for the Zermelo-Fraenkel set theory, in the framework of classical logic. The first, and the simplest, system of typed lambda-calculus is the system of simple types, which uses the intuitionistic propositional calculus, with the only connective →. It is very important, because the well-known Curry-Howard correspondence between proofs and programs was originally discovered with it, and because it enjoys the normalization property: every typed term is strongly normalizable. It was extended to second order intuitionistic logic, in 1970, by J.-Y. Girard [4], under the name of system F, still with the normalization property. More recently, in 1990, the Curry-Howard correspondence was extended to classical logic, following Felleisen and Griffin [6], who discovered that the law of Peirce corresponds to control instructions in functional programming.
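The closing remark, Griffin's observation that the law of Peirce corresponds to a control instruction, can be stated as a one-line Haskell sketch: reading implication as a Kleisli arrow in the continuation monad, callCC inhabits exactly the type shape of Peirce's law (the name peirce is invented for the illustration):

    import Control.Monad.Cont (Cont, callCC)

    -- Peirce's law ((A -> B) -> A) -> A, with implication read as an arrow
    -- in the continuation monad, is realized by callCC.
    peirce :: ((a -> Cont r b) -> Cont r a) -> Cont r a
    peirce = callCC

The captured continuation plays the part of the hypothetical A -> B, which is never used as a function, only as a jump back to the point of capture.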