Results 11–20 of 147
Three approaches to type structure
 In International Joint Conference on Theory and Practice of Software Development (TAPSOFT)
, 1985
Cited by 77 (0 self)
ABSTRACT We examine three disparate views of the type structure of programming languages: Milner's type deduction system and polymorphic let construct, the theory of subtypes and generic operators, and the polymorphic or second-order typed lambda calculus. These approaches are illustrated with a functional language including product, sum and list constructors. The syntactic behavior of types is formalized with type inference rules, but their semantics is treated intuitively.
An observationally complete program logic for imperative higher-order functions
 In Proc. LICS’05
, 2005
Cited by 45 (13 self)
Abstract. We propose a simple compositional program logic for an imperative extension of call-by-value PCF, built on Hoare logic and our preceding work on program logics for pure higher-order functions. A systematic use of names and operations on them allows precise and general description of complex higher-order imperative behaviour. The proof rules of the logic exactly follow the syntax of the language and can cleanly embed, justify and extend the standard proof rules for total correctness of Hoare logic. The logic offers a foundation for a general treatment of aliasing and local state on its basis, with minimal extensions. After establishing soundness, we prove that valid assertions for programs completely characterise their behaviour up to observational congruence, which is proved using a variant of finite canonical forms. The use of the logic is illustrated through reasoning examples which are hard to assert and infer using existing program logics.
Java Jr.: A fully abstract trace semantics for a core Java language
 In ESOP, volume 3444 of LNCS
, 2005
Cited by 37 (0 self)
Abstract. We introduce an expressive yet semantically clean core Java-like language, Java Jr., and provide it with a formal operational semantics based on traces of observable actions which represent interaction across package boundaries. A detailed example based on the Observer Pattern is used to demonstrate the intuitive character of the semantic model. We also show that our semantic trace equivalence is fully abstract with respect to a natural notion of testing equivalence for object systems. This is the first such result for a full class-based OO-language with inheritance.
Games and full abstraction for nondeterministic languages
, 1999
Cited by 36 (3 self)
Abstract Nondeterminism is a pervasive phenomenon in computation. Often it arises as an emergent property of a complex system, typically as the result of contention for access to shared resources. In such circumstances, we cannot always know, in advance, exactly what will happen. In other circumstances, nondeterminism is explicitly introduced as a means of abstracting away from implementation details such as precise command scheduling and control flow. However, the kind of behaviours exhibited by nondeterministic computations can be extremely subtle in comparison to those of their deterministic counterparts and reasoning about such programs is notoriously tricky as a result. It is therefore important to develop semantic tools to improve our understanding of, and aid our reasoning about, such nondeterministic programs. In this thesis, we extend the framework of game semantics to encompass nondeterministic computation. Game semantics is a relatively recent development in denotational semantics; its main novelty is that it views a computation not as a static entity, but rather as a dynamic process of interaction. This perspective makes the theory well-suited to modelling many aspects of computational processes: the original use of game semantics in modelling the simple functional language PCF has subsequently been extended to handle more complex control structures such as references and continuations.
A rational deconstruction of Landin’s SECD machine
 In Implementation and Application of Functional Languages, 16th International Workshop, IFL’04, number 3474 in Lecture Notes in Computer Science
, 2004
Cited by 33 (20 self)
Abstract. Landin’s SECD machine was the first abstract machine for applicative expressions, i.e., functional programs. Landin’s J operator was the first control operator for functional languages, and was specified by an extension of the SECD machine. We present a family of evaluation functions corresponding to this extension of the SECD machine, using a series of elementary transformations (transformation into continuation-passing style (CPS) and defunctionalization, chiefly) and their left inverses (transformation into direct style and refunctionalization). To this end, we modernize the SECD machine into a bisimilar one that operates in lockstep with the original one but that (1) does not use a data stack and (2) uses the caller-save rather than the callee-save convention for environments. We also identify that the dump component of the SECD machine is managed in a callee-save way. The caller-save counterpart of the modernized SECD machine precisely corresponds to Thielecke’s double-barrelled continuations and to Felleisen’s encoding of J in terms of call/cc. We then variously characterize the J operator in terms of CPS and in terms of delimited-control operators in the CPS hierarchy. As a byproduct, we also present several reduction semantics for applicative expressions.
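The two transformations named in this abstract can be illustrated on a much smaller scale than the SECD machine. The sketch below (our own toy example, not from the paper; all names are invented) CPS-transforms a tiny arithmetic evaluator and then defunctionalizes its continuations into a stack of first-order frames, yielding an abstract machine whose frame stack plays the role of a dump:

```python
# Toy illustration of CPS transformation + defunctionalization.
# Terms: ("lit", n) | ("add", t1, t2)

def eval_direct(t):
    # Ordinary compositional evaluator.
    if t[0] == "lit":
        return t[1]
    return eval_direct(t[1]) + eval_direct(t[2])

# Step 1: CPS -- every call takes an explicit continuation k.
def eval_cps(t, k):
    if t[0] == "lit":
        return k(t[1])
    return eval_cps(t[1], lambda v1: eval_cps(t[2], lambda v2: k(v1 + v2)))

# Step 2: defunctionalization -- each lambda above becomes a data
# constructor, and applying a continuation becomes the apply_k dispatch.
# The list of frames is the machine's control stack ("dump").
def eval_machine(t, k=None):
    k = k if k is not None else []            # empty continuation
    if t[0] == "lit":
        return apply_k(k, t[1])
    return eval_machine(t[1], [("add1", t[2])] + k)

def apply_k(k, v):
    if not k:
        return v
    frame, rest = k[0], k[1:]
    if frame[0] == "add1":                    # was: lambda v1: eval_cps(t2, ...)
        return eval_machine(frame[1], [("add2", v)] + rest)
    else:                                     # ("add2", v1): was lambda v2: k(v1 + v2)
        return apply_k(rest, frame[1] + v)

term = ("add", ("lit", 1), ("add", ("lit", 2), ("lit", 3)))
assert eval_direct(term) == eval_cps(term, lambda v: v) == eval_machine(term) == 6
```

Running the three evaluators in the other direction (refunctionalization, then direct-style transformation) recovers `eval_direct` from `eval_machine`, which is the sense in which the paper "deconstructs" an abstract machine into an evaluator.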
References, Local Variables and Operational Reasoning
 In Seventh Annual Symposium on Logic in Computer Science
, 1992
Cited by 32 (4 self)
Adding references (in this paper we regard the following as synonyms: references, program variables, pointers, locations, and unary cells) to a programming language complicates life. Adding them to the simply typed lambda calculus causes the failure of most of the nice mathematical properties and some of the more basic rules (such as η). For example, strong normalization fails, since it is possible, for each provably nonempty function type, to construct a Y combinator for that type. References also interact unpleasantly with polymorphism [34, 35]. They are also troublesome from a denotational point of view, as illustrated by the lack of fully abstract models. For example, in [22] Meyer and Sieber give a series of examples of programs that are operationally equivalent (according to the intended semantics of block-structured Algol-like programs) but which are not given equivalent denotations in traditional denotational semantics. They propose various modifications to the denotational semantics which solve some of these discrepancies, but not all. In [27, 26] a denotational semantics that overcomes some of these problems is presented; however, variations on the seventh example remain problematic. Since numerous proof systems for Algol are sound for the denotational models in question [8, 7, 32, 28, 16, 27, 26], these equivalences, if expressible, must be independent of these systems. The problem which motivated Meyer and Sieber's paper [22] was to provide mathematical justification for the informal but convincing proofs of the operational equivalence of their examples. In this paper we approach the same problem, but from an operational rather than a denotational perspective. This paper accomplishes two goals. Firstly, we present the first-order part of a new logic for reasoning about programs....
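The Y-combinator-via-references construction mentioned in this abstract is often called "Landin's knot". A minimal sketch in Python (our own illustration; `Ref` and `knot` are invented names) shows how a mutable cell yields general recursion without any recursive definition:

```python
# "Landin's knot": building a fixed point through a reference cell,
# with no recursive definitions and no Y combinator in the source.

class Ref:
    """A one-element mutable cell, standing in for an ML-style `ref`."""
    def __init__(self, v):
        self.contents = v

def knot(f):
    # f : (A -> B) -> (A -> B). In a typed calculus, `cell` would have
    # type ref(A -> B), which is why strong normalization fails there.
    cell = Ref(None)                        # allocate, initially empty
    self_call = lambda x: cell.contents(x)  # dereference at call time
    cell.contents = f(self_call)            # back-patch: tie the knot
    return self_call

fact = knot(lambda self: lambda n: 1 if n == 0 else n * self(n - 1))
assert fact(5) == 120
```

Since `self_call` reads the cell only when invoked, the back-patch completes before any recursive call happens; this is exactly the loop that breaks strong normalization once references are added.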
A Compositional Logic for Polymorphic Higher-Order Functions
 PPDP'04
, 2004
Cited by 28 (11 self)
This paper introduces a compositional program logic for higher-order polymorphic functions and standard data types. The logic enables us to reason about observable properties of polymorphic programs starting from those of their constituents. Just as types attached to programs offer information on their composability so as to guarantee basic safety of composite programs, formulae of the proposed logic attached to programs offer information on their composability so as to guarantee fine-grained behavioural properties of polymorphic programs. The central feature of the logic is a systematic usage of names and operations on them, whose origin is in the logics for typed π-calculi. The paper introduces the program logic and its proof rules and illustrates their usage by nontrivial reasoning examples, taking a prototypical call-by-value functional language with impredicative polymorphism and recursive types as a target language.
TinkerType: a language for playing with formal systems
, 2003
Cited by 26 (0 self)
TinkerType is a pragmatic framework for compact and modular description of formal systems (type systems, operational semantics, logics, etc.). A family of related systems is broken down into a set of clauses – individual inference rules – and a set of features controlling the inclusion of clauses in particular systems. Simple static checks are used to help maintain consistency of the generated systems. We present TinkerType and its implementation and describe its application to two substantial repositories of typed lambda-calculi. The first repository covers a broad range of typing features, including subtyping, polymorphism, type operators and kinding, computational effects, and dependent types. It describes both declarative and algorithmic aspects of the systems, and can be used with our tool, the TinkerType Assembler, to generate calculi either in the form of typeset collections of inference rules or as executable ML type checkers. The second repository addresses a smaller collection of systems, and provides modularized proofs of basic safety properties.
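The clauses-plus-features idea in this abstract can be sketched in a few lines. The toy below is our own illustration, not the TinkerType Assembler's actual format: each rule clause is tagged with the features it requires, and a system is assembled by selecting every clause whose requirements are contained in the chosen feature set.

```python
# Toy "clauses controlled by features" assembler.
# Each entry: (rule name, features it needs, informal statement).
CLAUSES = [
    ("T-Var",  frozenset(),              "x:T in G  =>  G |- x : T"),
    ("T-Abs",  frozenset(),              "G,x:T1 |- t : T2  =>  G |- \\x:T1.t : T1->T2"),
    ("T-Sub",  frozenset({"subtyping"}), "G |- t : S and S <: T  =>  G |- t : T"),
    ("T-TAbs", frozenset({"poly"}),      "G,X |- t : T  =>  G |- \\X.t : All X.T"),
]

def assemble(features):
    # Include a clause iff all the features it needs were requested.
    features = frozenset(features)
    return [name for name, needs, _ in CLAUSES if needs <= features]

assert assemble([]) == ["T-Var", "T-Abs"]                 # simply typed core
assert "T-Sub" in assemble(["subtyping"])                 # adds subsumption
assert assemble(["subtyping", "poly"]) == ["T-Var", "T-Abs", "T-Sub", "T-TAbs"]
```

The "simple static checks" the abstract mentions would sit on top of this: e.g., checking that every symbol a selected clause mentions is defined by some other selected clause.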
Semantics-Based Compiling: A Case Study in Type-Directed Partial Evaluation
 In Eighth International Symposium on Programming Language Implementation and Logic Programming
Cited by 24 (8 self)
We illustrate a simple and effective solution to semantics-based compiling. Our solution is based on "type-directed partial evaluation", and
- our compiler generator is expressed in a few lines, and is efficient;
- its input is a well-typed, purely functional definitional interpreter in the style of denotational semantics;
- the output of the generated compiler is effectively three-address code, in the fashion and efficiency of the Dragon Book;
- the generated compiler processes several hundred lines of source code per second.
The source language considered in this case study is imperative, block-structured, higher-order, call-by-value, allows subtyping, and obeys stack discipline. It is bigger than what is usually reported in the literature on semantics-based compiling and partial evaluation. Our compiling technique uses the first Futamura projection, i.e., we compile programs by specializing a definitional interpreter with respect to the program. Specialization is carri...
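A drastically simplified sketch of the idea behind the first Futamura projection (our own toy, not the paper's technique: real type-directed partial evaluation produces residual code, whereas this version merely specializes an interpreter into a closure so that all dispatch on syntax happens once, up front):

```python
# Expressions: ("lit", n) | ("var", name) | ("add", e1, e2)

def interpret(e, env):
    # Definitional interpreter: dispatches on syntax at every run.
    if e[0] == "lit":
        return e[1]
    if e[0] == "var":
        return env[e[1]]
    return interpret(e[1], env) + interpret(e[2], env)

def compile_(e):
    # "Specialized interpreter": dispatches on syntax once, at compile
    # time, and returns a closure containing no syntax at all.
    if e[0] == "lit":
        n = e[1]
        return lambda env: n
    if e[0] == "var":
        x = e[1]
        return lambda env: env[x]
    f1, f2 = compile_(e[1]), compile_(e[2])
    return lambda env: f1(env) + f2(env)

prog = ("add", ("var", "x"), ("add", ("lit", 1), ("var", "x")))
code = compile_(prog)                      # specialization happens here, once
assert interpret(prog, {"x": 10}) == code({"x": 10}) == 21
```

The correctness criterion is exactly the Futamura equation: for every program `p` and input `env`, `compile_(p)(env) == interpret(p, env)`.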
A Trace Model for Pointers and Objects
 In Proc. 13th ECOOP, volume 1628 of LNCS
, 1999
Cited by 24 (2 self)
Object-oriented programs [Dahl, Goldberg, Meyer] are notoriously prone to the following kinds of error, which could lead to increasingly severe problems in the presence of tasking:
1. Following a null pointer
2. Deletion of an accessible object
3. Failure to delete an inaccessible object
4. Interference due to equality of pointers
5. Inhibition of optimisation due to fear of (4)
Type disciplines and object classes are a great help in avoiding these errors. Stronger protection may be obtainable with the help of assertions, particularly invariants, which are intended to be true before and after each call of a method that updates the structure of the heap. This note introduces a mathematical model and language for the formulation of assertions about objects and pointers, and suggests that a graphical calculus [Curtis, Lowe] may help in reasoning about program correctness. It deals with both garbage-collected heaps and the other kind. The theory is based on a trace model of graphs, using ideas from process algebra; and our development seeks to exploit this analogy as a unifying principle.
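The first three error kinds in this abstract are all reachability questions over the heap graph. A small sketch (our own illustration, not the paper's trace model) represents the heap as an adjacency map and computes the set of objects reachable from the roots:

```python
# Heap as a graph: {object: [pointed-to objects, or None for null fields]}.

def reachable(heap, roots):
    """Objects reachable from the roots by following non-null pointers."""
    seen, stack = set(), list(roots)
    while stack:
        o = stack.pop()
        if o is None or o in seen:   # skip null fields and visited objects
            continue
        seen.add(o)
        stack.extend(heap.get(o, []))
    return seen

heap = {"a": ["b", None], "b": ["c"], "c": [], "d": ["c"]}
live = reachable(heap, roots={"a"})

assert live == {"a", "b", "c"}
assert None in heap["a"]    # error 1: a null field, unsafe to follow
assert "b" in live          # error 2: deleting "b" would dangle a live pointer
assert "d" not in live      # error 3: "d" is garbage and should be deleted
```

Errors 4 and 5 (pointer aliasing and the optimisations it inhibits) are about *equality* of paths into this graph rather than reachability, which is where the paper's trace model does the real work.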