Results 1–10 of 25
Games and Full Abstraction for the Lazy Lambda Calculus
 In Proceedings, Tenth Annual IEEE Symposium on Logic in Computer Science, 1995
Abstract

Cited by 149 (9 self)
Samson Abramsky and Guy McCusker, Department of Computing, Imperial College of Science, Technology and Medicine, 180 Queen's Gate, London SW7 2BZ, United Kingdom. We define a category of games G, and its extensional quotient E. A model of the lazy λ-calculus, a type-free functional language based on evaluation to weak head normal form, is given in G, yielding an extensional model in E. This model is shown to be fully abstract with respect to applicative simulation. This is, so far as we know, the first purely semantic construction of a fully abstract model for a reflexively-typed sequential language. 1 Introduction. Full Abstraction is a key concept in programming language semantics [9, 12, 23, 26]. The ingredients are as follows. We are given a language L, with an 'observational preorder' ⊑ on terms in L, such that P ⊑ Q means that every observable property of P is also satisfied by Q; and a denotational model M⟦·⟧. The model M is then said to be fully ...
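The phrase "evaluation to weak head normal form" can be made concrete with a tiny interpreter. The sketch below is our own illustration in Python, not the paper's game model: it reduces a λ-term only at the head, never under a binder and never forcing an argument before substituting it, which is the operational core of the lazy λ-calculus.

```python
# Terms: ("var", x) | ("lam", x, body) | ("app", f, a)

def subst(term, x, value):
    """Capture-naive substitution of value for variable x (fine for closed examples)."""
    tag = term[0]
    if tag == "var":
        return value if term[1] == x else term
    if tag == "lam":
        return term if term[1] == x else ("lam", term[1], subst(term[2], x, value))
    return ("app", subst(term[1], x, value), subst(term[2], x, value))

def whnf(term):
    """Reduce to weak head normal form: work only on the head of applications,
    never under a lambda, and never evaluate an argument first (laziness)."""
    while term[0] == "app":
        f = whnf(term[1])
        if f[0] != "lam":
            return ("app", f, term[2])  # stuck application: head is not a lambda
        term = subst(f[2], f[1], term[2])
    return term

I = ("lam", "z", ("var", "z"))                    # identity
K = ("lam", "x", ("lam", "y", ("var", "x")))      # constant combinator
print(whnf(("app", K, I)))  # stops at the outer lambda: ("lam", "y", I)
```

Because arguments are substituted unevaluated, `whnf(("app", ("app", K, I), t))` returns `I` even when `t` is a divergent term, which is exactly the behaviour "lazy" refers to here.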
Programming Languages and Dimensions, 1996
Abstract

Cited by 42 (3 self)
Scientists and engineers must ensure that the equations and formulae which they use are dimensionally consistent, but existing programming languages treat all numeric values as dimensionless. This thesis investigates the extension of programming languages to support the notion of physical dimension. A type system is presented similar to that of the programming language ML but extended with polymorphic dimension types. An algorithm which infers most general dimension types automatically is then described and proved correct. The semantics of the language is given by a translation into an explicitly-typed language in which dimensions are passed as arguments to functions. The operational semantics of this language is specified in the usual way by an evaluation relation defined by a set of rules. This is used to show that if a program is well-typed then no dimension errors can occur during its evaluation. More abstract properties of the language are investigated using a denotational semantics: these include a notion of invariance under changes in the units of measure used, analogous to parametricity in the polymorphic lambda calculus. Finally the dissertation is summarised and many possible directions for future research in dimension types and related type systems are described.
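The thesis develops a static, ML-style type system with inferred dimension types. As a rough runtime illustration of the dimensional consistency such a system enforces, here is a hypothetical Quantity class in Python; the class and its encoding of dimensions as exponent maps are our sketch, not the thesis's system, and checking happens at run time rather than at type-checking time.

```python
class Quantity:
    """A value tagged with dimension exponents, e.g. {"m": 1, "s": -2} for acceleration."""

    def __init__(self, value, dims):
        self.value = value
        self.dims = dict(dims)

    def _norm(self):
        # Drop zero exponents so dimensionally equal quantities compare equal.
        return {d: e for d, e in self.dims.items() if e != 0}

    def __add__(self, other):
        # Addition is only dimensionally consistent between like quantities.
        if self._norm() != other._norm():
            raise TypeError("dimension mismatch in addition")
        return Quantity(self.value + other.value, self.dims)

    def __mul__(self, other):
        # Multiplication adds dimension exponents.
        dims = dict(self.dims)
        for d, e in other.dims.items():
            dims[d] = dims.get(d, 0) + e
        return Quantity(self.value * other.value, dims)

distance = Quantity(10.0, {"m": 1})
time = Quantity(2.0, {"s": 1})
speed = Quantity(5.0, {"m": 1, "s": -1})

print((speed * time)._norm())   # {"m": 1}: same dimension as distance
try:
    distance + time             # metres plus seconds is rejected
except TypeError as e:
    print("rejected:", e)
```

In the thesis's system this mismatch would be a compile-time type error with no run-time tags at all, which is precisely what the dimension-erasing translation is about.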
Games and full abstraction for nondeterministic languages, 1999
Abstract

Cited by 36 (3 self)
Nondeterminism is a pervasive phenomenon in computation. Often it arises as an emergent property of a complex system, typically as the result of contention for access to shared resources. In such circumstances, we cannot always know, in advance, exactly what will happen. In other circumstances, nondeterminism is explicitly introduced as a means of abstracting away from implementation details such as precise command scheduling and control flow. However, the kind of behaviours exhibited by nondeterministic computations can be extremely subtle in comparison to those of their deterministic counterparts and reasoning about such programs is notoriously tricky as a result. It is therefore important to develop semantic tools to improve our understanding of, and aid our reasoning about, such nondeterministic programs. In this thesis, we extend the framework of game semantics to encompass nondeterministic computation. Game semantics is a relatively recent development in denotational semantics; its main novelty is that it views a computation not as a static entity, but rather as a dynamic process of interaction. This perspective makes the theory well-suited to modelling many aspects of computational processes: the original use of game semantics in modelling the simple functional language PCF has subsequently been extended to handle more complex control structures such as references and continuations.
A Tutorial on Coinduction and Functional Programming
 In Glasgow Functional Programming Workshop, 1994
Abstract

Cited by 30 (1 self)
Coinduction is an important tool for reasoning about unbounded structures. This tutorial explains the foundations of coinduction, and shows how it justifies intuitive arguments about lazy streams, of central importance to lazy functional programmers. We explain from first principles a theory based on a new formulation of bisimilarity for functional programs, which coincides exactly with Morris-style contextual equivalence. We show how to prove properties of lazy streams by coinduction and derive Bird and Wadler's Take Lemma, a well-known proof technique for lazy streams.
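Bird and Wadler's Take Lemma says that two lazy streams are equal iff their n-element prefixes agree for every n. A small Python sketch, with generators standing in for lazy streams and helper names of our own choosing, can at least check finitely many prefixes; that is evidence in the spirit of the lemma, not the coinductive proof the tutorial develops.

```python
from itertools import islice, count

def nats():
    """The lazy stream 0, 1, 2, ... as a generator."""
    n = 0
    while True:
        yield n
        n += 1

def map_stream(f, s):
    """Map a function lazily over a stream."""
    for x in s:
        yield f(x)

def take(n, stream_fn):
    """First n elements; streams are passed as thunks so they can be restarted."""
    return list(islice(stream_fn(), n))

# Two different definitions of the stream of even numbers:
evens_a = lambda: map_stream(lambda n: 2 * n, nats())
evens_b = lambda: (n for n in count() if n % 2 == 0)

# Take Lemma, finitely approximated: the prefixes agree for every n we try.
assert all(take(n, evens_a) == take(n, evens_b) for n in range(100))
print(take(5, evens_a))  # [0, 2, 4, 6, 8]
```

The coinductive proof closes the gap between "every n we tried" and "every n": a bisimulation between the two stream definitions equates them outright.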
Basic Observables for Processes
 Information and Computation, 1999
Abstract

Cited by 24 (7 self)
A general approach is examined for defining behavioural preorders over process terms as the maximal precongruences induced by basic observables. Three different observables will be considered, which provide information about the initial communication capabilities of processes and about the possibility that processes get engaged in divergent computations. We show that the precongruences induced by our basic observables coincide with intuitive and/or widely studied behavioural preorders. In particular, we retrieve in our setting the must preorder of De Nicola and Hennessy and the fair/should preorder introduced by Cleaveland and Natarajan and by Brinksma, Rensink and Vogler. A new form of testing preorder, which we call safe-must, also emerges. The alternative characterizations we offer shed light on the differences between these preorders, and on the role played in their definition by tests for divergence. 1 Introduction. In the classical theory of functional programming, the point ...
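To make "basic observables" concrete, the following Python sketch (our illustration, not the paper's definitions) models a process as a labelled transition system and computes two observables in the same spirit: the visible actions available after internal moves, and whether the process can get engaged in a divergent, i.e. infinite internal, computation.

```python
# An LTS as {state: [(action, next_state), ...]}, with "tau" for internal moves.

def observables(lts, state, seen=None):
    """Visible actions reachable after any number of internal (tau) steps."""
    seen = set() if seen is None else seen
    if state in seen:
        return set()
    seen.add(state)
    acts = set()
    for action, nxt in lts.get(state, []):
        if action == "tau":
            acts |= observables(lts, nxt, seen)   # look through internal moves
        else:
            acts.add(action)
    return acts

def can_diverge(lts, state, seen=None):
    """True if an infinite sequence of tau steps is possible (finite LTS: a tau-cycle)."""
    seen = set() if seen is None else seen
    if state in seen:
        return True  # revisited a state along a tau path: a tau-cycle exists
    for action, nxt in lts.get(state, []):
        if action == "tau" and can_diverge(lts, nxt, seen | {state}):
            return True
    return False

lts = {"p": [("tau", "q"), ("a", "done")],
       "q": [("b", "done"), ("tau", "p")]}
print(observables(lts, "p"))   # {"a", "b"}
print(can_diverge(lts, "p"))   # True: p -tau-> q -tau-> p -tau-> ...
```

The paper's preorders then arise as the largest precongruences that respect observables like these; whether and how divergence is tested is exactly what separates must, fair/should and safe-must.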
Böhm-Like Trees for Rewriting
Abstract

Cited by 17 (0 self)
The work in this thesis has been carried out under the auspices of the research school IPA (Institute for Programming research and Algorithmics). Vrije Universiteit. Böhm-Like Trees for Rewriting. Academic dissertation for the award of the degree of Doctor at the Vrije Universiteit Amsterdam, by authority of the rector magnificus prof.dr. T. Sminia, to be defended in public before the examination committee of the Faculty of Exact Sciences on Monday 20 March 2006 at 15.45 in the auditorium of the university, De Boelelaan 1105, by ...
Programming Language Semantics
 In CRC Handbook of Computer Science, 1995
Abstract

Cited by 15 (0 self)
... interpretation provides the theory that allows a compiler writer to prove the correctness of compilers. Finally, axiomatic semantics is a long-standing fundamental technique for validating the correctness of computer code. Recent emphasis on large-scale and safety-critical systems has again placed the spotlight on this technique. Current research on data type theory [5] suggests that a marriage between the techniques of data-type checking and axiomatic semantics is not far in the future. 4 Research Issues in Semantics. The techniques in this chapter have proved highly successful for defining, improving, and implementing traditional, sequential programming languages. But new language paradigms present new challenges to the semantics methods. In the functional programming paradigm, a higher-order functional language can use functions as arguments to other functions. This makes the language's domains more complex than those in Figure 2. Denotational semantics can be used to understand the ...
A stable programming language
 Information and Computation
Abstract

Cited by 10 (2 self)
It is well-known that stable models (such as dI-domains, qualitative domains and coherence spaces) are not fully abstract for the language PCF. This fact is related to the existence of stable parallel functions and of stable functions that are not monotone with respect to the extensional order, which cannot be defined by programs of PCF. In this paper, a paradigmatic programming language named StPCF is proposed, which extends the language PCF with two additional operators. The operational description of the extended language is presented in an effective way, although the evaluation of one of the new operators cannot be formalized in a PCF-like rewrite system. Since StPCF can define all finite cliques of coherence spaces, the above gap with stable models is filled; consequently stable models are fully abstract for the extended language.
Models of Lambda Calculi and Linear Logic: Structural, Equational and Proof-Theoretic Characterisations, 1994
Abstract

Cited by 10 (0 self)
Models of Lambda Calculi and Linear Logic: Structural, Equational and Proof-Theoretic Characterisations. Ralph Loader, of St. Hugh's College, Oxford. Thesis submitted for the Degree of D.Phil., Michaelmas term, 1994. This thesis is an investigation into models of typed λ-calculi and of linear logic. The models we investigate are denotational in nature; we construct various categories, in which types (or formulae) are interpreted by objects, and terms (proofs) by morphisms. The results we investigate compare particular properties of the syntax and the semantics of a calculus, by trying to use syntax to characterise features of a model, or vice versa. There are four chapters in the thesis, one each on linear logic and the simply typed λ-calculus, and two on inductive datatypes. In chapter one, we look at some models of linear logic, and prove a full completeness result for multiplicative linear logic. We form a model, the linear logical predicates, by abstracting a little the structure ...
Probability, Nondeterminism and Concurrency: Two Denotational Models for Probabilistic Computation
 PhD thesis, University of Aarhus, BRICS Dissertation Series, 2003
Abstract

Cited by 9 (1 self)
Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particular there is no categorical distributive law between them. We introduce the powerdomain of indexed valuations which modifies the usual probabilistic powerdomain to take more detailed account of where probabilistic choices are made. We show the existence of a distributive law between the powerdomain of indexed valuations and the nondeterministic powerdomain. By means of an equational theory we give an alternative characterisation of indexed valuations and the distributive law. We study the relation between valuations and indexed valuations. Finally we use indexed valuations to give a semantics to a programming language. This semantics reveals the computational intuition lying behind the mathematics. In the second part of the thesis we provide an operational reading of continuous valuations on certain domains (the distributive concrete domains of Kahn and Plotkin) through the model of probabilistic event structures. Event structures are a model for concurrent computation that accounts for causal relations between events. We propose a way of adding probabilities to confusion-free event structures, defining the notion of probabilistic event structure. This leads to various ideas of a run for probabilistic event structures. We show a confluence theorem for such runs. Configurations of a confusion-free event structure form a distributive concrete domain. We give a representation theorem which characterises completely the powerdomain of valuations of such concrete domains in terms of prob...
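The intuition that indexed valuations "take more detailed account of where probabilistic choices are made" can be sketched in discrete terms; the Python encoding and names below are ours, not the thesis's. Two processes can induce the same plain valuation (distribution over outcomes) while remaining distinct as indexed valuations, because the indices remember which choice produced each outcome.

```python
from collections import defaultdict

def induced_valuation(indexed):
    """Forget the indices: collapse {index: (probability, outcome)} to {outcome: probability}."""
    val = defaultdict(float)
    for _, (p, outcome) in indexed.items():
        val[outcome] += p
    return dict(val)

# Two coins that both deliver outcome "a" with total probability 1,
# but via differently indexed probabilistic choices:
coin1 = {("flip1", "heads"): (0.5, "a"), ("flip1", "tails"): (0.5, "a")}
coin2 = {("flip2", "result"): (1.0, "a")}

print(induced_valuation(coin1))  # {"a": 1.0}
print(induced_valuation(coin2))  # {"a": 1.0}
print(coin1 == coin2)            # False: the indices record distinct choice points
```

Keeping this extra information is what allows the indexed construction to interact well (via a distributive law) with the nondeterministic powerdomain, where the plain probabilistic powerdomain does not.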