Results 1–10 of 11
Domain Theory in Logical Form
 Annals of Pure and Applied Logic
, 1991
Abstract

Cited by 252 (10 self)
The mathematical framework of Stone duality is used to synthesize a number of hitherto separate developments in Theoretical Computer Science: • Domain Theory, the mathematical theory of computation introduced by Scott as a foundation for denotational semantics. • The theory of concurrency and systems behaviour developed by Milner, Hennessy et al. based on operational semantics. • Logics of programs. Stone duality provides a junction between semantics (spaces of points = denotations of computational processes) and logics (lattices of properties of processes). Moreover, the underlying logic is geometric, which can be computationally interpreted as the logic of observable properties—i.e. properties which can be determined to hold of a process on the basis of a finite amount of information about its execution. These ideas lead to the following programme:
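The junction the abstract describes, points of a space on one side and a lattice of properties on the other, has a concrete finite instance: a finite set can be recovered from its powerset Boolean algebra as the set of ultrafilters. The following is a minimal brute-force Python sketch of that finite case (it is not from the paper, and the tiny three-point space is an illustrative assumption):

```python
from itertools import combinations

def powerset(xs):
    xs = list(xs)
    return [frozenset(c) for r in range(len(xs) + 1) for c in combinations(xs, r)]

def is_ultrafilter(u, algebra, top):
    """Check the ultrafilter axioms for a candidate subset u of the algebra."""
    u = set(u)
    if frozenset() in u or frozenset(top) not in u:
        return False
    for a in u:
        for b in u:
            if a & b not in u:                      # closed under meets
                return False
        for b in algebra:
            if a <= b and b not in u:               # upward closed
                return False
    # "ultra": contains exactly one of each set and its complement
    return all((a in u) != ((frozenset(top) - a) in u) for a in algebra)

top = frozenset({0, 1, 2})
algebra = powerset(top)

# Brute force over every subset of the (tiny) algebra.
ufs = [frozenset(u)
       for r in range(len(algebra) + 1)
       for u in combinations(algebra, r)
       if is_ultrafilter(u, algebra, top)]

# Each ultrafilter is principal; its smallest member is a singleton, i.e. a point.
points = [min(u, key=len) for u in ufs]
print(sorted(sorted(p) for p in points))  # -> [[0], [1], [2]]
```

The three ultrafilters found are exactly the three points of the space, which is the finite shadow of the duality between spaces of points and lattices of properties.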
Linearity, Sharing and State: a fully abstract game semantics for Idealized Algol with active expressions
 Algol-like Languages
, 1997
Abstract

Cited by 130 (21 self)
The manipulation of objects with state which changes over time is all-pervasive in computing. Perhaps the simplest example of such objects are the program variables of classical imperative languages. An important strand of work within the study of such languages, pioneered by John Reynolds, focusses on "Idealized Algol", an elegant synthesis of imperative and functional features. We present a novel semantics for Idealized Algol using games, which is quite unlike traditional denotational models of state. The model takes into account the irreversibility of changes in state, and makes explicit the difference between copying and sharing of entities. As a formal measure of the accuracy of our model, we obtain a full abstraction theorem for Idealized Algol with active expressions.
The Sheaf-Theoretic Structure of Non-Locality and Contextuality
, 2011
Abstract

Cited by 36 (11 self)
Locality and non-contextuality are intuitively appealing features of classical physics, which are contradicted by quantum mechanics. The goal of the classic no-go theorems by Bell, Kochen-Specker, et al. is to show that non-locality and contextuality are necessary features of any theory whose predictions agree with those of quantum mechanics. We use the mathematics of sheaf theory to analyze the structure of non-locality and contextuality in a very general setting. Starting from a simple experimental scenario, and the kind of probabilistic models familiar from discussions of Bell’s theorem, we show that there is a very direct, compelling formalization of these notions in sheaf-theoretic terms. Moreover, on the basis of this formulation, we show that the phenomena of non-locality and contextuality can be characterized precisely in terms of obstructions to the existence of global sections. We give linear algebraic methods for computing these obstructions, and use these methods to obtain a number of new insights into non-locality and contextuality. For example, we distinguish a proper hierarchy of strengths of no-go theorems, and show that three leading examples — due to Bell, Hardy, and Greenberger, Horne and Zeilinger, respectively — occupy successively higher levels of this hierarchy. We show how our abstract setting can be represented in quantum mechanics. In doing so, we uncover a strengthening of the usual no-signalling theorem, which shows that quantum mechanics obeys no-signalling for arbitrary families of commuting observables, not just those represented on different factors of a tensor product.
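The "obstructions to global sections" idea admits a very direct brute-force illustration (this sketch is not the paper's linear-algebraic method): for a possibilistic empirical model over a (2,2,2) Bell scenario, simply enumerate all global outcome assignments and test whether any restricts to a possible outcome in every measurement context. For the PR box none does, which is exactly the obstruction:

```python
from itertools import product

# Bell-type scenario: Alice chooses a or a1, Bob chooses b or b1; outcomes are 0/1.
contexts = [("a", "b"), ("a", "b1"), ("a1", "b"), ("a1", "b1")]

# Possibilistic PR-box model: outcomes agree, except anti-correlated at (a1, b1).
support = {
    ("a", "b"):   {(0, 0), (1, 1)},
    ("a", "b1"):  {(0, 0), (1, 1)},
    ("a1", "b"):  {(0, 0), (1, 1)},
    ("a1", "b1"): {(0, 1), (1, 0)},
}

def global_sections(support):
    """All global assignments whose restriction to every context is possible."""
    meas = sorted({m for ctx in support for m in ctx})
    secs = []
    for vals in product([0, 1], repeat=len(meas)):
        g = dict(zip(meas, vals))
        if all((g[m1], g[m2]) in outs for (m1, m2), outs in support.items()):
            secs.append(g)
    return secs

print(len(global_sections(support)))  # -> 0: no global section exists
```

A classical model (every joint outcome possible in every context) has plenty of global sections; the count dropping to zero for the PR box is the combinatorial shadow of its strong contextuality.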
Consistency of the Theory of Contexts
, 2001
Abstract

Cited by 15 (3 self)
The Theory of Contexts is a type-theoretic axiomatization which has been recently proposed by some of the authors for giving a metalogical account of the fundamental notions of variable and context as they appear in Higher Order Abstract Syntax. In this paper, we prove that this theory is consistent by building a model based on functor categories. By means of a suitable notion of forcing, we prove that this model validates Classical Higher Order Logic, the Theory of Contexts, and also (parametrised) structural induction and recursion principles over contexts. The approach we present in full detail should be useful also for reasoning on other models based on functor categories. Moreover, the construction could be adopted, and possibly generalized, also for validating other theories of names and binders.
Contents
1 The object language 4
2 The metalanguage (Framework System #) 6
2.1 Syntax 6
2.2 Typing and logical judgements 7
2.3 Adequacy of the encoding 8
2.4 Remarks on the design of # 9
3 Category-theoretic preliminaries 11
4.1 The ambient categories
4.2 Interpreting types 16
4.3 Interpreting environments 18
4.4 Interpreting the typing judgement of terms 19
4.5 Interpreting logical judgements 21
… is a model of # 22
5.1 Forcing 22
5.2 Characterisation of Leibniz equality 23
… models logical axioms and rules 26
… models the Theory of Contexts 27
6 Recursion 28
6.1 First-order recursion 28
6.2 Higher-order recursion 31
7 Induction 33
7.1 First-order induction 34
7.2 Higher-order induction 37
8 Connections with tripos theory 38
9 Related work 41
9.1 Semantics based on functor categories 41
9.2 Logics for nominal calculi 44
10 Conclusions 45
A Proofs 46
A.1 Proof of Proposition 4.2 46
A.2 Proof of Proposition 4.3 47
A.3 Proof of Theorem 5.1 48
A.4 Proof of...
Semantical Analysis of Higher-Order Syntax
 In 14th Annual Symposium on Logic in Computer Science
, 1999
Abstract

Cited by 3 (0 self)
This paper advocates the use of functor categories as a semantic foundation for higher-order abstract syntax (HOAS). By way of example, we will show how functor categories can be used for at least the following applications:
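One way to make the functor-category view of syntax concrete is the presheaf of well-scoped lambda-terms: terms are indexed by the number of free variables, and renamings between contexts act functorially on them. The following Python sketch is a minimal illustration of that idea and is not from the paper; the tuple encoding and the name `rename` are assumptions:

```python
# Well-scoped terms over n free variables, using de Bruijn indices 0..n-1;
# `rename` transports a term along a renaming rho : n -> m (the functorial action).

def var(i):    return ("var", i)
def lam(body): return ("lam", body)      # binds index 0 in body
def app(f, a): return ("app", f, a)

def rename(term, rho):
    tag = term[0]
    if tag == "var":
        return ("var", rho(term[1]))
    if tag == "lam":
        # lift rho under the binder: index 0 is the bound variable and stays put
        return ("lam", rename(term[1], lambda i: 0 if i == 0 else rho(i - 1) + 1))
    return ("app", rename(term[1], rho), rename(term[2], rho))

# Weakening: move a term into a context with one extra variable.
weakened = rename(app(var(0), lam(var(1))), lambda i: i + 1)
print(weakened)  # -> ('app', ('var', 1), ('lam', ('var', 2)))
```

Renaming by the identity leaves terms unchanged and renamings compose, which is exactly what makes terms a functor over the category of contexts and renamings.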
A Denotational Approach to Measuring Complexity in Functional Programs
, 2003
Denotational Semantics Using an Operationally-Based Term Model
 In Proc. 24th ACM Symposium on Principles of Programming Languages
, 1997
Abstract

Cited by 2 (0 self)
We introduce a method for proving the correctness of transformations of programs in languages like Scheme and ML. The method consists of giving the programs a denotational semantics in an operationally-based term model in which interaction is the basic observable, and showing that the transformation is meaning-preserving. This allows us to consider correctness for programs that interact with their environment without terminating, and also for transformations that change the internal store behavior of the program. We illustrate the technique on one of the Meyer-Sieber examples, and we use it to prove the correctness of assignment elimination for Scheme. The latter is an important but subtle step for Scheme compilers; we believe ours is the first proof of its correctness.
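Assignment elimination itself can be sketched as a source-to-source rewrite that routes mutated variables through one-cell boxes. The toy Python sketch below is only an illustration of the general idea, not the paper's transformation: the tuple-encoded constructors (`set`, `box`, `unbox`, `set-box`, `let`) and the `_arg` renaming scheme are assumptions, and shadowing and variable capture are deliberately ignored:

```python
def convert(expr, mutated):
    """Assignment elimination sketch: route mutated variables through boxes."""
    tag = expr[0]
    if tag == "var":
        # a read of a mutated variable becomes a box read
        return ("unbox", expr) if expr[1] in mutated else expr
    if tag == "set":                      # (set! x e)
        _, x, rhs = expr
        return ("set-box", ("var", x), convert(rhs, mutated))
    if tag == "lam":                      # (lambda (x) body)
        _, x, body = expr
        body = convert(body, mutated)
        if x in mutated:
            # take the argument under a fresh name, rebind x to a box holding it
            return ("lam", x + "_arg", ("let", x, ("box", ("var", x + "_arg")), body))
        return ("lam", x, body)
    if tag == "app":
        return ("app", convert(expr[1], mutated), convert(expr[2], mutated))
    return expr                           # constants etc. pass through

before = ("lam", "x", ("set", "x", ("var", "y")))
after = convert(before, {"x"})
```

After the rewrite no variable is ever assigned; all mutation lives in boxes, which is the shape compilers want and the shape whose correctness the paper's term-model semantics is used to prove.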
A Unified Sheaf-Theoretic Account of Non-Locality and Contextuality
, 2011
Abstract

Cited by 1 (0 self)
A number of landmark results in the foundations of quantum mechanics show that quantum systems exhibit behaviour that defies explanation in classical terms, and that cannot be accounted for in such terms even by postulating “hidden variables” as additional unobserved factors. Much has been written on these matters, but there is surprisingly little unanimity even on basic definitions or the interrelationships among the various concepts and results. We use the mathematical language of sheaves and monads to give a very general and mathematically robust description of the behaviour of systems in which one or more measurements can be selected, and one or more outcomes observed. We say that an empirical model is extendable if it can be extended consistently to all sets of measurements, regardless of compatibility. A hidden-variable model is factorizable if, for each value of the hidden variable, it factors as a product of distributions on the basic measurements. We prove that an empirical model is extendable if and only if there is a factorizable hidden-variable model which realizes it. From this we are able to prove generalized versions of well-known No-Go theorems. At the conceptual level, our equivalence result says that the existence of incompatible measurements is the essential ingredient in non-local and contextual behavior in quantum mechanics.
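The easy direction of the equivalence stated above can be checked mechanically: a factorizable hidden-variable model built by mixing deterministic global assignments induces an empirical model whose context marginals automatically agree, i.e. it is no-signalling and extendable. A small Python sketch under those assumptions (not from the paper; uniform mixture, two measurements per party):

```python
from itertools import product
from fractions import Fraction

# A hidden variable here is a deterministic assignment to all four measurements.
meas = ["a", "a1", "b", "b1"]
assignments = [dict(zip(meas, vals)) for vals in product([0, 1], repeat=4)]

# Factorizable hidden-variable model: the uniform mixture over all assignments.
weight = [Fraction(1, len(assignments))] * len(assignments)

contexts = [("a", "b"), ("a", "b1"), ("a1", "b"), ("a1", "b1")]

def empirical(weight):
    """Empirical model induced by mixing the deterministic assignments."""
    table = {}
    for ctx in contexts:
        dist = {}
        for w, g in zip(weight, assignments):
            outcome = tuple(g[m] for m in ctx)
            dist[outcome] = dist.get(outcome, Fraction(0)) + w
        table[ctx] = dist
    return table

table = empirical(weight)

# No-signalling falls out: Alice's marginal at "a" ignores Bob's choice of b vs b1.
marg_b  = sum(p for (x, _), p in table[("a", "b")].items() if x == 0)
marg_b1 = sum(p for (x, _), p in table[("a", "b1")].items() if x == 0)
print(marg_b == marg_b1)  # -> True
```

The hard direction, that every extendable empirical model arises this way, is the substance of the paper's theorem and is not shown here.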