Results 1–10 of 3,255
Coinductive Characterisations Reveal Nice Relations Between Preorders and Equivalences
, 2008
Abstract: There are two ways to define a semantics for process algebras: either directly by means of an equivalence relation or by means of a preorder whose kernel is the desired equivalence. We are interested in the relationship between these two presentations. Using our characterisation of the behaviour pre…
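The coinductive view the entry above refers to casts bisimilarity as the greatest fixpoint of a refinement operator. A minimal sketch (not from the cited paper; the example LTS and all names are illustrative) computes it on a finite labelled transition system by starting from the full relation and discarding pairs that fail the bisimulation clauses:

```python
# Hypothetical example LTS: state -> list of (label, successor) pairs.
LTS = {
    "p": [("a", "p1")],
    "p1": [("b", "p")],
    "q": [("a", "q1")],
    "q1": [("b", "q")],
    "r": [("a", "r1")],
    "r1": [("c", "r")],
}

def bisimilarity(lts):
    states = list(lts)
    # Start from the full relation and strip pairs violating the
    # bisimulation clauses until stable: the greatest fixpoint.
    rel = {(s, t) for s in states for t in states}
    changed = True
    while changed:
        changed = False
        for (s, t) in list(rel):
            # Every move of s must be matched by t, and vice versa,
            # with the successor pair still in the relation.
            forward = all(any(lab2 == lab and (s2, t2) in rel
                              for (lab2, t2) in lts[t])
                          for (lab, s2) in lts[s])
            backward = all(any(lab2 == lab and (s2, t2) in rel
                               for (lab2, s2) in lts[s])
                           for (lab, t2) in lts[t])
            if not (forward and backward):
                rel.discard((s, t))
                changed = True
    return rel

rel = bisimilarity(LTS)
print(("p", "q") in rel)  # True: p and q match step for step
print(("p", "r") in rel)  # False: r diverges on the second label
```

The preorder variant mentioned in the abstract arises by weakening the two matching clauses asymmetrically; its kernel (pairs related both ways) then recovers the equivalence.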
A calculus for cryptographic protocols: The spi calculus
 Information and Computation
, 1999
Abstract: We introduce the spi calculus, an extension of the pi calculus designed for the description and analysis of cryptographic protocols. We show how to use the spi calculus, particularly for studying authentication protocols. The pi calculus (without extension) suffices for some abstract protocols; the spi calculus enables us to consider cryptographic issues in more detail. We represent protocols as processes in the spi calculus and state their security properties in terms of coarse-grained notions of protocol equivalence.
Cited by 919 (55 self)
A Compositional Approach to Performance Modelling
, 1996
Abstract: Performance modelling is concerned with the capture and analysis of the dynamic behaviour of computer and communication systems. The size and complexity of many modern systems result in large, complex models. A compositional approach decomposes the system into subsystems that are smaller and more ea… These techniques are presented in terms of notions of equivalence between modelling entities. A framewo…
Cited by 746 (102 self)
Coinduction for preordered algebra
Abstract: We develop a combination, called hidden preordered algebra, between preordered algebra, which is an algebraic framework supporting specification and reasoning about transitions, and hidden algebra, which is the algebraic framework for behavioural specification. This combination arises naturally with…
Cited by 2 (0 self)
Domain Theory
 Handbook of Logic in Computer Science
, 1994
Abstract: Least fixpoints as meanings of recursive definitions.
Cited by 546 (25 self)
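The slogan "least fixpoints as meanings of recursive definitions" can be made concrete in a few lines: the meaning of a recursive definition f = F(f) is the least fixpoint of F, the limit of the chain ⊥, F(⊥), F(F(⊥)), … A minimal sketch with an illustrative factorial functional (names are my own, not from the handbook chapter):

```python
def bottom(_):
    # The everywhere-undefined function: the bottom of the chain.
    raise RuntimeError("undefined")

def F(f):
    # One unfolding of the recursive definition of factorial.
    return lambda n: 1 if n == 0 else n * f(n - 1)

# Each iterate F^k(bottom) is defined exactly on inputs 0..k-1;
# the least fixpoint (the limit of the chain) is factorial itself.
approx = bottom
for _ in range(10):
    approx = F(approx)

print(approx(5))  # 120: defined, since 5 needs fewer than 10 unfoldings
```

Applying `approx` to an argument of 10 or more would still raise, reflecting that each finite iterate is only a partial approximation of the fixpoint.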
Generalised Coinduction
, 2001
Abstract: We introduce the lambda-coiteration schema for a distributive law lambda of a functor T over a functor F. Under certain conditions it can be shown to uniquely characterise functions into the carrier of a final F-coalgebra, generalising the basic coiteration schema as given by finality. The duals of…
Cited by 29 (3 self)
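The basic coiteration schema that the entry above generalises is the stream unfold: a coalgebra c : X → A × X uniquely determines a map from X into streams over A. A minimal sketch (illustrative names, not the paper's notation):

```python
from itertools import islice

def coiterate(coalg, seed):
    """Unfold an infinite stream from `seed` via the one-step coalgebra `coalg`."""
    x = seed
    while True:
        head, x = coalg(x)  # observe one element, move to the next state
        yield head

# Example coalgebra: from state n, output n and step to n + 2.
evens = coiterate(lambda n: (n, n + 2), 0)
print(list(islice(evens, 5)))  # [0, 2, 4, 6, 8]
```

Lambda-coiteration extends this by letting the step function use extra algebraic structure (a functor T, mediated by the distributive law) on the state, while finality still guarantees a unique solution.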
Control-Flow Analysis of Higher-Order Languages
, 1991
Keywords: data-flow analysis, Scheme, LISP, ML, CPS, type recovery, higher-order functions, functional programming, optimising compilers, denotational semantics
Abstract: Programs written in powerful, higher-order languages like Scheme, ML, and Common Lisp should run as fast as their FORTRAN and C counterparts. They should, but they don’t. A major reason is the level of optimisation applied to these two classes of languages. Many FORTRAN and C compilers employ an arsenal of sophisticated global optimisations that depend upon data-flow analysis: common-subexpression elimination, loop-invariant detection, induction-variable elimination, and many, many more. Compilers for higher-order languages do not provide these optimisations. Without them, Scheme, LISP and ML compilers are doomed to produce code that runs slower than their FORTRAN and C counterparts. The problem is the lack of an explicit control-flow graph at compile time, something which traditional data-flow analysis techniques require. In this dissertation, I present a technique for recovering the control-flow graph of a Scheme program at compile time. I give examples of how this information can be used to perform several data-flow analysis optimisations, including copy propagation, induction-variable elimination, useless-variable elimination, and type recovery. The analysis is defined in terms of a non-standard semantic interpretation. The denotational semantics is carefully developed, and several theorems establishing the correctness of the semantics and the implementing algorithms are proven.
Cited by 362 (10 self)
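The core problem the dissertation names, the missing compile-time control-flow graph, shows up in any higher-order program. A hypothetical illustration (not from the dissertation; all names are mine):

```python
def double(x):
    return 2 * x

def square(x):
    return x * x

def apply_twice(f, x):
    # Which function does `f(x)` invoke? Syntactically, the call site
    # names only the variable `f`. Without a flow analysis tracking
    # which function values reach `f`, a compiler cannot draw an edge
    # in the control-flow graph here, so classical data-flow
    # optimisations (CSE, copy propagation, ...) cannot fire.
    return f(f(x))

print(apply_twice(double, 3))  # 12
print(apply_twice(square, 3))  # 81
```

A control-flow analysis in the dissertation's spirit approximates, for each call site, the set of functions that may flow there, recovering enough of the graph for the optimisations listed in the abstract.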
On Coinductive Equivalences for Higher-Order Probabilistic Functional Programs
Abstract: We study bisimulation and context equivalence in a probabilistic λ-calculus. The contributions of this paper are threefold. Firstly we show a technique for proving congruence of probabilistic applicative bisimilarity. While the technique follows Howe’s method, some of the technicalities are quite different, relying on non-trivial “disentangling” properties for sets of real numbers. Secondly we show that, while bisimilarity is in general strictly finer than context equivalence, coincidence between the two relations is attained on pure λ-terms. The resulting equality is that induced by Levy…
On Coinduction and Quantum Lambda Calculi
Abstract: In the ubiquitous presence of linear resources in quantum computation, program equivalence in linear contexts, where programs are used or executed once, is more important than in the classical setting. We introduce a linear contextual equivalence and two notions of bisimilarity, a state-based and a distribution-based, as proof techniques for reasoning about higher-order quantum programs. Both notions of bisimilarity are sound with respect to the linear contextual equivalence, but only the distribution-based one turns out to be complete. The completeness proof relies on a characterisation…