Results 11–20 of 25
Specification and Verification I
Abstract

Cited by 2 (0 self)
These lecture notes are for the course entitled Specification and Verification I. Some of the material is derived from previously published sources. Chapters 1–4 introduce classical ideas of specification and proof of programs due to Floyd and Hoare. Chapter 5 is an introduction to program refinement using an approach due to Paul Curzon. Chapter 6 presents higher order logic and Chapter 7 explains how Floyd–Hoare logic can be embedded in higher order logic. The course presents classical ideas on the specification and verification of software. Although much of the material is old – see the dates on some of the cited references – it is still a key foundation for current research. This course is a prerequisite for the Part II course entitled Specification and Verification II, which makes extensive use of higher order logic (see Chapter 6) for specifying and verifying hardware. Learning Guide: These notes contain all the material that will be covered in the course. It should thus not be necessary to consult any textbooks. The copies of transparencies give the contents of the lectures. Note, however, that I sometimes end up going faster or slower than expected, so, for example, material shown in Lecture n might actually get covered in Lecture n+1 or Lecture n−1. The examination questions will be based on material in the lectures. Thus if I end up not covering some topic in the lectures, then I would not expect to set an examination question on it. This course has been fairly stable for several years, so past exam questions are a reasonable guide to the sort of thing I will set this year.
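The Floyd–Hoare style of specification mentioned in this abstract centres on triples of the form {P} C {Q}. A minimal illustrative example, not taken from the notes themselves, is:

```latex
% A Hoare triple \{P\}\,C\,\{Q\} asserts: if precondition P holds and
% command C terminates, then postcondition Q holds afterwards.
\{\, X = x \,\}\;\; X := X + 1 \;\;\{\, X = x + 1 \,\}
% This instance follows from the assignment axiom
%   \{\, Q[E/V] \,\}\; V := E \;\{\, Q \,\}
% by substituting X + 1 for X in the postcondition X = x + 1.
```

Here x is a logical (auxiliary) variable recording the initial value of the program variable X, a standard device in Floyd–Hoare proofs.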
The synthetic Plotkin powerdomain
, 1990
Abstract

Cited by 2 (0 self)
Plotkin [1976] introduced a powerdomain construction on domains in order to give semantics to a nondeterministic binary choice constructor, and later [1979] characterised it as the free semilattice. Smyth [1983] and Winskel [1985] showed that it could be interpreted in terms of modal predicate transformers, and Robinson [1986] recognised it as a special case of Johnstone’s [1982] Vietoris construction, which itself generalises the Hausdorff metric on the set of closed subsets of a metric space. The domain construction involves a curious order relation known as the Egli–Milner order. In this paper we relate the powerdomain directly to the free semilattice, which in a topos is simply the finite powerset, i.e. the object of (Kuratowski-)finite subobjects of an object. We show that the Egli–Milner order coincides (up to “¬¬”) with the intrinsic order induced by a family of “observable predicates.” This problem originally arose in the context of the Effective topos, in which the observable predicates are the recursively enumerable subsets. However, we find that the results of this paper hold for any elementary topos, and so by considering a (pre)sheaf topos (which the Effective topos is not) we may compare them with the classical approach. Important note: much of the credit for the work in this paper is due to Wesley Phoa and Martin Hyland, but I take the blame for its presentation. Comments on it are most welcome. When it is finished it will be submitted as a joint paper with Wesley Phoa, and an announcement will be made on types.
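The Egli–Milner order referred to in the abstract has a standard definition on subsets of a partially ordered set; as a reminder (this formulation is standard, not quoted from the paper):

```latex
% For subsets S, T of a poset (D, \sqsubseteq), the Egli–Milner order is
S \sqsubseteq_{EM} T
  \;\iff\;
  \bigl(\forall s \in S.\ \exists t \in T.\ s \sqsubseteq t\bigr)
  \;\wedge\;
  \bigl(\forall t \in T.\ \exists s \in S.\ s \sqsubseteq t\bigr)
% Every element of S is dominated by some element of T, and every
% element of T dominates some element of S.
```

It is "curious" in the sense mentioned above because it combines the lower (Hoare) and upper (Smyth) half-orders, and need not be antisymmetric on arbitrary subsets.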
Kozen, Mardare, and Panangaden, A Metrized Duality Theorem for Markov Processes
Abstract
We extend our previous duality theorem for Markov processes by equipping the processes with a pseudometric and the algebras with a notion of metric diameter. We are able to show that the isomorphisms of our previous duality theorem become isometries in this quantitative setting. This opens the way to developing theories of approximate reasoning for probabilistic systems.
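For reference, a pseudometric (as used in this abstract) satisfies the metric axioms except that distinct points may be at distance zero; the standard axioms, not quoted from the paper, are:

```latex
% A pseudometric d : X \times X \to [0, \infty) satisfies, for all x, y, z \in X:
d(x, x) = 0, \qquad
d(x, y) = d(y, x), \qquad
d(x, z) \le d(x, y) + d(y, z).
% Unlike a metric, d(x, y) = 0 need not imply x = y, which is what makes
% pseudometrics natural for comparing behaviourally equivalent processes.
```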
On Specification Carrying Software, its Refinement and Composition
Abstract
In this paper, we present the framework of evolving specifications (especs), implementing, in the categorical setting of algebraic specifications, a logical view of state known from Hoare logic to abstract state machines (evolving algebras). The categorical support for both top-down and bottom-up development is thus extended from the refinement and composition of the structure of programs to the refinement and composition of their behaviors. While they were originally defined as specifications carrying state machines, especs can also be viewed from another angle, as software components carrying their specifications. As first-class citizens of software systems, specifications are thus made available both statically, as generalized interfaces, and dynamically, as the carriers of adaptability. From this point of view, especs seem particularly suitable for capturing and analyzing the dynamic aspects of architectural composition. In this expository note, we survey the main ideas and outline some examples, including a summary of a method for the analysis and transformation of security protocols, where runtime architectural changes result from the internal dynamics of connectors or components. In such cases, an architectural view with the abstraction level predetermined by the features of the chosen architecture description language may conceal the essence, whereas practical application requires versatility.
REASONING TRADEOFFS IN IMPLICIT INVOCATION AND ASPECT ORIENTED LANGUAGES
, 2015
"... Verification — Formal methods, programming by contract; F.3.1 [Log ..."
Criojo: A Pivot Language for Service-Oriented Computing
"... ha l0 ..."
A Metrized Duality Theorem for Markov Processes
unknown title
Abstract
Chapter 5 is an introduction to program refinement using an approach due to Paul Curzon. Chapter 6 presents higher order logic and Chapter 7 explains how Floyd–Hoare logic can be embedded in higher order logic. The course presents classical ideas on the specification and verification of software. Although much of the material is old – see the dates on some of the cited references – it is still a key foundation for current research. This course is a prerequisite for the Part II course entitled Specification and Verification II, which makes extensive use of higher order logic (see Chapter 6) for specifying and verifying hardware. Learning Guide: These notes contain all the material that will be covered in the course. It should thus not be necessary to consult any textbooks. The copies of transparencies give the contents of the lectures. Note, however, that I sometimes end up going faster or slower than expected, so, for example, material shown in Lecture n might actually get covered in Lecture n+1 or Lecture n−1. The examination questions will be based on material in the lectures. Thus if I end up not covering some topic in the lectures, then I would not expect to set an examination question on it. This course has been fairly stable for several years, so past exam questions are a reasonable guide to the sort of thing I will set this year.