Results 1–10 of 18
Interpretation of locales in Isabelle: Theories and proof contexts
MATHEMATICAL KNOWLEDGE MANAGEMENT (MKM 2006), LNAI 4108, 2006
Cited by 22 (3 self)
The generic proof assistant Isabelle provides a landscape of specification contexts that is considerably richer than that of most other provers. Theories are the level of specification where object-logics are axiomatised. Isabelle’s proof language Isar enables local exploration in contexts generated in the course of natural deduction proofs. Finally, locales, which may be seen as detached proof contexts, offer an intermediate level of specification geared towards reuse. All three kinds of contexts are structured, to different extents. We analyse the “topology” of Isabelle’s landscape of specification contexts, by means of development graphs, in order to establish what kinds of reuse are possible.
Axiomatic constructor classes in Isabelle/HOLCF
In Proc. 18th International Conference on Theorem Proving in Higher Order Logics (TPHOLs ’05), Volume 3603 of Lecture Notes in Computer Science, 2005
Cited by 16 (5 self)
We have definitionally extended Isabelle/HOLCF to support axiomatic Haskell-style constructor classes. We have subsequently defined the functor and monad classes, together with their laws, and implemented state and resumption monad transformers as generic constructor class instances. This is a step towards our goal of giving modular denotational semantics for concurrent lazy functional programming languages, such as GHC Haskell.
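The monad laws this paper mechanises in Isabelle/HOLCF can be illustrated concretely. The following is a minimal sketch in Python, not HOLCF, checking the three laws on sample values for an option-style monad; the names `unit` and `bind` are illustrative, not the paper's.

```python
# Minimal option ("Maybe") monad: None models failure, ("Just", x) success.
# A hypothetical sketch of the monad laws proved in Isabelle/HOLCF.

def unit(x):
    """Inject a value into the monad (a 'return'-style operation)."""
    return ("Just", x)

def bind(m, f):
    """Sequence a computation: propagate None, otherwise apply f."""
    if m is None:
        return None
    _, x = m
    return f(x)

# The three monad laws, checked on sample values rather than proved:
f = lambda x: unit(x + 1)
g = lambda x: unit(x * 2)

left_identity = bind(unit(3), f) == f(3)
right_identity = bind(unit(3), unit) == unit(3)
associativity = (bind(bind(unit(3), f), g)
                 == bind(unit(3), lambda x: bind(f(x), g)))
```

A theorem prover establishes these laws for all values; the sample checks here only illustrate what the laws state.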
The Semantics of C++ Data Types: Towards Verifying Low-Level System Components
2003
Cited by 12 (6 self)
Data[Semantics int]
dt int exists : Axiom Exists (x: (pod data type?[Semantics int])): True
dt int : (pod data type?[Semantics int])
End Cxx Int
The identifiers with sshort refer to the corresponding items from the semantics of signed short. First we declare the size of the value representation; this becomes important for the unsigned integer types, see below. We define the value type Semantics int as a predicate subtype of the PVS integer type int. The axioms int longer and int contains sshort formalise the requirement that “[int] provides at least as much storage as [short int]” (3.9.1 (2)).
A Design Structure for Higher Order Quotients
In Proc. of the 18th International Conference on Theorem Proving in Higher Order Logics (TPHOLs), volume 3603 of LNCS, 2005
Cited by 8 (0 self)
The quotient operation is a standard feature of set theory, where a set is partitioned into subsets by an equivalence relation. We reinterpret this idea for higher order logic, where types are divided by an equivalence relation to create new types, called quotient types. We present a design to mechanically construct quotient types as new types in the logic, and to support the automatic lifting of constants and theorems about the original types to corresponding constants and theorems about the quotient types. This design exceeds the functionality of Harrison’s package, creating quotients of multiple mutually recursive types simultaneously, and supporting the equivalence of aggregate types, such as lists and pairs. Most importantly, this design supports the creation of higher order quotients, which enable the automatic lifting of theorems with quantification over functions of any higher order.
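The quotient-and-lift idea can be sketched at the value level. Below is a minimal illustration in Python, not the HOL design itself, of a quotient of the integers by congruence modulo n, with addition lifted to canonical representatives; all names are hypothetical.

```python
# A set-level sketch of the quotient construction described above:
# partition int by the equivalence x ~ y iff n divides x - y, pick a
# canonical representative per class, and lift addition to the quotient.
# This is an illustration only, not the mechanised HOL package.

N = 3

def canon(x: int) -> int:
    """Canonical representative of the equivalence class of x."""
    return x % N

def lift_add(a: int, b: int) -> int:
    """Addition lifted to the quotient: operate via representatives."""
    return canon(a + b)

# Well-definedness: the lifted operation must not depend on which member
# of an equivalence class we start from (1 ~ 4 and 2 ~ 5 under mod 3).
assert lift_add(canon(1), canon(2)) == lift_add(canon(4), canon(5))

# A lifted theorem: commutativity of addition survives the quotient
# because it held on the original type.
assert all(lift_add(a, b) == lift_add(b, a)
           for a in range(N) for b in range(N))
```

The package described in the paper automates exactly this kind of well-definedness obligation and theorem transport, including for higher order functions, where the hand-written version becomes impractical.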
PSOS Revisited
2003
Cited by 8 (2 self)
This paper provides a retrospective view of the design of SRI's Provably Secure Operating System (PSOS), a formally specified tagged-capability hierarchical system architecture. It examines PSOS in the light of what has happened in computer system developments since 1980, and assesses the relevance of the PSOS concepts in that light.
Reasoning about the Reliability of Diverse Two-Channel Systems in which One Channel is “Possibly Perfect”
2009
Cited by 8 (2 self)
This report refines and extends an earlier paper by the first author [25]. It considers the problem of reasoning about the reliability of fault-tolerant systems with two “channels” (i.e., components), of which one, A, because it is conventionally engineered and presumed to contain faults, supports only a claim of reliability, while the other, B, by virtue of extreme simplicity and extensive analysis, supports a plausible claim of “perfection.” We begin with the case where either channel can bring the system to a safe state. The reasoning about system probability of failure on demand (pfd) is divided into two steps. The first concerns aleatory uncertainty about (i) whether channel A will fail on a randomly selected demand and (ii) whether channel B is imperfect. It is shown that, conditional upon knowing pA (the probability that A fails on a randomly selected demand) and pB (the probability that channel B is imperfect), a conservative bound on the probability that the system fails on a randomly selected demand is simply pA × pB. That is, there is conditional independence between the events “A fails” and “B is imperfect.” The second …
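The pA × pB bound can be checked numerically. The following Monte Carlo sketch in Python is not the report's model; it adopts the conservative assumption that an imperfect channel B fails on every demand, under which the system's probability of failure on demand is exactly pA × pB, and all names are illustrative.

```python
import random

def simulate(p_a: float, p_b: float, trials: int, seed: int = 0) -> float:
    """Monte Carlo estimate of system pfd for a 1-out-of-2 system.

    Conservative model: on each demand, channel A fails with probability
    p_a; channel B is imperfect with probability p_b, and an imperfect B
    is assumed to fail on every demand.  The system fails only if A fails
    AND B is imperfect, and the two events are conditionally independent
    given (p_a, p_b), so the estimate should approach p_a * p_b.
    """
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        a_fails = rng.random() < p_a
        b_imperfect = rng.random() < p_b
        if a_fails and b_imperfect:
            failures += 1
    return failures / trials

# With p_a = 0.1 and p_b = 0.05, the estimate approaches 0.005.
est = simulate(p_a=0.1, p_b=0.05, trials=200_000)
```

Note that the report's real contribution is the second, epistemic step (uncertainty about pA and pB themselves), which a simulation conditional on known pA and pB does not capture.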
A PVS-based framework for validating compiler optimizations
In SEFM ’06: Proceedings of the Fourth IEEE International Conference on Software Engineering and Formal Methods, 2006
Cited by 7 (1 self)
An optimization can be specified as a sequential composition of predefined transformation primitives. For each primitive, we can define soundness conditions which guarantee that the transformation is semantics preserving. An optimization of a program preserves semantics if all applications of the primitives in the optimization satisfy their respective soundness conditions on the versions of the input program on which they are applied. This scheme does not directly check semantic equivalence of the input and the optimized programs and is therefore amenable to automation. Automating this scheme, however, requires a trusted framework for simulating transformation primitives and checking their soundness conditions. In this paper, we present the design of such a framework based on PVS. We have used it for specifying and validating several optimizations, viz. common subexpression elimination, optimal code placement, lazy code motion, loop invariant code motion, full and partial dead code elimination, etc.
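The scheme of primitives plus per-application soundness conditions can be miniaturised. This hypothetical Python sketch (not the paper's PVS framework) applies one primitive, constant folding, to a tiny expression language and then checks a soundness condition on that single application rather than proving whole-program equivalence; all names are illustrative.

```python
# Expressions: int literal | variable name (str) | ('add', left, right).
# One transformation primitive plus a checkable soundness condition,
# in the spirit of the scheme described above (names are hypothetical).

def fold_constants(expr):
    """Primitive: rewrite ('add', c1, c2) over literal ints to their sum."""
    if isinstance(expr, (int, str)):
        return expr
    op, l, r = expr
    l, r = fold_constants(l), fold_constants(r)
    if op == 'add' and isinstance(l, int) and isinstance(r, int):
        return l + r
    return (op, l, r)

def evaluate(expr, env):
    """Reference semantics for the expression language."""
    if isinstance(expr, int):
        return expr
    if isinstance(expr, str):
        return env[expr]
    op, l, r = expr
    assert op == 'add'
    return evaluate(l, env) + evaluate(r, env)

def sound(before, after, envs):
    """Soundness condition: this application preserved semantics."""
    return all(evaluate(before, e) == evaluate(after, e) for e in envs)

prog = ('add', ('add', 1, 2), 'x')
opt = fold_constants(prog)              # folds the inner sum to 3
ok = sound(prog, opt, [{'x': 0}, {'x': 5}])
```

In the paper's setting the soundness condition is a logical obligation discharged in PVS, not a finite test over sample environments as sketched here.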
PVS Strategies for Proving Abstraction Properties of Automata
STRATEGIES 2004 PRELIMINARY VERSION, 2004
Interpretation of locales in Isabelle: Managing dependencies between locales
2006
Cited by 3 (3 self)
Locales are the theory development modules of the Isabelle proof assistant. Interpretation is a powerful technique of theorem reuse which facilitates their automatic transport to other contexts. This paper is concerned with the interpretation of locales in the context of other locales. Our main concern is to make interpretation an effective tool in an interactive proof environment. Interpretation dependencies between locales are maintained explicitly, by means of a development graph, so that theorems proved in one locale can be propagated to other locales that interpret it. Proof tools in Isabelle are controlled by sets of default theorems they use. These sets are required to be finite, but can become infinite in the presence of arbitrary interpretations. We show that finiteness can be maintained.
Formalizing Metarouting in PVS
Cited by 2 (2 self)
... metarouting theory to aid the development of complex routing protocol models based on metarouting, which is an algebraic framework for specifying routing protocols in a restricted fashion such that the protocol is guaranteed to converge. Our formalization of metarouting theory utilizes the theory-interpretation extensions of PVS. Our use of a general-purpose theorem prover provides a structured framework for a network designer to incrementally develop and refine their algebraic routing protocol model by starting from various base routing algebras and composing them into complex algebra models with composition operators. In addition, one can leverage PVS’s type-checking capability and built-in proof engine to ensure routing model consistency.
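The algebraic conditions behind metarouting's convergence guarantee can be illustrated in miniature. The sketch below, in Python rather than PVS, checks strict monotonicity, one such condition, for a shortest-path-style algebra on sample values; all names are hypothetical and this is not the paper's formalization.

```python
# A toy routing algebra in the metarouting style: weights are path costs,
# 'extend' applies a link's cost to a path weight, and routes with smaller
# weight are preferred.  Convergence results in metarouting rest on
# algebraic conditions such as strict monotonicity, sampled here rather
# than proved.  Illustrative sketch only.

INF = float('inf')

def extend(link_cost: float, weight: float) -> float:
    """Extend a path weight across a link."""
    return link_cost + weight

def prefers(a: float, b: float) -> bool:
    """Route preference: smaller weight wins."""
    return a < b

def strictly_monotonic(costs, weights) -> bool:
    """Strict monotonicity: extending any path over any link makes it
    strictly less preferred, ruling out preference loops."""
    return all(prefers(w, extend(c, w))
               for c in costs for w in weights if w != INF)

# Positive link costs make a shortest-path algebra strictly monotonic.
ok = strictly_monotonic(costs=[1, 2, 5], weights=[0, 3, 10])
```

In the PVS formalization such a property would be stated as a theorem over all costs and weights and discharged with the proof engine, instead of being sampled on finite sets as above.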