Results 1-10 of 17
Hoare Logic and Auxiliary Variables
Formal Aspects of Computing, 1998
Abstract

Cited by 38 (0 self)
Auxiliary variables are essential for specifying programs in Hoare Logic. They are required to relate the values of variables in different states. However, the axioms and rules of Hoare Logic turn a blind eye to the role of auxiliary variables. We stipulate a new structural rule for adjusting auxiliary variables when strengthening preconditions and weakening postconditions. Courtesy of this new rule, Hoare Logic is adaptation complete, which benefits software reuse. This property is responsible for a number of improvements. Relative completeness follows uniformly from the Most General Formula property. Moreover, contrary to common belief, one can show that Hoare Logic subsumes VDM's operation decomposition rules in that every derivation in VDM can be naturally embedded in Hoare Logic. Furthermore, the new treatment leads to a significant simplification in the presentation of verification calculi dealing with more interesting features such as recursion or concurrency.
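To illustrate what the abstract means by auxiliary variables (a standard textbook example, not drawn from the paper itself): the logical variable X below occurs only in the assertions, never in the program text, and serves to connect the final value of x back to its initial value:

```latex
\{\, x = X \,\}\ \ x := x + 1\ \ \{\, x = X + 1 \,\}
```

Without X, the postcondition could not refer to the initial state at all; the structural rule the abstract announces governs how such variables may be adjusted when preconditions are strengthened and postconditions weakened.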
Hoare Logic and VDM: Machine-Checked Soundness and Completeness Proofs, 1998
Abstract

Cited by 31 (1 self)
Investigating soundness and completeness of verification calculi for imperative programming languages is a challenging task. Many incorrect results have been published in the past. We take advantage of the computer-aided proof tool LEGO to interactively establish soundness and completeness of both Hoare Logic and the operation decomposition rules of the Vienna Development Method (VDM) with respect to operational semantics. We deal with parameterless recursive procedures and local variables in the context of total correctness. As a case study, we use LEGO to verify the correctness of Quicksort in Hoare Logic. As our main contribution, we illuminate the role of auxiliary variables in Hoare Logic. They are required to relate the values of program variables in the final state with the values of program variables in the initial state. In our formalisation, we reflect their purpose by interpreting assertions as relations on states and a domain of auxiliary variables. Furthermore, we propose a new structural rule for adjusting auxiliary variables when strengthening preconditions and weakening postconditions. This rule is stronger than all previously suggested structural rules, including rules of adaptation. With the new treatment, we are able to show that, contrary to common belief, Hoare Logic subsumes VDM in that every derivation in VDM can be naturally embedded in Hoare Logic. Moreover, we establish completeness results uniformly as corollaries of Most General Formula theorems which remove the need to reason about arbitrary assertions.
Mechanical Proofs about Computer Programs, 1984
Abstract

Cited by 28 (0 self)
The Gypsy verification environment is a large computer program that supports the development of software systems and formal, mathematical proofs about their behavior. The environment provides conventional development tools, such as a parser for the Gypsy language, an editor and a compiler. These are used to evolve a library of components that define both the software and precise specifications about its desired behavior. The environment also has a verification condition generator that automatically transforms a software component and its specification into logical formulas which are sufficient to prove that the component always runs according to specification. Facilities for constructing formal, mechanical proofs of these formulas also are provided. Many of these proofs are completed automatically without human intervention. The capabilities of the Gypsy system and the results of its applications are discussed.
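As a rough illustration of what a verification condition generator produces (a textbook weakest-precondition example, not taken from the Gypsy system itself): to show that an assignment x := e establishes postcondition Q under precondition P, the generated formula is

```latex
P \;\Longrightarrow\; Q[e/x]
```

so, for instance, proving {x > 0} x := x + 1 {x > 1} reduces to the purely logical formula x > 0 implies x + 1 > 1, which a mechanical prover can then discharge.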
Monitor Classification, 1995
Abstract

Cited by 18 (0 self)
One of the most natural, elegant, and efficient mechanisms for synchronization and communication, especially for systems with shared memory, is the monitor. Over the past twenty years many kinds of monitors have been proposed and implemented, and many modern programming languages provide some form of monitor for concurrency control. This paper presents a taxonomy of monitors that encompasses all the extant monitors and suggests others not found in the literature or in existing programming languages. It discusses the semantics and performance of the various kinds of monitors suggested by the taxonomy, and it discusses programming techniques suitable to each. Categories and Subject Descriptors: D.1.3 [Programming Techniques]: Concurrent Programming; D.3.3 [Programming Languages]: Language Constructs and Features: concurrent programming structures, control structures; D.4.1 [Operating Systems]: Process Management: concurrency, mutual exclusion, scheduling, synchronization; Performan...
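One axis of any such taxonomy is the signalling discipline. Python's threading.Condition gives "signal-and-continue" (Mesa-style) semantics, one point in that design space; this bounded buffer is a standard illustration of a monitor in that style, not an example from the paper:

```python
import threading
from collections import deque

class BoundedBuffer:
    """A classic monitor: mutual exclusion plus condition synchronization.

    Python's Condition implements signal-and-continue (Mesa-style)
    semantics: a signalled thread does not run immediately, so each
    waiter must re-check its guard in a loop after waking.
    """

    def __init__(self, capacity):
        self.capacity = capacity
        self.items = deque()
        lock = threading.Lock()          # one lock shared by both conditions
        self.not_full = threading.Condition(lock)
        self.not_empty = threading.Condition(lock)

    def put(self, item):
        with self.not_full:
            while len(self.items) >= self.capacity:   # re-check guard
                self.not_full.wait()
            self.items.append(item)
            self.not_empty.notify()

    def get(self):
        with self.not_empty:
            while not self.items:                     # re-check guard
                self.not_empty.wait()
            item = self.items.popleft()
            self.not_full.notify()
            return item
```

Under "signal-and-urgent-wait" (Hoare-style) semantics, another point in the taxonomy, the while loops could be plain if tests, because the signalled thread runs before the signaller can invalidate the guard.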
Modeling Sequences within the RelView System
Journal of Universal Computer Science, 2000
Abstract

Cited by 7 (6 self)
In this paper we present a simple relation-algebraic model for sequences via binary direct sums which works especially well for the relational manipulation and prototyping system RelView (cf. [3, 1]), and we show a typical application.
Spill: a Logic Language for Writing Testable Requirements Specifications, 1997
Abstract

Cited by 6 (0 self)
A requirements specification is the first formal description of a program. Formal methods of program construction can be practically useful only when the requirements specification can be shown to be adequate. This must be done by informal means: inspection and testing. Current specification languages do not easily support both inspection and testing. We propose a specification language, Spill, which has been designed with the express purpose of providing such support. Our language is based on the ideas of logic programming, and can be thought of as both an extended and a restricted version of pure Prolog. A specification written in Spill can be read as a declarative, precise description of the properties of the specified object. The description can be used as a starting point in the formal derivation of a program. At the same time the specification is testable: it can be treated as a program that allows the user to test whether the object so described would indeed have the desired ...
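Spill itself is Prolog-based and its concrete syntax is not reproduced here; purely as a loose analogy, the following Python sketch shows the idea of a specification that reads as a declarative description yet also runs as a test oracle:

```python
from collections import Counter

# A specification of "sort" in the spirit of a testable requirements
# language: each clause is a readable property, and together they can
# be executed against a candidate output. (Illustrative analogy only;
# Spill expresses this as logic-programming clauses, not Python.)

def is_permutation(xs, ys):
    """ys contains exactly the elements of xs, multiplicities included."""
    return Counter(xs) == Counter(ys)

def is_ordered(ys):
    """ys is in non-decreasing order."""
    return all(a <= b for a, b in zip(ys, ys[1:]))

def satisfies_sort_spec(xs, ys):
    """ys is a sorted rearrangement of xs: the full requirement."""
    return is_permutation(xs, ys) and is_ordered(ys)
```

Read top-down, the clauses are documentation for inspection; applied to an implementation's output, they are a test.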
The Derivation of a Tighter Bound for Top-Down Skew Heaps, 1991
Abstract

Cited by 4 (3 self)
In this paper we present and analyze functional programs for a number of priority queue operations. These programs are based upon the top-down skew heaps, a truly elegant data structure designed by D. D. Sleator and R. E. Tarjan. We show how their potential technique can be used to determine the time complexity of functional programs. This functional approach enables us to derive a potential function leading to tighter bounds for the amortized costs of the priority queue operations. From the improved bounds it follows, for instance, that Skewsort, a simple sorting program using these operations, requires only about 1.44 N log₂ N comparisons to sort N numbers (in the worst case).

1 Amortized complexity in a functional setting

By means of a simple example we explain how the potential technique of Sleator and Tarjan [7] can be used to determine the time complexity of functional programs. In this example lists of zeros and ones are used as binary representations of natural numbers. We deno...
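The data structure the abstract analyzes is compact enough to sketch. A minimal functional-style top-down skew heap in Python, following the standard Sleator-Tarjan formulation (the textbook algorithm, not the paper's derivation or its bound):

```python
# A top-down skew heap as a purely functional structure: a node is a
# tuple (value, left, right), the empty heap is None. The only
# non-trivial operation is merge, which descends the right spines of
# both heaps and swaps children at every step; that unconditional swap
# is what makes the heap "skew" and yields O(log n) amortized merge.

def merge(h1, h2):
    if h1 is None:
        return h2
    if h2 is None:
        return h1
    if h2[0] < h1[0]:           # keep the smaller root in h1
        h1, h2 = h2, h1
    v, left, right = h1
    # merge into the right child, then swap the children
    return (v, merge(right, h2), left)

def insert(h, x):
    return merge(h, (x, None, None))

def delete_min(h):
    v, left, right = h
    return v, merge(left, right)

def skewsort(xs):
    """Skewsort: build a heap by insertion, then extract minima."""
    h = None
    for x in xs:
        h = insert(h, x)
    out = []
    while h is not None:
        v, h = delete_min(h)
        out.append(v)
    return out
```

All three operations are defined in terms of merge alone, which is why a single potential function for merge suffices to bound the amortized cost of the whole priority-queue interface.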
Relation-Algebraic Derivation of Spanning Tree Algorithms
Proc. MPC '98, LNCS 1422, 1998
Abstract

Cited by 3 (3 self)
We use Tarski's relational algebra to derive a series of algorithms for computing spanning trees of undirected graphs, including a variant of Prim's minimum spanning tree algorithm.

1 Introduction

The relational calculus has been very successful in the derivation and proof of algorithms for directed graphs. We claim that it is equally suitable for reasoning about undirected and even about weighted graphs. To prove our point we derive a series of increasingly powerful spanning tree algorithms which culminates in a variant of Prim's well-known algorithm for computing a spanning tree with minimal weight. Directed graphs and relations are essentially the same, but there are (at least) two natural ways of representing undirected graphs as relations. The first possibility is symmetric relations on vertices, also known as adjacency relations. This representation has the advantage of simplicity; it is well suited for calculations. Alternatively we can use incidence relations between the se...
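For reference, a minimal operational sketch of the target algorithm, Prim's minimum spanning tree method, in Python; the undirected graph is stored as a symmetric adjacency mapping, echoing the "symmetric relations on vertices" representation above (this is the standard heap-based rendering, not the paper's relation-algebraic derivation):

```python
import heapq

# An undirected weighted graph as a symmetric relation on vertices:
# the edge {u, v} with weight w appears as both graph[u][v] and
# graph[v][u].

def prim(graph, start):
    """Return the edge list of a minimum spanning tree grown from start."""
    visited = {start}
    tree = []
    frontier = [(w, start, v) for v, w in graph[start].items()]
    heapq.heapify(frontier)
    while frontier:
        w, u, v = heapq.heappop(frontier)   # lightest edge leaving the tree
        if v in visited:
            continue
        visited.add(v)
        tree.append((u, v, w))
        for nxt, w2 in graph[v].items():
            if nxt not in visited:
                heapq.heappush(frontier, (w2, v, nxt))
    return tree
```

The greedy invariant, that the chosen edges always form a subtree of some minimum spanning tree, is exactly the kind of property the paper establishes calculationally rather than operationally.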
Separation of Correctness and Complexity in Algorithm Design, 1993
Abstract

Cited by 3 (3 self)
In this paper we propose a new approach to the design of algorithms. This approach is based on the view that all algorithms are composed of a computation and a control component, and that these components can be designed separately. The computation component is responsible for the correctness of an algorithm. It embodies the computational knowledge about a problem domain that is needed to solve the corresponding problem. The control component governs complexity aspects of the solution method by directing the usage of the computational knowledge. This division allows the problem of how to construct an algorithm to be split into two smaller problems: "What are the elementary units of computational knowledge for the problem at hand?" and "In what order should these units be used to (efficiently) obtain a solution?" This way, the concerns of correctness and complexity are separated. We assert that there are many advantages in separating correctness and complexity in algorithm design. Hence, we examine properties of programming formalisms that influence the possibility of separating the design of the computation and control components of an algorithm.
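The computation/control split the abstract describes can be made concrete with a toy example (my own rendering, not the paper's formalism): the computation component enumerates correctness-preserving steps for computing a gcd, and the control component merely decides which applicable step to fire next.

```python
# Computation component: every step listed here preserves gcd(a, b),
# so correctness holds regardless of the order in which steps fire.
def steps(a, b):
    s = []
    if a > b > 0:
        s.append((a - b, b))
    if b > a > 0:
        s.append((a, b - a))
    return s

# Control component: a strategy for choosing among applicable steps.
# Only efficiency, never correctness, depends on this choice.
def gcd(a, b):
    while True:
        applicable = steps(a, b)
        if not applicable:
            return a if a else b   # no step applies: a == b or one is 0
        a, b = applicable[0]       # trivial strategy: take the first step
```

Replacing the subtraction steps with remainder steps, a change to the computation component, or the first-step strategy with a smarter one, a change to control, can be done and justified independently, which is the paper's point.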
A mechanically-checked correctness proof of a floating-point search program, 1990
Abstract

Cited by 2 (2 self)
representing the official policies, either expressed or implied, of