Results 1–10 of 11
A Framework for Defining Logics
 Journal of the Association for Computing Machinery
, 1993
Abstract

Cited by 716 (39 self)
The Edinburgh Logical Framework (LF) provides a means to define (or present) logics. It is based on a general treatment of syntax, rules, and proofs by means of a typed calculus with dependent types. Syntax is treated in a style similar to, but more general than, Martin-Löf's system of arities. The treatment of rules and proofs focuses on his notion of a judgement. Logics are represented in LF via a new principle, the judgements-as-types principle, whereby each judgement is identified with the type of its proofs. This allows for a smooth treatment of discharge and variable-occurrence conditions and leads to a uniform treatment of rules and proofs whereby rules are viewed as proofs of higher-order judgements and proof checking is reduced to type checking. The practical benefit of our treatment of formal systems is that logic-independent tools such as proof editors and proof checkers can be constructed.
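The judgements-as-types idea can be illustrated with a small analogue in Lean (not the LF encoding itself, just a sketch of the principle): the judgement "n is even" is identified with a type, its derivations are terms of that type, and checking a derivation is exactly type checking.

```lean
-- Illustrative analogue in Lean, not LF: the judgement "n is even"
-- is a type, and its inhabitants are derivations of that judgement.
inductive Even : Nat → Prop where
  | zero : Even 0
  | step : {n : Nat} → Even n → Even (n + 2)

-- Proof checking reduces to type checking: this term is accepted
-- precisely because it has type `Even 4`.
example : Even 4 := Even.step (Even.step Even.zero)
```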
Using Typed Lambda Calculus to Implement Formal Systems on a Machine
 Journal of Automated Reasoning
, 1992
Abstract

Cited by 85 (14 self)
this paper and the LF. In particular, the idea of having an operator T : Prop → Type appears already in De Bruijn's earlier work, as does the idea of having several judgements. The paper [24] describes the basic features of the LF. In this paper we provide a broader illustration of its applicability and discuss to what extent it is successful. The analysis (of the formal presentation) of a system carried out through encoding often illuminates the system itself. This paper will also deal with this phenomenon.
Inferring the Equivalence of Functional Programs that Mutate Data
 Theoretical Computer Science
, 1992
Abstract

Cited by 26 (7 self)
In this paper we study the constrained equivalence of programs with effects. In particular, we present a formal system for deriving such equivalences. Constrained equivalence is defined via a model-theoretic characterization of operational, or observational, equivalence called strong isomorphism. Operational equivalence, as introduced by Morris [23] and Plotkin [27], treats programs as black boxes. Two expressions are operationally equivalent if they are indistinguishable in all program contexts. This equivalence is the basis for soundness results for program calculi and program transformation theories. Strong isomorphism, as introduced by Mason [14], also treats programs as black boxes. Two expressions are strongly isomorphic if in all memory states they return the same value and have the same effect on memory (modulo the production of garbage). Strong isomorphism implies operational equivalence. The converse is true for first-order languages; it is false for full higher-order languages. However, even in the higher-order case, it remains a useful tool for establishing equivalence. Since strong isomorphism is defined by quantifying over memory states, rather than program contexts, it is a simple matter to restrict this equivalence to those memory states which satisfy a set of constraints. It is for this reason that strong isomorphism is a useful relation, even in the higher-order case. The formal system we present defines a single-conclusion consequence relation Σ ⊢ Φ, where Σ is a finite set of constraints and Φ is an assertion. The semantics of the formal system is given by a semantic consequence relation, Σ ⊨ Φ, defined in terms of a class of memory models for assertions and constraints. The assertions we consider are of the following two forms...
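The key point of strong isomorphism is that it quantifies over memory states rather than program contexts. A minimal sketch of that idea (the names `frag_a`, `frag_b`, and the dict-based store are illustrative, not the paper's formalism): model a memory state as a dict, a program fragment as a function from a state to a (value, new state) pair, and test the two fragments on a finite sample of states.

```python
# Hedged sketch: strong isomorphism quantifies over memory states, not
# program contexts.  A state is a dict; a fragment maps a state to a
# (value, new_state) pair.  All names here are illustrative.

def frag_a(state):
    s = dict(state)          # work on a copy of the store
    s["x"] = s["x"] + 1      # destructive update of cell x
    return s["x"], s

def frag_b(state):
    s = dict(state)
    v = s["x"]
    s["x"] = v + 1           # same effect, written differently
    return v + 1, s

def strongly_isomorphic(f, g, sample_states):
    """Check f and g return the same value and leave the same memory on
    each sample state (a finite approximation of 'all memory states')."""
    return all(f(dict(s)) == g(dict(s)) for s in sample_states)

print(strongly_isomorphic(frag_a, frag_b, [{"x": n} for n in range(5)]))
```

Restricting to constrained states is then just filtering `sample_states` by the constraints, which is why the definition adapts so easily to constrained equivalence.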
A Programming Logic for Java Bytecode Programs
 In Proceedings of the 16th International Conference on Theorem Proving in Higher Order Logics, volume 2758 of Lecture Notes in Computer Science
, 2003
Abstract

Cited by 12 (1 self)
Encoding Natural Semantics in Coq
 In Proc. AMAST, LNCS 936
, 1995
Abstract

Cited by 10 (0 self)
We address here the problem of automatically translating the Natural Semantics of programming languages to Coq, in order to formally prove general properties of languages. Natural Semantics [18] is a formalism for specifying the semantics of programming languages inspired by Plotkin's Structural Operational Semantics [22]. The Coq proof development system [12], based on the Calculus of Constructions extended with inductive types (CCind), provides mechanized support including tactics for building goal-directed proofs. Our representation of a language in Coq is influenced by the encoding of logics used by Church [6] and in the Edinburgh Logical Framework (ELF) [15, 3]. 1 Introduction The motivation for our work is the need for an environment to help develop proofs in Natural Semantics. The interactive programming environment generator Centaur [17] allows us to compile a Natural Semantics specification of a given language into executable code (type-checkers, evaluators, compilers, program t...
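The flavour of such an encoding can be sketched by representing a big-step (natural-semantics) evaluation judgement as an inductive relation. The example below is written in Lean rather than Coq, purely as an illustration; the toy language and names are invented for the sketch.

```lean
-- Illustrative sketch (Lean rather than Coq): natural-semantics
-- evaluation of a toy arithmetic language as an inductive judgement.
inductive Expr where
  | num : Nat → Expr
  | add : Expr → Expr → Expr

-- "e evaluates to n", defined by the two inference rules.
inductive Eval : Expr → Nat → Prop where
  | num  : (n : Nat) → Eval (.num n) n
  | plus : Eval e₁ n₁ → Eval e₂ n₂ → Eval (.add e₁ e₂) (n₁ + n₂)

-- A derivation in the semantics is a proof term, checked by the kernel.
example : Eval (.add (.num 1) (.num 2)) 3 :=
  Eval.plus (Eval.num 1) (Eval.num 2)
```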
Higher-order Representation of Substructural Logics
, 2009
Abstract

Cited by 4 (0 self)
We present a technique for higher-order representation of substructural logics such as linear or modal logic. We show that such logics can be encoded in the (ordinary) Logical Framework, without any linear or modal extensions. Using this encoding, metatheoretic proofs about such logics can easily be developed in the Twelf proof assistant.
Algorithm-Independent Framework for Verifying Integer Constraints
, 2000
Abstract

Cited by 2 (0 self)
Proof-carrying code (PCC), as pioneered by Necula and Lee, allows a code producer to provide a compiled program to a host, along with a formal proof of safety. PCC-based systems often rely on solving integer constraints to prove the soundness of index types and to control resource consumption. Unfortunately, existing approaches often require the inclusion of an oracle-like constraint solver in the trusted computing base (TCB), or at least lock the safety policy to one particular solver. This paper presents a feasibility study for dissociating the constraint solver from the TCB and the safety policy from the actual solver algorithm. To demonstrate this, we produce a simple framework, show how to adapt popular solvers such as the Omega test and the Simplex method to it, and study some of its properties.
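The general idea of moving the solver out of the TCB can be sketched with a Farkas-style certificate for linear integer constraints: an untrusted solver emits non-negative multipliers witnessing infeasibility, and only a small checker must be trusted. The representation below (`check_unsat_certificate`, coefficient lists) is an invented illustration, not the paper's framework.

```python
# Hedged sketch: the untrusted solver produces a certificate (here,
# Farkas-style non-negative multipliers); only this small checker
# needs to sit in the TCB.  Names and encoding are illustrative.

def check_unsat_certificate(constraints, multipliers):
    """Each constraint is (coeffs, const), meaning coeffs . x + const >= 0.
    The certificate is valid if the non-negative combination cancels all
    variable coefficients and leaves a negative constant, i.e. it derives
    the contradiction 0 <= c < 0."""
    if any(m < 0 for m in multipliers):
        return False
    n = len(constraints[0][0])
    combined = [0] * n
    const = 0
    for (coeffs, c), m in zip(constraints, multipliers):
        for i in range(n):
            combined[i] += m * coeffs[i]
        const += m * c
    return all(v == 0 for v in combined) and const < 0

# x >= 1 and x <= 0 are jointly unsatisfiable; multipliers (1, 1)
# combine them into the contradiction -1 >= 0.
cs = [([1], -1),   #  x - 1 >= 0
      ([-1], 0)]   # -x     >= 0
print(check_unsat_certificate(cs, [1, 1]))
```

Because the checker only verifies a given certificate, any solver algorithm that can emit certificates in this form can be swapped in without enlarging the TCB.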
Witnessing Purity, Constancy and Mutability
Abstract
Abstract. Restricting destructive update to values of a distinguished reference type prevents functions from being polymorphic in the mutability of their arguments. This restriction makes it easier to reason about program behaviour during optimisation, but the lack of polymorphism reduces the expressiveness of the language. We show how to use type-class-style constraints to reason about destructive update in a language that supports mutability polymorphism in addition to mixed strict and lazy evaluation. We use our type system to guide optimisation, and to ensure that only computations without visible side effects are suspended.
Witnessing Purity, Constancy and Mutability
Abstract
Abstract. Restricting destructive update to values of a distinguished reference type prevents functions from being polymorphic in the mutability of their arguments. This restriction makes it easier to reason about program behaviour during transformation, but the lack of polymorphism reduces the expressiveness of the language. We present a System F-style core language that uses dependently kinded proof witnesses to encode information about the mutability of values and the purity of computations. We support mixed strict and lazy evaluation, and use our type system to ensure that only computations without visible side effects are suspended.