Results 1–10 of 38
Using first-order logic to reason about policies
 In Proc. 16th IEEE Computer Security Foundations Workshop (CSFW’03)
, 2003
Abstract

Cited by 76 (5 self)
A policy describes the conditions under which an action is permitted or forbidden. We show that a fragment of (multi-sorted) first-order logic can be used to represent and reason about policies. Because we use first-order logic, policies have a clear syntax and semantics. We show that further restricting the fragment results in a language that is still quite expressive yet is also tractable. More precisely, questions about entailment, such as ‘May Alice access the file?’, can be answered in time that is a low-order polynomial (indeed, almost linear in some cases), as can questions about the consistency of policy sets.
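The entailment question quoted above (‘May Alice access the file?’) can be illustrated with a minimal forward-chaining sketch over ground Horn-style policy rules. This is an assumption-laden toy, not the paper's restricted first-order fragment; the facts, rule, and names ("alice", "file1") are invented for illustration.

```python
# Hypothetical sketch (not the paper's formalism): a policy as Horn-style
# rules over ground facts; entailment is decided by forward chaining.

def entails(facts, rules, query):
    """Saturate the fact set under the rules, then test the query."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if body <= facts and head not in facts:
                facts.add(head)
                changed = True
    return query in facts

facts = {("owner", "alice", "file1")}
rules = [
    # An owner of a file may read it (illustrative rule).
    (frozenset({("owner", "alice", "file1")}), ("may_read", "alice", "file1")),
]
print(entails(facts, rules, ("may_read", "alice", "file1")))  # True
```

Saturation over ground rules like this runs in time roughly linear in the number of rule applications, which is the flavor of tractability the abstract claims for the restricted fragment.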
NORA/HAMMR: Making Deduction-Based Software Component Retrieval Practical
, 1997
Abstract

Cited by 39 (4 self)
Deduction-based software component retrieval uses pre- and postconditions as indexes and search keys and an automated theorem prover (ATP) to check whether a component matches. This idea is very simple but the vast number of arising proof tasks makes a practical implementation very hard. We thus pass the components through a chain of filters of increasing deductive power. In this chain, rejection filters based on signature matching and model checking techniques are used to rule out non-matches as early as possible and to prevent the subsequent ATP from "drowning." Hence, intermediate results of reasonable precision are available at (almost) any time of the retrieval process. The final ATP step then works as a confirmation filter to lift the precision of the answer set. We implemented a chain which runs fully automatically and uses MACE for model checking and the automated prover SETHEO as confirmation filter. We evaluated the system over a medium-sized collection of components. The resul...
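The filter-chain architecture described above can be sketched as a pipeline in which cheap rejection filters run first and the expensive confirmation check runs last. Everything here is a made-up stand-in, not NORA/HAMMR code: the "ATP" confirmation step is replaced by a denser sample check, and the components and search key are toy data.

```python
# Hedged sketch of a retrieval filter chain: signature matching, then a
# cheap model-check-style rejection filter, then a confirmation filter.

def signature_filter(component, key):
    # Reject unless the arity matches the search key.
    return component["arity"] == key["arity"]

def model_check_filter(component, key):
    # Reject if a sampled input already violates the key's postcondition.
    return all(key["post"](x, component["impl"](x)) for x in key["samples"])

def confirmation_filter(component, key):
    # Stand-in for the ATP step; here just a much denser sample check.
    return all(key["post"](x, component["impl"](x)) for x in range(-50, 51))

def retrieve(components, key):
    survivors = components
    for f in [signature_filter, model_check_filter, confirmation_filter]:
        survivors = [c for c in survivors if f(c, key)]
    return survivors

components = [
    {"name": "abs", "arity": 1, "impl": abs},
    {"name": "neg", "arity": 1, "impl": lambda x: -x},
]
# Key: a unary component whose result is always non-negative.
key = {"arity": 1, "post": lambda x, y: y >= 0, "samples": [-3, 0, 7]}
print([c["name"] for c in retrieve(components, key)])  # ['abs']
```

The ordering matters for exactly the reason the abstract gives: "neg" is rejected by the cheap sample filter before the expensive confirmation step ever sees it.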
Rewrite Techniques for Transitive Relations
 In Proc. 9th IEEE Symposium on Logic in Computer Science
, 1994
Abstract

Cited by 37 (5 self)
We propose inference systems for dealing with transitive relations in the context of resolution-type theorem proving. These inference mechanisms are based on standard techniques from term rewriting and represent a refinement of chaining methods. We establish their refutational completeness and also prove their compatibility with the usual simplification techniques used in rewrite-based theorem provers. A key to the practicality of chaining techniques is the extent to which so-called variable chainings can be restricted. We demonstrate that rewrite techniques considerably restrict variable chaining, though we also show that they cannot be completely avoided for transitive relations in general. If the given relation satisfies additional properties, such as symmetry, further restrictions are possible. In particular, we discuss (partial) equivalence relations and congruence relations.
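The chaining inference being refined above is, at its core, transitivity applied to pairs of facts: from s < t and t < u derive s < u. A minimal saturation sketch (which deliberately ignores the term-ordering restrictions that make real chaining calculi practical) is:

```python
# Toy chaining: saturate a set of (s, t) facts, read as s < t, under
# transitivity.  Real chaining calculi restrict which pairs may chain
# via a term ordering; this sketch chains everything.

def chain(facts):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for (a, b) in list(facts):
            for (c, d) in list(facts):
                if b == c and (a, d) not in facts:
                    facts.add((a, d))
                    changed = True
    return facts

derived = chain({("a", "b"), ("b", "c"), ("c", "d")})
print(("a", "d") in derived)  # True
```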
Rewrite Methods for Clausal and Non-clausal Theorem Proving
 in Proceedings of the Tenth International Conference on Automata, Languages and Programming
, 1983
Abstract

Cited by 21 (10 self)
Effective theorem provers are essential for automatic verification and generation of programs. The conventional resolution strategies, albeit complete, are inefficient. On the other hand, special purpose methods, such as term rewriting systems for solving word problems, are relatively efficient but applicable to only limited classes of problems. In this paper, a simple canonical set of rewrite rules for Boolean algebra is presented. Based on this set of rules, the notion of term rewriting systems is generalized to provide complete proof strategies for first order predicate calculus. The methods are conceptually simple and can frequently utilize lemmas in proofs. Moreover, when the variables of the predicates involve some domain that has a canonical system, that system can be incorporated as rewrite rules, with the algebraic simplifications being done simultaneously with the merging of clauses. This feature is particularly useful in program verification, data type specification, and programming language design, where axioms can be expressed as equations (rewrite rules). Preliminary results from our implementation indicate that the methods are space-efficient with respect to the number of rules generated (as compared to the number of resolvents in resolution provers).
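The canonical rewrite system for Boolean algebra mentioned above normalizes formulas over exclusive-or ("+") and conjunction ("·"), with rules such as x·x → x, x+x → 0, and x·(y+z) → x·y + x·z. The sketch below computes that normal form directly (a term is a set of monomials, each monomial a set of variables) rather than by explicit rewriting, so it is an illustration of the canonical form, not of the paper's proof strategy.

```python
# Boolean-ring normal form: a term is a frozenset of monomials (XOR of
# products); a monomial is a frozenset of variables (their conjunction).

ZERO, ONE = frozenset(), frozenset({frozenset()})

def xor(p, q):                     # x + x -> 0: duplicates cancel
    return p ^ q

def mul(p, q):                     # distribute products; x·x -> x
    out = set()
    for m in p:
        for n in q:
            out ^= {m | n}         # equal products cancel pairwise
    return frozenset(out)

def var(name):
    return frozenset({frozenset({name})})

def neg(p):                        # ¬x = x + 1
    return xor(p, ONE)

def disj(p, q):                    # x ∨ y = x·y + x + y
    return xor(xor(mul(p, q), p), q)

# The law of excluded middle normalizes to the canonical form 1:
x = var("x")
print(disj(x, neg(x)) == ONE)  # True
```

Because the normal form is unique, two formulas are equivalent exactly when they normalize to the same set of monomials, which is what makes the system canonical.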
Computing finite models by reduction to functionfree clause logic
 Journal of Applied Logic
, 2007
Abstract

Cited by 21 (5 self)
Recent years have seen considerable interest in procedures for computing finite models of first-order logic specifications. One of the major paradigms, MACE-style model building, is based on reducing model search to a sequence of propositional satisfiability problems and applying (efficient) SAT solvers to them. A problem with this method is that it does not scale well because the propositional formulas to be considered may become very large. We propose instead to reduce model search to a sequence of satisfiability problems consisting of function-free first-order clause sets, and to apply (efficient) theorem provers capable of deciding such problems. The main appeal of this method is that first-order clause sets grow more slowly than their propositional counterparts, thus allowing for more space efficient reasoning. In this paper we describe our proposed reduction in detail and discuss how it is integrated into the Darwin prover, our implementation of the Model Evolution calculus. The results are general, however, as our approach can be used in principle with any system that decides the satisfiability of function-free first-order clause sets. To demonstrate its practical feasibility, we tested our approach on all satisfiable problems from the TPTP library. Our methods can solve a significant subset of these problems, which overlaps but is not included in the subset of problems solvable by state-of-the-art finite model builders such as Paradox and Mace4.
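The MACE-style loop the abstract contrasts itself with can be sketched as: for domain sizes n = 1, 2, …, ground the clauses over {0, …, n-1} and hand the resulting propositional problem to a SAT check. The sketch below uses brute-force enumeration in place of a SAT solver and a tiny clause encoding invented for illustration (each clause's variables are implicitly universally quantified); it only demonstrates why the grounding blows up with n.

```python
# Toy MACE-style finite model search.  A literal is (sign, pred, var_indices);
# a clause is a list of literals; predicates are given as (name, arity).

from itertools import product

def find_model(clauses, preds, max_n=4):
    for n in range(1, max_n + 1):
        # Ground atoms over the n-element domain: this is the part that grows.
        atoms = [(p, args) for p, k in preds
                 for args in product(range(n), repeat=k)]
        for bits in product([False, True], repeat=len(atoms)):
            val = dict(zip(atoms, bits))
            def holds(clause):
                k = max((v for _, _, vs in clause for v in vs), default=-1) + 1
                return all(
                    any(val[(p, tuple(env[v] for v in vs))] == s
                        for s, p, vs in clause)
                    for env in product(range(n), repeat=k))
            if all(holds(c) for c in clauses):
                return n, val
    return None

# A single clause {P(x)} is satisfied already in a one-element domain:
model = find_model([[(True, "P", (0,))]], [("P", 1)])
print(model[0])  # 1
```

With k-ary predicates the grounding produces n^k atoms per predicate, which is exactly the growth the function-free first-order reduction proposed above is designed to avoid.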
Ordered Chainings for Total Orderings
, 1995
Abstract

Cited by 21 (5 self)
We design new inference systems for total orderings by applying rewrite techniques to chaining calculi. Equality relations may either be specified axiomatically or built into the deductive calculus via paramodulation or superposition. We demonstrate that our inference systems are compatible with a concept of (global) redundancy for clauses and inferences that covers such widely used simplification techniques as tautology deletion, subsumption, and demodulation. A key to the practicality of chaining techniques is the extent to which so-called variable chainings can be restricted. Syntactic ordering restrictions on terms and the rewrite techniques which account for their completeness considerably restrict variable chaining. We show that variable elimination is an admissible simplification technique within our redundancy framework, and that consequently for dense total orderings without endpoints no variable chaining is needed at all.
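The variable-elimination simplification mentioned above can be illustrated for a dense total order without endpoints (e.g. the rationals): a variable x occurring only in constraints a < x and x < b can be removed, leaving a < b for every such pair. Transitivity makes this sound, and density makes it complete, since some value strictly between a and b always exists. This is an illustrative toy, not the paper's clause-level rule.

```python
# Eliminate a variable x from a set of strict-order constraints (s, t),
# read as s < t, over a dense total order without endpoints.

def eliminate(constraints, x):
    lowers = {a for (a, t) in constraints if t == x}   # a < x
    uppers = {b for (s, b) in constraints if s == x}   # x < b
    rest = {(s, t) for (s, t) in constraints if x not in (s, t)}
    # Keep exactly the consequences that bypass x.
    return rest | {(a, b) for a in lowers for b in uppers}

print(eliminate({("a", "x"), ("x", "b")}, "x"))  # {('a', 'b')}
```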
Deduction Systems Based on Resolution
, 1991
Abstract

Cited by 19 (0 self)
A general theory of deduction systems is presented. The theory is illustrated with deduction systems based on the resolution calculus, in particular with clause graphs. This theory distinguishes four constituents of a deduction system: the logic, which establishes a notion of semantic entailment; the calculus, whose rules of inference provide the syntactic counterpart of entailment; the logical state transition system, which determines the representation of formulae or sets of formulae together with their interrelationships, and also may allow additional operations reducing the search space; and the control, which comprises the criteria used to choose the most promising from among all applicable inference steps. Much of the standard material on resolution is presented in this framework. For the last two levels many alternatives are discussed. Appropriately adjusted notions of soundness, completeness, confluence, and Noetherianness are introduced in order to characterize...
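The separation of calculus from control described above can be made concrete with propositional resolution: the `resolve` function below is the calculus (the inference rule itself), while the saturation loop is one particular, deliberately naive, control strategy. This is a generic textbook sketch, not code from the paper.

```python
# Calculus: the resolution rule.  A clause is a frozenset of literals;
# a literal is (name, polarity).

def resolve(c1, c2):
    """All resolvents of two clauses."""
    out = []
    for (p, s) in c1:
        if (p, not s) in c2:
            out.append((c1 - {(p, s)}) | (c2 - {(p, not s)}))
    return out

# Control: saturate everything until the empty clause appears or
# no new clauses can be derived.

def unsatisfiable(clauses, limit=1000):
    clauses = set(clauses)
    for _ in range(limit):
        new = {r for a in clauses for b in clauses for r in resolve(a, b)}
        if frozenset() in new:
            return True                 # derived the empty clause
        if new <= clauses:
            return False                # saturated without contradiction
        clauses |= new
    return False

# {p} and {¬p} resolve to the empty clause:
print(unsatisfiable({frozenset({("p", True)}), frozenset({("p", False)})}))  # True
```

Smarter controls (set-of-support, clause-graph bookkeeping, ordering restrictions) change only the loop, not the rule, which is the point of the four-level decomposition.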
Special Cases and Substitutes for Rigid E-Unification
, 1995
Abstract

Cited by 16 (0 self)
The simultaneous rigid E-unification problem arises naturally in theorem proving with equality. This problem has recently been shown to be undecidable. This raises the question whether simultaneous rigid E-unification can usefully be applied to equality theorem proving. We give some evidence in the affirmative, by presenting a number of common special cases in which a decidable version of this problem suffices for theorem proving with equality. We also present some general decidable methods of a rigid nature that can be used for equality theorem proving and discuss their complexity. Finally, we give a new proof of undecidability of simultaneous rigid E-unification which is based on Post's Correspondence Problem, and has the interesting feature that all the positive equations used are ground equations (that is, contain no variables).
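The rigidity constraint above means each variable must be instantiated once, after which the goal equation must follow from the (now ground) equations. The brute-force sketch below makes this concrete for a single goal: it tries ground substitutions from a small candidate set and tests entailment by congruence closure. All names and the candidate set are invented for illustration; real procedures, where they exist, are far more refined.

```python
# Toy rigid E-unification: terms are strings (constants/variables) or
# tuples (fname, arg, ...); an equation is a pair of terms.

from itertools import product

def subterms(t):
    yield t
    if isinstance(t, tuple):
        for a in t[1:]:
            yield from subterms(a)

def congruent(equations, goal):
    """Ground congruence closure: does goal follow from equations?"""
    terms = list({s for l, r in equations + [goal]
                  for t in (l, r) for s in subterms(t)})
    parent = {t: t for t in terms}
    def find(t):
        while parent[t] != t:
            t = parent[t]
        return t
    def union(a, b):
        parent[find(a)] = find(b)
    for l, r in equations:
        union(l, r)
    changed = True
    while changed:                      # congruence: merge f(a), f(b) if a = b
        changed = False
        for s in terms:
            for t in terms:
                if (isinstance(s, tuple) and isinstance(t, tuple)
                        and s[0] == t[0] and len(s) == len(t)
                        and all(find(x) == find(y) for x, y in zip(s[1:], t[1:]))
                        and find(s) != find(t)):
                    union(s, t)
                    changed = True
    return find(goal[0]) == find(goal[1])

def substitute(t, sub):
    if t in sub:
        return sub[t]
    if isinstance(t, tuple):
        return (t[0],) + tuple(substitute(a, sub) for a in t[1:])
    return t

def rigid_unify(equations, goal, variables, candidates):
    for vals in product(candidates, repeat=len(variables)):
        sub = dict(zip(variables, vals))
        inst = lambda eq: (substitute(eq[0], sub), substitute(eq[1], sub))
        if congruent([inst(e) for e in equations], inst(goal)):
            return sub
    return None

# With f(a) = a given, the goal f(x) = x is rigidly solved by x := a:
print(rigid_unify([(("f", "a"), "a")], (("f", "x"), "x"), ["x"], ["a", "b"]))
# {'x': 'a'}
```

Note that the simultaneous problem (several goals sharing one substitution) is what the abstract reports as undecidable; a bounded candidate search like this one sidesteps that only by being incomplete.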