Results 1–10 of 14
A Proof Theory for Generic Judgments
, 2003
Cited by 61 (14 self)
In this paper, we do this by adding the ∇-quantifier: its role will be to declare variables to be new and of local scope. The syntax of the formula ∇x.B is like that for the universal and existential quantifiers. Following Church's Simple Theory of Types [Church 1940], formulas are given the type o, and for all types τ not containing o, ∇ is a constant of type (τ → o) → o. The expression ∇(λx.B) is usually abbreviated as simply ∇x.B if the type information is either simple to infer or not important
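For readability, the typing convention described in this abstract can be displayed as:

```latex
\nabla_\tau : (\tau \to o) \to o
\qquad\text{so that}\qquad
\nabla_\tau\,(\lambda x{:}\tau.\,B) \;\text{is abbreviated}\; \nabla x.B
```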
Model checking for π-calculus using proof search
 CONCUR, volume 3653 of LNCS
, 2005
Cited by 15 (5 self)
Abstract. Model checking for transition systems specified in π-calculus has been a difficult problem due to the infinite-branching nature of input prefix, name-restriction and scope extrusion. We propose here an approach to model checking for π-calculus by encoding it into a logic which supports reasoning about bindings and fixed points. This logic, called FOλ∆∇, is a conservative extension of Church’s Simple Theory of Types with a “generic” quantifier. By encoding judgments about transitions in π-calculus into this logic, various conditions on the scoping of names and restrictions on name instantiations are captured naturally by the quantification theory of the logic. Moreover, standard implementation techniques for (higher-order) logic programming are applicable for implementing proof search for this logic, as illustrated in a prototype implementation discussed in this paper. The use of logic variables and eigenvariables in the implementation allows for exploring the state space of processes in a symbolic way. Compositionality of properties of the transitions is a simple consequence of the meta-theory of the logic (i.e., cut elimination). We illustrate the benefits of specifying systems in this logic by studying several specifications of modal logics for π-calculus. These specifications are also executable directly in the prototype implementation of FOλ∆∇.
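The abstract's reading of judgments as fixed points can be suggested by a deliberately tiny sketch: checking a reachability ("possibly p") property over a finite labeled transition system by computing a least fixed point. This is plain Python over an explicit, invented state space, not the FOλ∆∇ encoding or its symbolic proof search.

```python
# Reachability ("possibly p") over an explicit labeled transition system,
# computed as a least fixed point: X = sat_p ∪ { s | s --a--> t, t ∈ X }.
# The states, labels and property below are invented for illustration.

def reachable_sat(trans, sat_p):
    """States from which some p-state is reachable (including p-states)."""
    sat = set(sat_p)
    changed = True
    while changed:
        changed = False
        for (s, _label, t) in trans:
            if t in sat and s not in sat:
                sat.add(s)
                changed = True
    return sat

# Tiny LTS: s0 --a--> s1 --b--> s2, with p holding only at s2.
trans = {("s0", "a", "s1"), ("s1", "b", "s2")}
print(sorted(reachable_sat(trans, {"s2"})))  # ['s0', 's1', 's2']
```

Symbolic exploration with logic variables, as in the paper's prototype, avoids enumerating states explicitly, but the fixed-point reading of the judgment is the same.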
Combining generic judgments with recursive definitions
 in "23rd Symp. on Logic in Computer Science", F. Pfenning (editor), IEEE Computer Society Press, 2008, p. 33–44, http://www.lix.polytechnique.fr/Labo/Dale.Miller/papers/lics08a.pdf
Cited by 14 (4 self)
Many semantical aspects of programming languages are specified through calculi for constructing proofs: consider, for example, the specification of structured operational semantics, labeled transition systems, and typing systems. Recent proof theory research has identified two features that allow direct, logic-based reasoning about such descriptions: the treatment of atomic judgments as fixed points (recursive definitions) and an encoding of binding constructs via generic judgments. However, the logics encompassing these two features have thus far treated them orthogonally. In particular, they have not contained the ability to form definitions of object-logic properties that themselves depend on an intrinsic treatment of binding. We propose a new and simple integration of these features within an intuitionistic logic enhanced with induction over natural numbers and we show that the resulting logic is consistent. The pivotal part of the integration allows recursive definitions to define generic judgments in general and not just the simpler atomic judgments that are traditionally allowed. The usefulness of this logic is illustrated by showing how it can provide elegant treatments of object-logic contexts that appear in proofs involving typing calculi and arbitrarily cascading substitutions in reducibility arguments.
Mixing finite success and finite failure in an automated prover
 In Proceedings of ESHOL’05: Empirically Successful Automated Reasoning in Higher-Order Logics, pages 79–98
, 2005
Cited by 14 (7 self)
Abstract. The operational semantics and typing judgements of modern programming and specification languages are often defined using relations and proof systems. In simple settings, logic programming languages can be used to provide rather direct and natural interpreters for such operational semantics. More complex features of specifications such as names and their bindings, proof rules with negative premises, and the exhaustive enumeration of state spaces all pose significant challenges to conventional logic programming systems. In this paper, we describe a simple architecture for the implementation of deduction systems that allows a specification to interleave finite success and finite failure. The implementation techniques for this prover are largely common ones from higher-order logic programming, i.e., logic variables, (higher-order pattern) unification, backtracking (using stream-based computation), and abstract syntax based on simply typed λ-terms. We present a particular instance of this prover’s architecture and its prototype implementation, Level 0/1, based on the dual interpretation of (finite) success and finite failure in proof search. We show how Level 0/1 provides a high-level and declarative implementation of model checking and bisimulation checking for the (finite) π-calculus.
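The dual interpretation of finite success and finite failure can be sketched, far more crudely than in Level 0/1, as a propositional goal interpreter in which negation succeeds exactly when its subgoal finitely fails. The goal syntax and fact base below are invented, purely to illustrate the duality.

```python
# Goals: ("atom", a) | ("and", g, g) | ("or", g, g) | ("not", g).
# "not" succeeds exactly when its subgoal finitely fails.

def prove(goal, facts):
    kind = goal[0]
    if kind == "atom":
        return goal[1] in facts                  # finite success or finite failure
    if kind == "and":
        return prove(goal[1], facts) and prove(goal[2], facts)
    if kind == "or":
        return prove(goal[1], facts) or prove(goal[2], facts)
    if kind == "not":
        return not prove(goal[1], facts)         # negation as finite failure
    raise ValueError(kind)

print(prove(("and", ("atom", "p"), ("not", ("atom", "q"))), {"p"}))  # True
```

A real prover of this kind works over terms with unification and backtracking streams rather than a boolean recursion, but the symmetric treatment of the two outcomes is the point being illustrated.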
Relating State-Based and Process-Based Concurrency through Linear Logic
, 2006
Cited by 11 (1 self)
This paper has the purpose of reviewing some of the established relationships between logic and concurrency, and of exploring new ones. Concurrent and distributed systems are notoriously hard to get right. Therefore, following an approach that has proved highly beneficial for sequential programs, much effort has been invested in tracing the foundations of concurrency in logic. The starting points of such investigations have been various idealized languages of concurrent and distributed programming, in particular the well-established state-transformation model inspired by Petri nets and multiset rewriting, and the prolific process-based models such as the π-calculus and other process algebras. In nearly all cases, the target of these investigations has been linear logic, a formal language that supports a view of formulas as consumable resources. In the first part of this paper, we review some of these interpretations of concurrent languages into linear logic. In the second part of the paper, we propose a completely new approach to understanding concurrent and distributed programming as a manifestation of logic, which yields a language that merges those two main paradigms of concurrency. Specifically, we present a new semantics for multiset rewriting founded on an alternative view of linear logic. The resulting interpretation is extended with a majority of linear connectives into the language of ω-multisets. This interpretation drops the distinction between multiset elements and rewrite rules, and considerably enriches the expressive power of standard multiset rewriting with embedded rules, choice, replication, and more. Derivations are now primarily viewed as open objects, and are closed only to examine intermediate rewriting states. The resulting language can also be interpreted as a process algebra.
For example, a simple translation maps process constructors of the asynchronous π-calculus to rewrite operators, while the structural equivalence corresponds directly to logically motivated structural properties of ω-multisets (with one exception).
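Plain multiset rewriting, the standard model that the paper's ω-multiset language generalizes, can be sketched in a few lines. The rule and state below are an invented Petri-net-style example, not drawn from the paper.

```python
from collections import Counter

# Ordinary multiset rewriting: a rule (lhs, rhs) fires when lhs is
# contained in the state multiset; firing consumes lhs and produces rhs.

def fire(state, rules):
    """Return the state after the first applicable rule, or None."""
    for lhs, rhs in rules:
        if all(state[a] >= n for a, n in lhs.items()):
            return state - lhs + rhs
    return None

rules = [(Counter({"a": 2}), Counter({"b": 1}))]  # rule: a, a -> b
state = fire(Counter({"a": 3}), rules)
print(sorted(state.elements()))  # ['a', 'b']
```

The ω-multiset language described above goes well beyond this: rules themselves become multiset elements, and choice and replication are expressed with linear connectives rather than added as ad hoc machinery.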
General structural operational semantics through categorical logic (Extended Abstract)
, 2008
Cited by 7 (6 self)
Certain principles are fundamental to operational semantics, regardless of the languages or idioms involved. Such principles include rule-based definitions and proof techniques for congruence results. We formulate these principles in the general context of categorical logic. From this general formulation we recover precise results for particular language idioms by interpreting the logic in particular categories. For instance, results for first-order calculi, such as CCS, arise from considering the general results in the category of sets. Results for languages involving substitution and name generation, such as the π-calculus, arise from considering the general results in categories of sheaves and group actions. As an extended example, we develop a tyft/tyxt-like rule format for open bisimulation in the π-calculus.
Representing and reasoning with operational semantics
 In: Proceedings of the Joint International Conference on Automated Reasoning
, 2006
Cited by 7 (2 self)
The operational semantics of programming and specification languages is often presented via inference rules and these can generally be mapped into logic programming-like clauses. Such logical encodings of operational semantics can be surprisingly declarative if one uses logics that directly account for term-level bindings and for resources, such as are found in linear logic. Traditional theorem proving techniques, such as unification and backtracking search, can then be applied to animate operational semantic specifications. Of course, one wishes to go a step further than animation: using logic to encode computation should facilitate formal reasoning directly with semantic specifications. We outline an approach to reasoning about logic specifications that involves viewing logic specifications as theories in an object-logic and then using a meta-logic to reason about properties of those object-logic theories. We motivate the principal design goals of a particular meta-logic that has been built for that purpose.
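The mapping from inference rules to logic programming-like clauses can be suggested by a minimal big-step evaluator with one branch per rule. This is an illustrative Python sketch over an invented toy language, not the λProlog-style logical encodings the abstract has in mind.

```python
# Big-step rules for a toy expression language, one branch per rule.
# Expressions: ("num", n) | ("plus", e1, e2) | ("if0", e, e1, e2)

def evaluate(e):
    tag = e[0]
    if tag == "num":          # n evaluates to n
        return e[1]
    if tag == "plus":         # if e1 => v1 and e2 => v2, then e1+e2 => v1+v2
        return evaluate(e[1]) + evaluate(e[2])
    if tag == "if0":          # branch on whether the guard evaluates to 0
        return evaluate(e[2]) if evaluate(e[1]) == 0 else evaluate(e[3])
    raise ValueError(tag)

print(evaluate(("plus", ("num", 1), ("if0", ("num", 0), ("num", 2), ("num", 3)))))  # 3
```

In a genuine logical encoding each branch would be a clause relating an expression and its value, so that unification and backtracking could also run the rules "in reverse" or leave parts of the judgment symbolic.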
A User Guide to Bedwyr
, 2006
Cited by 4 (2 self)
Some recent theoretical work in proof search has illustrated that it is possible to combine the following two computational principles into one computational logic. 1. A symmetric treatment of finite success and finite failure. This allows capturing both aspects of may and must behavior in operational semantics and mixing model checking and logic programming. 2. Direct support for λ-tree syntax, as in λProlog, via term-level λ-binders, higher-order pattern unification, and the ∇-quantifier. All these features have a clean proof theory. The combination of these features allows, for example, specifying rather declarative approaches to model checking syntactic expressions containing bindings. The Bedwyr system is intended as an implementation of these computational logic principles. Why the name Bedwyr? In the legend of King Arthur and the round table, several knights shared in the search for the holy grail. The name of one of them, Parsifal, is used for an INRIA team associated with the “Slimmer” effort. Bedwyr was another one of those knights. Wikipedia (using the spelling “Bedivere”) mentions that Bedwyr appears in Monty Python and the Holy Grail, where he is “portrayed as a master of the extremely odd logic in the ancient times, whom occasionally blunders.” Bedwyr is a reimplementation and rethinking
Toward a General Theory of Names, Binding and Scope
, 2005
Cited by 3 (0 self)
High-level formalisms for reasoning about names and binding such as de Bruijn indices, various flavors of higher-order abstract syntax, the Theory of Contexts, and nominal abstract syntax address only one relatively restrictive form of scoping: namely, unary lexical scoping, in which the scope of a (single) bound name is a subtree of the abstract syntax tree (possibly with other subtrees removed due to shadowing). Many languages exhibit binding or renaming structure that does not fit this mold. Examples include binding transitions in the π-calculus; unique identifiers in contexts, memory heaps, and XML documents; declaration scoping in modules and namespaces; anonymous identifiers in automata, type schemes, and Horn clauses; and pattern matching and mutual recursion constructs in functional languages. In these cases, it appears necessary to either rearrange the abstract syntax so that lexical scoping can be used, or revert to first-order techniques. The purpose
A proof theoretic approach to operational semantics
 in Proc. of the workshop on Algebraic Process Calculi: The First Twenty Five Years and Beyond
, 2005
Cited by 2 (0 self)
Proof theory can be applied to the problem of specifying and reasoning about the operational semantics of process calculi. We overview some recent research in which λ-tree syntax is used to encode expressions containing bindings and sequent calculus is used to reason about operational semantics. There are various benefits of this proof theoretic approach for the π-calculus: the treatment of bindings can be captured with no side conditions; bisimulation has a simple and natural specification in which the difference between bound input and bound output is characterized using different quantifiers; various modal logics for mobility can be specified declaratively; and simple logic programming-like deduction involving subsets of second-order unification provides immediate implementations of symbolic bisimulation. These benefits should extend to other process calculi as well. As partial evidence of this, a simple λ-tree syntax extension to the tyft/tyxt rule format for name-binding and name-passing is possible that allows one to conclude that (open) bisimilarity is a congruence.
Key words: operational semantics, proof theoretic specifications, λ-tree syntax, rule formats, π-calculus
A number of frameworks have been used to formalize the semantics of process calculi and, more generally, programming languages. For example, algebra, category theory, and I/O automata have been used to provide formal settings for not only specifying but also reasoning about the operational semantics of calculi and languages. In this note, we overview recent results in making use of proof theory to encode and reason about such operational semantics. By the term “proof theory” we refer to the study of proofs for logics, particularly in the style initiated by Gentzen.