Order-enriched categorical models of the classical sequent calculus
 LECTURE AT INTERNATIONAL CENTRE FOR MATHEMATICAL SCIENCES, WORKSHOP ON PROOF THEORY AND ALGORITHMS
, 2003
"... It is wellknown that weakening and contraction cause naïve categorical models of the classical sequent calculus to collapse to Boolean lattices. Starting from a convenient formulation of the wellknown categorical semantics of linear classical sequent proofs, we give models of weakening and contra ..."
Abstract

Cited by 21 (2 self)
It is well-known that weakening and contraction cause naïve categorical models of the classical sequent calculus to collapse to Boolean lattices. Starting from a convenient formulation of the well-known categorical semantics of linear classical sequent proofs, we give models of weakening and contraction that do not collapse. Cut-reduction is interpreted by a partial order between morphisms. Our models make no commitment to any translation of classical logic into intuitionistic logic and distinguish non-deterministic choices of cut-elimination. We show soundness and completeness via initial models built from proof nets, and describe models built from sets and relations.
Computation with classical sequents
 MATHEMATICAL STRUCTURES OF COMPUTER SCIENCE
, 2008
"... X is an untyped continuationstyle formal language with a typed subset which provides a CurryHoward isomorphism for a sequent calculus for implicative classical logic. X can also be viewed as a language for describing nets by composition of basic components connected by wires. These features make X ..."
Abstract

Cited by 16 (16 self)
X is an untyped continuation-style formal language with a typed subset that provides a Curry-Howard isomorphism for a sequent calculus for implicative classical logic. X can also be viewed as a language for describing nets by composition of basic components connected by wires. These features make X an expressive platform onto which algebraic objects and many different (applicative) programming paradigms can be mapped. In this paper we present the syntax and reduction rules for X and, to demonstrate its expressive power, show how elaborate calculi can be embedded into it: the λ-calculus, Bloo and Rose's calculus of explicit substitutions λx, Parigot's λµ, and Curien and Herbelin's λµµ̃.
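Bloo and Rose's λx, mentioned above, makes the substitution generated by a β-step an explicit part of the syntax, propagated by its own rewrite rules rather than performed in one meta-level operation. A minimal sketch in Python (the term representation and rule names are illustrative assumptions, not the paper's X syntax; binder names are assumed pairwise distinct):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Var: name: str
@dataclass(frozen=True)
class Lam: var: str; body: object
@dataclass(frozen=True)
class App: fn: object; arg: object
@dataclass(frozen=True)
class Sub: body: object; var: str; repl: object   # explicit closure M<x := N>

def step(t):
    """One rewrite step of a lambda-x-style calculus; returns (changed, term)."""
    if isinstance(t, App) and isinstance(t.fn, Lam):          # (B): create a closure
        return True, Sub(t.fn.body, t.fn.var, t.arg)
    if isinstance(t, Sub):
        b = t.body
        if isinstance(b, Var):                                # (Var)/(GC): hit or discard
            return True, (t.repl if b.name == t.var else b)
        if isinstance(b, App):                                # (App): push into both sides
            return True, App(Sub(b.fn, t.var, t.repl), Sub(b.arg, t.var, t.repl))
        if isinstance(b, Lam):                                # (Abs): push under the binder
            return True, Lam(b.var, Sub(b.body, t.var, t.repl))
        changed, b2 = step(b)                                 # nested closure
        if changed:
            return True, Sub(b2, t.var, t.repl)
    if isinstance(t, App):
        changed, f2 = step(t.fn)
        if changed:
            return True, App(f2, t.arg)
        changed, a2 = step(t.arg)
        if changed:
            return True, App(t.fn, a2)
    if isinstance(t, Lam):
        changed, b2 = step(t.body)
        if changed:
            return True, Lam(t.var, b2)
    return False, t

def normalize(t):
    changed = True
    while changed:
        changed, t = step(t)
    return t

print(normalize(App(Lam("x", Var("x")), Var("y"))))   # Var(name='y')
```

The point of the explicit-substitution presentation is that each small propagation step is itself a rewrite rule, which is what makes such calculi natural targets for sequent-calculus term languages like X.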
Lambda Terms for Natural Deduction, Sequent Calculus and Cut Elimination
"... It is wellknown that there is an isomorphism between natural deduction derivations and typed lambda terms. Moreover normalising these terms corresponds to eliminating cuts in the equivalent sequent calculus derivations. Several papers have been written on this topic. The correspondence between sequ ..."
Abstract

Cited by 15 (3 self)
It is well-known that there is an isomorphism between natural deduction derivations and typed lambda terms. Moreover, normalising these terms corresponds to eliminating cuts in the equivalent sequent calculus derivations. Several papers have been written on this topic. The correspondence between sequent calculus derivations and natural deduction derivations is, however, not a one-to-one map, which causes some syntactic technicalities. The correspondence is best explained by two extensionally equivalent type assignment systems for untyped lambda terms, one corresponding to natural deduction (N) and the other to sequent calculus (L). These two systems constitute different grammars for generating the same (type assignment relation for untyped) lambda terms. The second grammar is ambiguous, but the first one is not. This fact explains the many-to-one correspondence mentioned above. Moreover, the second type assignment system has a `cut-free' fragment (L^cf). This fragment generates exactly the typeable lambda terms in normal form. The cut elimination theorem becomes a simple consequence of the fact that typed lambda terms possess a normal form.
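The natural deduction system (N) assigns types with one rule per construct: an axiom for variables, →-introduction for abstractions, and →-elimination for applications. A minimal sketch for Church-style (binder-annotated) terms, together with a check for the normal forms that the cut-free fragment generates (the representation and names here are illustrative assumptions, not the paper's notation):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Base: name: str                            # base type, e.g. A
@dataclass(frozen=True)
class Arrow: dom: object; cod: object            # function type A -> B

@dataclass(frozen=True)
class Var: name: str
@dataclass(frozen=True)
class Lam: var: str; ty: object; body: object    # \x:T. M (annotated binder)
@dataclass(frozen=True)
class App: fn: object; arg: object

def infer(ctx, t):
    """Natural-deduction style type assignment: axiom, ->-intro, ->-elim."""
    if isinstance(t, Var):
        return ctx[t.name]                                        # axiom
    if isinstance(t, Lam):
        return Arrow(t.ty, infer({**ctx, t.var: t.ty}, t.body))   # ->-intro
    fn_ty = infer(ctx, t.fn)                                      # ->-elim
    assert isinstance(fn_ty, Arrow) and fn_ty.dom == infer(ctx, t.arg)
    return fn_ty.cod

def is_normal(t):
    """A term is in normal form iff it contains no beta-redex (\\x.M)N."""
    if isinstance(t, App):
        return not isinstance(t.fn, Lam) and is_normal(t.fn) and is_normal(t.arg)
    if isinstance(t, Lam):
        return is_normal(t.body)
    return True

a = Base("A")
identity = Lam("x", a, Var("x"))
print(infer({}, identity) == Arrow(a, a))        # True: \x:A. x  has type  A -> A
print(is_normal(App(identity, Var("y"))))        # False: the term is itself a redex
```

In these terms, the cut-free fragment's derivations correspond exactly to terms on which `is_normal` holds, which is why cut elimination reduces to normalisation of typed terms.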
From X to π; representing the classical sequent calculus
"... Abstract. We study the πcalculus, enriched with pairing and nonblocking input, and define a notion of type assignment that uses the type constructor →. We encode the circuits of the calculus X into this variant of π, and show that all reduction (cutelimination) and assignable types are preserved. ..."
Abstract

Cited by 12 (12 self)
We study the π-calculus, enriched with pairing and non-blocking input, and define a notion of type assignment that uses the type constructor →. We encode the circuits of the calculus X into this variant of π, and show that all reduction (cut-elimination) and assignable types are preserved. Since X enjoys the Curry-Howard isomorphism for Gentzen's calculus LK, this implies that all proofs in LK have a representation in π.
Conservative extensions of the λ-calculus for the computational interpretation of sequent calculus
, 2002
"... ..."
A logical interpretation of the λ-calculus into the π-calculus, preserving spine reduction and types
, 2009
"... ..."
Completeness and Partial Soundness Results for Intersection & Union Typing for λµµ̃
 Annals of Pure and Applied Logic
"... This paper studies intersection and union type assignment for the calculus λµ ˜µ [17], a proofterm syntax for Gentzen’s classical sequent calculus, with the aim of defining a typebased semantics, via setting up a system that is closed under conversion. We will start by investigating what the minima ..."
Abstract

Cited by 6 (6 self)
This paper studies intersection and union type assignment for the calculus λµµ̃ [17], a proof-term syntax for Gentzen's classical sequent calculus, with the aim of defining a type-based semantics, via setting up a system that is closed under conversion. We start by investigating the minimal requirements for a system for λµµ̃ to be closed under subject expansion; this coincides with System M∩∪, the notion defined in [19]; however, we show that this system is not closed under subject reduction, so our goal cannot be achieved. We then show that System M∩∪ is also not closed under subject expansion, but can recover from this by presenting System M^C as an extension of M∩∪ (by adding typing rules) and showing that it satisfies subject expansion; it still lacks subject reduction. We show how to restrict M∩∪ so that it satisfies subject reduction as well by limiting the applicability of the type assignment rules, but only when limiting reduction to (confluent) call-by-name or call-by-value reduction; in restricting the system, we sacrifice subject expansion. These results combined show that a sound and complete intersection and union type assignment system cannot be defined for λµµ̃ with respect to full reduction.
Strong normalisation for a Gentzen-like cut-elimination procedure
 In TLCA
, 2001
"... Abstract. In this paper we introduce a cutelimination procedure for classical logic, which is both strongly normalising and consisting of local proof transformations. Traditional cutelimination procedures, including the one by Gentzen, are formulated so that they only rewrite neighbouring inferenc ..."
Abstract

Cited by 6 (0 self)
In this paper we introduce a cut-elimination procedure for classical logic which is strongly normalising and consists of local proof transformations. Traditional cut-elimination procedures, including the one by Gentzen, are formulated so that they only rewrite neighbouring inference rules; that is, they use local proof transformations. Unfortunately, such local proof transformations, if defined naïvely, break the strong normalisation property. Inspired by work of Bloo and Geuvers concerning the λx-calculus, we show that a simple trick allows us to preserve this property in our cut-elimination procedure. We establish this property using the recursive path ordering by Dershowitz.
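The recursive path ordering used here proves termination by comparing terms recursively via a precedence on function symbols. As an illustration only (not the paper's actual proof), here is a minimal sketch of the closely related lexicographic path ordering (LPO), which compares argument tuples lexicographically where Dershowitz's RPO uses multisets. Terms are represented as Python strings (variables) or tuples `(symbol, arg1, ...)`, an encoding assumed for this sketch:

```python
def occurs(x, t):
    """Does variable x occur in term t?"""
    if isinstance(t, str):
        return t == x
    return any(occurs(x, a) for a in t[1:])

def gt(prec, s, t):
    """s >_lpo t, with the precedence of a symbol given by its position
    in the list `prec` (later entries are bigger). All function symbols
    appearing in s and t must be listed in `prec`."""
    if isinstance(s, str):
        return False                                 # variables are minimal
    if isinstance(t, str):
        return occurs(t, s)                          # s > x iff x occurs in s
    ss, ts = s[1:], t[1:]
    if any(si == t or gt(prec, si, t) for si in ss):     # subterm case
        return True
    if prec.index(s[0]) > prec.index(t[0]):              # bigger head symbol:
        return all(gt(prec, s, tj) for tj in ts)         #   s must dominate each tj
    if s[0] == t[0] and all(gt(prec, s, tj) for tj in ts):
        for si, ti in zip(ss, ts):                       # equal heads: compare
            if si != ti:                                 # arguments lexicographically
                return gt(prec, si, ti)
    return False

# With f bigger than s in the precedence, f(x) >_lpo s(s(x)):
prec = ["s", "f"]
print(gt(prec, ("f", "x"), ("s", ("s", "x"))))   # True
```

Because any rewrite system whose rules all decrease in such an ordering terminates, exhibiting one ordering that orients every local proof transformation yields strong normalisation in one stroke.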
Revisiting cut-elimination: One difficult proof is really a proof
 RTA 2008
, 2008
"... Powerful proof techniques, such as logical relation arguments, have been developed for establishing the strong normalisation property of termrewriting systems. The first author used such a logical relation argument to establish strong normalising for a cutelimination procedure in classical logic. ..."
Abstract

Cited by 5 (3 self)
Powerful proof techniques, such as logical relation arguments, have been developed for establishing the strong normalisation property of term-rewriting systems. The first author used such a logical relation argument to establish strong normalisation for a cut-elimination procedure in classical logic. He presented a rather complicated, but informal, proof establishing this property. The difficulties in this proof arise from a quite subtle substitution operation. We have formalised this proof in the theorem prover Isabelle/HOL using the Nominal Datatype Package, closely following the first author's PhD thesis. In the process, we identified and resolved a gap in one central lemma and a number of smaller problems in others. We also needed to make one informal definition rigorous. We thus show that the original proof is indeed a proof and that present automated proving technology is adequate for formalising such difficult proofs.
Strategic Computation and Deduction
, 2009
"... I'd like to conclude by emphasizing what a wonderful eld this is to work in. Logical reasoning plays such a fundamental role in the spectrum of intellectual activities that advances in automating logic will inevitably have a profound impact in many intellectual disciplines. Of course, these thi ..."
Abstract

Cited by 4 (3 self)
I'd like to conclude by emphasizing what a wonderful field this is to work in. Logical reasoning plays such a fundamental role in the spectrum of intellectual activities that advances in automating logic will inevitably have a profound impact in many intellectual disciplines. Of course, these things take time. We tend to be impatient, but we need some historical perspective. The study of logic has a very long history, going back at least as far as Aristotle. During some of this time not very much progress was made. It's gratifying to realize how much has been accomplished in the less than fifty years since serious efforts to mechanize logic began.