Results 1 – 9 of 9
Categorial Type Logics
Handbook of Logic and Language, 1997
Abstract

Cited by 239 (5 self)
Contents:
1 Introduction: grammatical reasoning (1)
2 Linguistic inference: the Lambek systems (5)
2.1 Modeling grammatical composition (5)
2.2 Gentzen calculus, cut elimination and decidability (9)
2.3 Discussion: options for resource management (13)
3 The syntax-semantics interface: proofs and readings (16)
3.1 Term assignment for categorial deductions (17)
3.2 Natural language interpretation: the deductive view (21)
4 Grammatical composition: multimodal systems (26)
4.1 Mixed inference: the modes of composition (26)
4.2 Grammatical composition: unary operations (30)
4.2.1 Unary connectives: logic and structure (31)
4.2.2 Applications: imposing constraints, structural relaxation ...
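The core of the Lambek systems listed above is directed function application over categorial types: X/Y looks for a Y on its right, Y\X for a Y on its left. A minimal sketch of this type reduction, with an illustrative tuple encoding that is not taken from the chapter:

```python
# Categorial types: atomic strings, or ('/', result, arg) and ('\\', arg, result).
# Encoding and function names are illustrative, not the chapter's notation.

def forward_app(left, right):
    """X/Y combined with a following Y yields X (forward application)."""
    if isinstance(left, tuple) and left[0] == '/' and left[2] == right:
        return left[1]
    return None

def backward_app(left, right):
    """Y combined with a following Y\\X yields X (backward application)."""
    if isinstance(right, tuple) and right[0] == '\\' and right[1] == left:
        return right[2]
    return None

def reduce_once(types):
    """Apply one application step to the first reducible adjacent pair."""
    for i in range(len(types) - 1):
        for rule in (forward_app, backward_app):
            result = rule(types[i], types[i + 1])
            if result is not None:
                return types[:i] + [result] + types[i + 2:]
    return types

# "John loves Mary": np, (np\s)/np, np  reduces to the sentence type s
tv = ('/', ('\\', 'np', 's'), 'np')   # transitive verb type
seq = ['np', tv, 'np']
prev = None
while seq != prev:                    # stop when no rule applies
    prev, seq = seq, reduce_once(seq)
print(seq)  # ['s']
```

This is only the application fragment; the Lambek systems also derive types via hypothetical reasoning (the introduction rules), which a table-driven reducer like this does not capture.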
A Computational Interpretation of Modal Proofs
Proof Theory of Modal Logics, 1994
Abstract

Cited by 28 (2 self)
The usual (e.g. Prawitz's) treatment of natural deduction for modal logics involves a complicated rule for the introduction of necessity, since the naive one does not allow normalization. We propose natural deduction systems for the positive fragments of the modal logics K, K4, KT, and S4, extending previous work by Masini on a two-dimensional generalization of Gentzen's sequents (2-sequents). The modal rules closely match the standard rules for a universal quantifier, and the different logics are obtained by simple conditions on the elimination rule for □. We give an explicit term calculus corresponding to proofs in these systems and, after defining a notion of reduction on terms, we prove its confluence and strong normalization.

1. Introduction. Proof theory of modal logics, though widely studied since the fifties, has always been a delicate subject, the main reason being the apparent impossibility of obtaining elegant, natural systems for intensional operators (with the excellent ex...
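One standard computational gloss on such systems reads □A as a suspended computation of type A, with the S4 axioms as operations on thunks. The sketch below follows that staged-computation reading for illustration; it is an assumption on my part, not the 2-sequent term calculus of the paper:

```python
# Illustrative staged-computation reading of S4 necessity: box A as a thunk.
# This is a common computational gloss, not the paper's 2-sequent calculus.

def box(value):
    """Necessitation: from a (closed) proof of A, a proof of box A."""
    return lambda: value

def unbox(thunk):
    """Axiom T elimination: box A implies A (force the thunk)."""
    return thunk()

def dup(thunk):
    """Axiom 4: box A implies box box A (re-suspend the suspension)."""
    return lambda: thunk

p = box(42)
print(unbox(p))                # 42
print(unbox(unbox(dup(p))))    # 42
```

The K and K4 fragments lack T, so in those logics `unbox` would not be available as a closed operation; only the paper's side conditions on the elimination rule distinguish the four systems.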
Residual theory in λ-calculus: A formal development
Journal of Functional Programming, 1994
Abstract

Cited by 20 (2 self)
Abstract. We present the complete development, in Gallina, of the residual theory of β-reduction in pure λ-calculus. The main result is the Prism Theorem, and its corollary, Lévy's Cube Lemma, a strong form of the parallel-moves lemma, itself a key step towards the confluence theorem and its usual corollaries (Church-Rosser, uniqueness of normal forms). Gallina is the specification language of the Coq Proof Assistant [7, 11]. It is a specific concrete syntax for its abstract framework, the Calculus of Inductive Constructions [15]. It may be thought of as a smooth mixture of higher-order predicate calculus with recursive definitions, inductively defined datatypes, and inductive predicate definitions reminiscent of logic programming. The development presented here was fully checked in the current distribution version Coq V5.8. We just state the lemmas in the order in which they are proved, omitting the proof justifications. The full transcript is available as a standard library in the distribution of Coq.
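The parallel-moves lemma mentioned above concerns complete developments: contracting all currently visible redexes of a term at once. A minimal sketch over de Bruijn terms, with an illustrative tuple encoding of my own rather than the paper's Gallina formalization:

```python
# Complete development of all visible redexes in a pure λ-term, de Bruijn style.
# Terms: ('var', n) | ('lam', body) | ('app', f, a). Illustrative encoding only.

def shift(t, d, c=0):
    """Add d to every free index of t at or above cutoff c."""
    tag = t[0]
    if tag == 'var':
        return ('var', t[1] + d) if t[1] >= c else t
    if tag == 'lam':
        return ('lam', shift(t[1], d, c + 1))
    return ('app', shift(t[1], d, c), shift(t[2], d, c))

def subst(t, s, n=0):
    """Substitute s for index n in t, adjusting the remaining free indices."""
    tag = t[0]
    if tag == 'var':
        if t[1] == n:
            return shift(s, n)
        return ('var', t[1] - 1) if t[1] > n else t
    if tag == 'lam':
        return ('lam', subst(t[1], s, n + 1))
    return ('app', subst(t[1], s, n), subst(t[2], s, n))

def develop(t):
    """Contract every redex of t simultaneously (one complete development)."""
    tag = t[0]
    if tag == 'var':
        return t
    if tag == 'lam':
        return ('lam', develop(t[1]))
    f, a = develop(t[1]), develop(t[2])
    if f[0] == 'lam':
        return subst(f[1], a)
    return ('app', f, a)

# (λ. 0 0) (λ. 0): one development yields (λ. 0)(λ. 0), because the redex it
# creates was not present in the original term; a second development finishes.
t0 = ('app', ('lam', ('app', ('var', 0), ('var', 0))), ('lam', ('var', 0)))
t1 = develop(t0)
t2 = develop(t1)
print(t2)  # ('lam', ('var', 0))
```

The example makes the key distinction of residual theory concrete: a development contracts only residuals of existing redexes, never redexes created along the way.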
The theory of calculi with explicit substitutions revisited
CSL 2007
Abstract

Cited by 6 (1 self)
Calculi with explicit substitutions (ES) are widely used in different areas of computer science. Complex systems with ES were developed over the last 15 years to capture the good computational behaviour of the original systems (with meta-level substitutions) they implement. In this paper we first survey previous work in the domain, pointing out the motivations and challenges that guided the development of such calculi. Then we use very simple technology to establish a general theory of explicit substitutions for the λ-calculus which enjoys fundamental properties such as simulation of one-step β-reduction, confluence on meta-terms, preservation of β-strong normalisation, strong normalisation of typed terms, and full composition. The calculus also admits a natural translation into Linear Logic's proof-nets.
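The defining move of such calculi is to turn the meta-level substitution of β-reduction into an object-level construct with its own propagation rules. A toy named-variable sketch of that idea (simplified: it assumes no variable capture, and it is not the particular calculus of the paper):

```python
# Toy explicit-substitution calculus over named terms. A beta step becomes a
# substitution-introduction step plus explicit propagation rules.
# Terms: ('var', x) | ('lam', x, body) | ('app', f, a) | ('sub', body, x, n)
# Simplifying assumption: bound names never capture variables of n.

def beta_intro(t):
    """(λx.M) N  →  M[x := N], with the substitution left explicit."""
    if t[0] == 'app' and t[1][0] == 'lam':
        _, (_, x, body), arg = t
        return ('sub', body, x, arg)
    return t

def push(t):
    """Propagate an explicit substitution through the term, rule by rule."""
    if t[0] != 'sub':
        return t
    _, body, x, n = t
    tag = body[0]
    if tag == 'var':
        return n if body[1] == x else body
    if tag == 'app':
        return ('app', push(('sub', body[1], x, n)), push(('sub', body[2], x, n)))
    if tag == 'lam' and body[1] != x:   # no capture, by assumption
        return ('lam', body[1], push(('sub', body[2], x, n)))
    return body  # λx.M: the substitution for x vanishes under its own binder

# ((λx. x x) y): one beta-introduction step, then full propagation
t = ('app', ('lam', 'x', ('app', ('var', 'x'), ('var', 'x'))), ('var', 'y'))
print(push(beta_intro(t)))  # ('app', ('var', 'y'), ('var', 'y'))
```

Properties like preservation of β-strong normalisation are precisely about what can go wrong when `push`-style rules are allowed to interleave and compose; this sketch only shows the simulation direction.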
A proposal for broad spectrum proof certificates
Abstract

Cited by 5 (4 self)
Abstract. Recent developments in the theory of focused proof systems provide flexible means for structuring proofs within the sequent calculus. This structuring is organized around the construction of “macro”-level inference rules based on the “micro” inference rules which introduce single logical connectives. After presenting focused proof systems for first-order classical logics (one with and one without fixed points and equality), we illustrate several examples of proof certificate formats that are derived naturally from the structure of such focused proof systems. In principle, a proof certificate contains two parts: the first part describes how macro rules are defined in terms of micro rules, and the second part describes a particular proof object using the macro rules. The first part, which is based on the vocabulary of focused proof systems, describes a collection of macro rules that can be used to directly present the structure of proof evidence captured by a particular class of computational logic systems. While such proof certificates can capture a wide variety of proof structures, a proof checker can remain simple, since it must only understand the micro-rules and the discipline of focusing. Since proofs and proof certificates are often likely to be large, there must be some flexibility in allowing proof certificates to elide subproofs; as a result, proof checkers will necessarily be required to perform (bounded) proof search in order to reconstruct missing subproofs. Thus, proof checkers will need to do unification and restricted backtracking search.
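The interplay of explicit certificate steps and bounded reconstruction of elided subproofs can be illustrated in miniature. The sketch below checks sequents Γ ⊢ A of minimal implicational logic, where `None` in a certificate marks an elided subproof that the checker must re-find by depth-bounded search; the format is my own illustration, not the paper's focused-proof certificates:

```python
# Toy certificate checker with bounded reconstruction of elided subproofs.
# Formulas: atom strings or ('->', a, b). Certificates:
#   ('ax',)              use a hypothesis
#   ('intro', c)         implication introduction
#   ('elim', a, c1, c2)  modus ponens with stated minor premise a
#   None                 elided subproof: depth-bounded search
# Illustrative format, not the paper's focused-proof certificates.

def check(gamma, goal, cert, depth=4):
    if cert is None:                           # elided: reconstruct by search
        return search(gamma, goal, depth)
    tag = cert[0]
    if tag == 'ax':
        return goal in gamma
    if tag == 'intro' and isinstance(goal, tuple) and goal[0] == '->':
        return check(gamma | {goal[1]}, goal[2], cert[1], depth)
    if tag == 'elim':
        _, a, c1, c2 = cert
        return check(gamma, ('->', a, goal), c1, depth) and check(gamma, a, c2, depth)
    return False

def search(gamma, goal, depth):
    """Naive depth-bounded search: hypotheses, intro, then modus ponens."""
    if goal in gamma:
        return True
    if depth == 0:
        return False
    if isinstance(goal, tuple) and goal[0] == '->':
        return search(gamma | {goal[1]}, goal[2], depth - 1)
    return any(h[0] == '->' and h[2] == goal and search(gamma, h[1], depth - 1)
               for h in gamma if isinstance(h, tuple))

# K combinator: ⊢ a -> (b -> a), with the inner subproof elided
k = ('->', 'a', ('->', 'b', 'a'))
print(check(frozenset(), k, ('intro', None)))  # True
```

The trade-off the abstract describes is visible here: a more detailed certificate (`('intro', ('intro', ('ax',)))`) needs no search at all, while a coarser one shifts work onto the checker's bounded search.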
SUBSEXPL: A Framework for Simulating and Comparing Explicit Substitutions Calculi (A Tutorial)
2005
Abstract

Cited by 1 (1 self)
In this paper we present a framework, called SUBSEXPL, for simulating and comparing explicit substitutions calculi. This framework was developed in OCaml, a language of the ML family, and it allows the manipulation of expressions of the λ-calculus and of several styles of explicit substitutions calculi. Applications of this framework include the visualisation of the contractions of the λ-calculus, and of guided one-step reductions and normalisation via each of the associated substitution calculi. Many useful facilities are available: reductions can easily be recorded and stored into files, and there is LaTeX output as well as worked examples dealing with, among other things, arithmetic operations and computational operators such as conditionals and repetitions in the λ-calculus. The current implementation of SUBSEXPL treats three different calculi of explicit substitutions: the λσ, the λse and the suspension calculus; other explicit substitutions calculi can easily be incorporated into the system. An implementation of η-reduction is provided for each of these explicit substitutions calculi. This system has been of great help for systematically comparing explicit substitutions calculi, as well as for understanding properties of explicit substitutions such as the preservation of strong normalisation. In addition, it has been used for teaching basic properties of the λ-calculus, such as computational adequacy, the importance of de Bruijn's notation, and the role of explicit substitutions in real implementations based on the λ-calculus.

Keywords: λ-Calculus, Explicit Substitutions, Visualisation of β- and η-Contraction and Normalisation.
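Among the operations such a tool makes visible is η-contraction on de Bruijn terms: λ.(M 0) contracts to M provided index 0 does not occur free in M, after which the remaining free indices are decremented. A minimal sketch, using an illustrative encoding rather than SUBSEXPL's own syntax:

```python
# One-step eta-contraction on de Bruijn terms: λ.(M 0) → M if 0 not free in M.
# Terms: ('var', n) | ('lam', body) | ('app', f, a). Illustrative encoding,
# not SUBSEXPL's OCaml representation.

def free_at(t, k):
    """Does de Bruijn index k occur free in t?"""
    tag = t[0]
    if tag == 'var':
        return t[1] == k
    if tag == 'lam':
        return free_at(t[1], k + 1)
    return free_at(t[1], k) or free_at(t[2], k)

def unshift(t, k=0):
    """Decrement indices above cutoff k (valid only when k is not free in t)."""
    tag = t[0]
    if tag == 'var':
        return ('var', t[1] - 1) if t[1] > k else t
    if tag == 'lam':
        return ('lam', unshift(t[1], k + 1))
    return ('app', unshift(t[1], k), unshift(t[2], k))

def eta(t):
    """Contract λ.(M 0) to M when the side condition holds; else return t."""
    if (t[0] == 'lam' and t[1][0] == 'app'
            and t[1][2] == ('var', 0) and not free_at(t[1][1], 0)):
        return unshift(t[1][1])
    return t

# λx. f x, with f a free variable: index 0 outside the binder, 1 under it
term = ('lam', ('app', ('var', 1), ('var', 0)))
print(eta(term))  # ('var', 0)
```

The side condition is the interesting part pedagogically: λx. x x has index 0 free in the function position, so `eta` correctly leaves it unchanged.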
A proposal for broad spectrum proof certificates (author manuscript, published in CPP 2011, First International Conference on Certified Proofs and Programs)
2013
Elements of Formal Semantics (comments welcome; see webpage for future updates and mailing list)
2013
Abstract
Elements of Formal Semantics will aim at introducing some of the foundational concepts, principles and techniques in formal semantics of natural language. It is planned as a basic but sophisticated introduction for readers who have some elementary background in set theory and linguistics. No mastery of logic, advanced math, or theoretical linguistics is presupposed. The book will bring central semantic concepts and tools to the forefront, without letting them be complicated by detailed empirical discussions, logical background or specifics of semantic theories. By motivating the introduction of each basic tool and concept by the analysis of concrete examples in English, the book attempts to attract the reader’s attention to the beauty of capturing intricate semantic phenomena using elegant and rigorously defined mathematical ...