Results 11–20 of 56
Lectures on proof theory
 in Proc. Summer School in Logic, Leeds '67
, 1968
Abstract

Cited by 12 (5 self)
This is a survey of some of the principal developments in proof theory from its inception in the 1920s, at the hands of David Hilbert, up to the 1960s. Hilbert's aim was to use this as a tool in his finitary consistency program to eliminate the "actual infinite" in mathematics from proofs of purely finitary statements. One of the main approaches that turned out to be the most useful in pursuit of this program was that due to Gerhard Gentzen, in the 1930s, via his calculi of "sequents" and his Cut-Elimination Theorem for them. Following that we trace how and why prima facie infinitary concepts, such as ordinals, and infinitary methods, such as the use of infinitely long proofs, gradually came to dominate proof-theoretical developments. In this first lecture I will give an overview of the developments in proof theory since Hilbert's initiative in establishing the subject in the 1920s. For this purpose I am following the first part of a series of expository lectures that I gave for the Logic Colloquium '94 held in Clermont-Ferrand, 21–23 July 1994, but haven't published. The theme of my lectures there was that although Hilbert established his theory of proofs as a part of his foundational program and, for philosophical reasons which we shall get into, aimed to have it developed in a completely finitistic way, the actual work in proof theory ... This is the first of three lectures that I delivered at the conference, Proof Theory: History ...
Types as graphs: Continuations in type logical grammar
, 2005
Abstract

Cited by 11 (8 self)
Using the programming-language concept of CONTINUATIONS, we propose a new, multimodal analysis of quantification in Type Logical Grammar. Our approach provides a geometric view of in-situ quantification in terms of graphs, and motivates the limited use of empty antecedents in derivations. Just as continuations are the tool of choice for reasoning about evaluation order and side effects in programming languages, our system provides a principled, type-logical way to model evaluation order and side effects in natural language. We illustrate with an improved account of quantificational binding, weak crossover, wh-questions, superiority, and polarity licensing.
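The core idea can be sketched outside the type-logical setting. In the toy Python model below (the two-person domain and the `saw` relation are invented for illustration, not taken from the paper), a scope-taking expression such as "everyone" denotes a function on its own continuation, and the order in which continuations are applied determines quantifier scope:

```python
# A minimal continuation-passing sketch of quantifier scope
# (illustrative only; the paper's system is a multimodal
# type-logical calculus, not Python).

PEOPLE = ["alice", "bob"]

def everyone(k):
    # k is the continuation: the rest of the sentence, awaiting an individual
    return all(k(x) for x in PEOPLE)

def someone(k):
    return any(k(x) for x in PEOPLE)

def saw(x, y):
    # toy model: alice saw bob, bob saw alice, nobody saw themselves
    return (x, y) in {("alice", "bob"), ("bob", "alice")}

# "Everyone saw someone": subject's continuation applied outermost
wide = everyone(lambda x: someone(lambda y: saw(x, y)))

# Inverse scope: "there is someone whom everyone saw"
narrow = someone(lambda y: everyone(lambda x: saw(x, y)))

print(wide, narrow)  # True False: the two evaluation orders differ
```

Flipping which continuation is applied first is exactly the evaluation-order distinction the abstract appeals to.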
Self-adjunctions and matrices
 Journal of Pure and Applied Algebra
Abstract

Cited by 10 (4 self)
It is shown that the multiplicative monoids of Temperley-Lieb algebras are isomorphic to monoids of endomorphisms in categories where an endofunctor is adjoint to itself. Such a self-adjunction is found in a category whose arrows are matrices, and the functor adjoint to itself is based on the Kronecker product of matrices. Thereby one obtains a representation of braid groups in matrices, which, though different and presumably new, is related to the standard representation of braid groups in Temperley-Lieb algebras. Mathematics Subject Classification (2000): 57M99, 20F36, 18A40
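The Kronecker product underlying this matrix self-adjunction is easy to compute directly. The pure-Python sketch below (a generic illustration, not the paper's construction) also checks the bifunctoriality identity (A⊗B)(C⊗D) = (AC)⊗(BD) on a small example, which is what lets ⊗ act functorially on a category of matrices:

```python
def matmul(a, b):
    """Plain matrix multiplication on lists of lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))]
            for i in range(len(a))]

def kron(a, b):
    """Kronecker product: entry at block (i, k), offset (j, l) is a[i][j] * b[k][l]."""
    return [[a[i][j] * b[k][l]
             for j in range(len(a[0]))
             for l in range(len(b[0]))]
            for i in range(len(a))
            for k in range(len(b))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
C = [[1, 0], [0, 1]]
D = [[2, 0], [0, 2]]

# Bifunctoriality: composing then taking the product equals
# taking the products then composing.
lhs = matmul(kron(A, B), kron(C, D))
rhs = kron(matmul(A, C), matmul(B, D))
print(lhs == rhs)  # True
```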
Functional pearl: I am not a number–I am a free variable
 In Proc. Haskell workshop
Abstract

Cited by 9 (2 self)
In this paper, we show how to manipulate syntax with binding using a mixed representation of names for free variables (with respect to the task in hand) and de Bruijn indices [5] for bound variables. By doing so, we retain the advantages of both representations: naming supports easy, arithmetic-free manipulation of terms; de Bruijn indices eliminate the need for α-conversion. Further, we have ensured that not only the user but also the implementation need never deal with de Bruijn indices, except within key basic operations. Moreover, we give a hierarchical representation for names which naturally reflects the structure of the operations we implement. Name choice is safe and straightforward. Our technology combines easily with an approach to syntax manipulation inspired by Huet's 'zippers' [10]. Without the ideas in this paper, we would have struggled to implement EPIGRAM [19]. Our example, constructing inductive elimination operators for datatype families, is but one of many where it proves invaluable.
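The two key basic operations can be sketched in a few lines. The Python below is an invented miniature (the paper works in Haskell, with hierarchical names): `abstract` converts a named free variable into a de Bruijn index at the moment a binder is built, and `instantiate` undoes this when a binder is opened, so indices never leak outside these operations:

```python
from dataclasses import dataclass

@dataclass
class Free:
    name: str

@dataclass
class Bound:
    index: int

@dataclass
class App:
    fun: object
    arg: object

@dataclass
class Lam:
    body: object

def abstract(name, term, depth=0):
    """Turn Free(name) into Bound(depth); called when building a Lam."""
    if isinstance(term, Free):
        return Bound(depth) if term.name == name else term
    if isinstance(term, Bound):
        return term
    if isinstance(term, App):
        return App(abstract(name, term.fun, depth),
                   abstract(name, term.arg, depth))
    return Lam(abstract(name, term.body, depth + 1))

def instantiate(image, term, depth=0):
    """Replace Bound(depth) by image; called when opening a Lam."""
    if isinstance(term, Bound):
        return image if term.index == depth else term
    if isinstance(term, Free):
        return term
    if isinstance(term, App):
        return App(instantiate(image, term.fun, depth),
                   instantiate(image, term.arg, depth))
    return Lam(instantiate(image, term.body, depth + 1))

# Build λx. x y from named syntax: the user never writes an index.
lam = Lam(abstract("x", App(Free("x"), Free("y"))))
# Open the binder with a fresh name: indices disappear again.
opened = instantiate(Free("z"), lam.body)
```

Since `abstract` and `instantiate` are inverse at the binder boundary, no α-conversion is ever needed and names are only compared, never shifted.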
Uniform Proofs and Natural Deductions
, 1994
Abstract

Cited by 8 (3 self)
Using some routine properties ([zu], [po]) of the Prawitz translation f from (sequent calculus) derivations to (natural) deductions, and restricting ourselves to the language of first-order hereditary Harrop formulae, we show (i) that f maps the simple uniform derivations of Miller et al. onto the set of deductions in expanded normal form; and (ii) that f identifies two such derivations iff they differ only by the order in which conjunctions and universal formulae on the left are broken up, thus factoring through a bijection between the set of uniform proofs with backchaining and the set of deductions in expanded normal form. Thus, the logic programmer's restriction to the use of uniform proofs with backchaining is complete not merely w.r.t. derivability but also (in a bijective fashion) w.r.t. the construction of expanded normal deductions. (Extended abstract, April 14, 1994. Caution: at present the results are not yet established to our satisfaction nor do we know the precise conte...
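The operational content of backchaining is easy to demonstrate for the propositional Horn fragment. In the toy Python prover below (the program and its atoms are invented; the paper treats full first-order hereditary Harrop formulae), a goal is proved uniformly by selecting a clause whose head matches it and backchaining on the clause body, and the returned tree is precisely the proof object:

```python
# Horn program: each head maps to a list of alternative clause bodies.
PROGRAM = {
    "p": [["q", "r"]],   # p :- q, r.
    "q": [[]],           # q.
    "r": [["q"]],        # r :- q.
}

def backchain(goal, depth=10):
    """Return a proof tree (goal, subproofs), or None if unprovable."""
    if depth == 0:
        return None  # crude loop guard for this sketch
    for body in PROGRAM.get(goal, []):
        subs = [backchain(g, depth - 1) for g in body]
        if all(s is not None for s in subs):
            return (goal, subs)
    return None

print(backchain("p"))  # ('p', [('q', []), ('r', [('q', [])])])
```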
Proof Search in Constructive Logics
 In Sets and proofs
, 1998
Abstract

Cited by 7 (2 self)
We present an overview of some sequent calculi organised not for "theorem-proving" but for proof search, where the proofs themselves (and the avoidance of known proofs on backtracking) are objects of interest. The main calculus discussed is that of Herbelin [1994] for intuitionistic logic, which extends methods used in hereditary Harrop logic programming; we give a brief discussion of some similar calculi for other logics. We also point to some related work on permutations in intuitionistic Gentzen sequent calculi that clarifies the relationship between such calculi and natural deduction.

1 Introduction

It is widely held that ordinary logic programming is based on classical logic, with a Tarski-style semantics (answering questions "What judgments are provable?") rather than a Heyting-style semantics (answering questions like "What are the proofs, if any, of each judgment?"). If one adopts the latter style (equivalently, the BHK interpretation: see [35] for details) by regardi...
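Under the Heyting-style reading, the interesting question is not just whether a goal is provable but how many essentially different proofs it has, which is why backtracking that avoids already-found proofs matters. The toy Python generator below (program invented for illustration) enumerates every backchaining proof of a propositional goal exactly once:

```python
from itertools import product

# Two clauses for "a", hence two essentially different proofs of it.
PROGRAM = {
    "a": [["b"], ["c"]],  # a :- b.  a :- c.
    "b": [[]],            # b.
    "c": [[]],            # c.
}

def proofs(goal, depth=5):
    """Yield every proof tree of goal, one per clause choice."""
    if depth == 0:
        return
    for body in PROGRAM.get(goal, []):
        # Each combination of subproofs gives a distinct proof.
        for subs in product(*(list(proofs(g, depth - 1)) for g in body)):
            yield (goal, list(subs))

print(len(list(proofs("a"))))  # 2 distinct proofs of "a"
```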
Relevant analytic tableaux
 Studia Logica
, 1979
Cited by 5 (0 self)
Lightweight Formal Verification in Classroom Instruction of Reasoning about Functional Code
, 2009
Abstract

Cited by 4 (4 self)
In college courses dealing with material that requires mathematical rigor, the adoption of a machine-readable representation for formal arguments can be advantageous. Students can focus on a specific collection of constructs that are represented consistently. Examples and counterexamples can be evaluated. Assignments can be assembled and checked with the help of an automated formal reasoning system. However, usability and accessibility do not have a high priority and are not addressed sufficiently well in the design of many existing machine-readable representations and corresponding formal reasoning systems. In earlier work [Lap09], we attempted to address this broad problem by proposing several specific design criteria organized around the notion of a natural context: the sphere of awareness a working human user maintains of the relevant constructs, arguments, experiences, and background materials necessary to accomplish the task at hand. We report on our attempt to evaluate our proposed design criteria by deploying within the classroom a lightweight formal verification system designed according to these criteria. The system was used within the instruction of a common application of formal reasoning: proving by induction formal propositions about functional code. We present all of the formal reasoning examples and assignments considered during this deployment, most of which are drawn directly from an introductory text on functional programming. We demonstrate how the design of the system improves the effectiveness and understandability of the examples, and how it aids in the instruction of basic formal reasoning techniques. We make brief remarks about the practical and administrative implications of the system's design from the perspectives of the student, the instructor, and the grader.
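A typical assignment of this kind, stated here generically rather than quoted from the deployment, is the structural-induction proof that list append is additive on lengths:

```latex
\[
\begin{aligned}
&\textbf{Claim: } \mathit{length}(xs \mathbin{+\!+} ys)
  = \mathit{length}(xs) + \mathit{length}(ys)
  \text{ for all lists } xs, ys.\\[2pt]
&\textbf{Base } (xs = [\,]):\quad
  \mathit{length}([\,] \mathbin{+\!+} ys)
  = \mathit{length}(ys)
  = 0 + \mathit{length}(ys)
  = \mathit{length}([\,]) + \mathit{length}(ys).\\[2pt]
&\textbf{Step } (xs = x{:}xs'):\quad
  \mathit{length}((x{:}xs') \mathbin{+\!+} ys)
  = 1 + \mathit{length}(xs' \mathbin{+\!+} ys)\\
&\qquad = 1 + \mathit{length}(xs') + \mathit{length}(ys)
  \ \text{(by the induction hypothesis)}
  = \mathit{length}(x{:}xs') + \mathit{length}(ys). \qquad\square
\end{aligned}
\]
```

A machine-readable rendering of exactly this kind of argument, with each equational step checkable, is what the deployed system is designed to support.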
Quantifier Elimination and Parametric Polymorphism in Programming Languages
 J. Functional Programming
, 1992
Abstract

Cited by 4 (2 self)
We present a simple and easy-to-understand explanation of ML type inference and parametric polymorphism within the framework of type monomorphism, as in the first-order typed lambda calculus. We prove the equivalence of this system with the standard interpretation using type polymorphism, and extend the equivalence to include polymorphic fixpoints. The monomorphic interpretation gives a purely combinatorial understanding of the type inference problem, and is a classic instance of quantifier elimination, as well as an example of Gentzen-style cut elimination in the framework of the Curry-Howard propositions-as-types analogy. Supported by NSF Grant CCR-9017125, and grants from Texas Instruments and from the Tyson Foundation.

1 Introduction

In his influential paper, "A theory of type polymorphism in programming," Robin Milner proposed an extension to the first-order typed lambda calculus which has become known as the core of the ML programming language [Mil78, HMT90]. The extension augment...
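The monomorphic reading can be sketched concretely: each use of a let-bound identifier receives a fresh monomorphic copy of its type (the quantifier-elimination step), after which ordinary first-order unification checks the program. The Python below is an invented miniature, not the paper's formal system:

```python
import itertools

fresh = itertools.count()

def var():
    """A fresh type variable."""
    return ("var", next(fresh))

def walk(t, s):
    """Follow substitution s until t is no longer a bound variable."""
    while isinstance(t, tuple) and t[0] == "var" and t in s:
        t = s[t]
    return t

def unify(a, b, s):
    """Robinson unification over base types and arrow types."""
    a, b = walk(a, s), walk(b, s)
    if a == b:
        return s
    if isinstance(a, tuple) and a[0] == "var":
        return {**s, a: b}
    if isinstance(b, tuple) and b[0] == "var":
        return {**s, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and a[0] == b[0] == "->":
        s = unify(a[1], b[1], s)
        return unify(a[2], b[2], s)
    raise TypeError("cannot unify")

def copy_type(t, mapping):
    """Fresh monomorphic instance of t: the quantifier-elimination step."""
    if isinstance(t, tuple) and t[0] == "var":
        return mapping.setdefault(t, var())
    if isinstance(t, tuple) and t[0] == "->":
        return ("->", copy_type(t[1], mapping), copy_type(t[2], mapping))
    return t

# id : a -> a, used at int and at bool: two independent copies succeed.
a = var()
id_type = ("->", a, a)
use1 = copy_type(id_type, {})
use2 = copy_type(id_type, {})
s = unify(use1, ("->", "int", "int"), {})
s = unify(use2, ("->", "bool", "bool"), s)

# A single shared copy, by contrast, cannot serve both uses.
shared_fails = False
try:
    shared = copy_type(id_type, {})
    s2 = unify(shared, ("->", "int", "int"), {})
    unify(shared, ("->", "bool", "bool"), s2)
except TypeError:
    shared_fails = True
```

Copying per use site is the combinatorial content of let-polymorphism in this reading: no type quantifier is ever manipulated at run time.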
A lambda calculus for real analysis
, 2005
Abstract

Cited by 4 (0 self)
Abstract Stone Duality is a revolutionary theory that works directly with computable continuous functions, without using set theory, infinitary lattice theory or a prior theory of discrete computation. Every expression in the calculus denotes both a continuous function and a program, but the reasoning looks remarkably like a sanitised form of that in classical topology. This paper is an introduction to ASD for the general mathematician, and applies it to elementary real analysis. It culminates in the Intermediate Value Theorem, i.e. the solution of equations fx = 0 for continuous f: R → R. As is well known from both numerical and constructive considerations, the equation cannot be solved if f "hovers" near 0, whilst tangential solutions will never be found. In ASD, both of these failures and the general method of finding solutions of the equation when they exist are explained by the new concept of "overtness". The zeroes are captured, not as a set, but by higher-type operators □ and ♦ that remain (Scott) continuous across singularities of a parametric equation. Expressing topology in terms of continuous functions rather than sets of points leads to ...
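The numerical side of this story is classical. The bisection sketch below (ordinary floating-point numerics, not ASD) finds the zero guaranteed by the IVT when f changes sign, and shows why a function that "hovers" near 0 without crossing it, or touches it only tangentially, gives the method nothing to work with:

```python
def bisect(f, a, b, eps=1e-9):
    """Locate a zero of continuous f on [a, b], given f(a) < 0 < f(b)."""
    assert f(a) < 0 < f(b), "need a sign change on [a, b]"
    while b - a > eps:
        m = (a + b) / 2
        if f(m) < 0:
            a = m  # zero lies in the right half
        else:
            b = m  # zero lies in the left half
    return (a + b) / 2

# f(x) = x^3 - 2 changes sign on [0, 2], so its zero (the cube root
# of 2) is found to within eps.
root = bisect(lambda x: x**3 - 2, 0.0, 2.0)

# A tangential zero such as f(x) = x**2 at x = 0 produces no sign
# change, so the precondition fails and bisection locates nothing:
# this is the classical shadow of the "hovering" failure the
# abstract describes.
```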