Results 1–9 of 9
The Theory of LEGO: A Proof Checker for the Extended Calculus of Constructions
, 1994
Abstract

Cited by 69 (10 self)
LEGO is a computer program for interactive typechecking in the Extended Calculus of Constructions and two of its subsystems. LEGO also supports the extension of these three systems with inductive types. These type systems can be viewed as logics, and as meta languages for expressing logics, and LEGO is intended to be used for interactively constructing proofs in mathematical theories presented in these logics. I have developed LEGO over six years, starting from an implementation of the Calculus of Constructions by Gérard Huet. LEGO has been used for problems at the limits of our abilities to do formal mathematics. In this thesis I explain some aspects of the metatheory of LEGO's type systems, leading to a machine-checked proof that typechecking is decidable for all three type theories supported by LEGO, and to a verified algorithm for deciding their typing judgements, assuming only that they are normalizing. In order to do this, the theory of Pure Type Systems (PTS) is extended and f...
On the unity of duality
 Special issue on “Classical Logic and Computation
, 2008
Abstract

Cited by 14 (2 self)
Most type systems are agnostic regarding the evaluation strategy for the underlying languages, a notable exception being the value restriction for ML, which is absent in Haskell. As type systems become more precise, however, detailed properties of the operational semantics may become visible, because properties captured by the types may be sound under one strategy but not the other. For example, intersection types distinguish between call-by-name and call-by-value functions, because the subtyping law (A → B) ∩ (A → C) ≤ A → (B ∩ C) is unsound for the latter in the presence of effects. In this paper we develop a proof-theoretic framework for analyzing the interaction of types with evaluation order, based on the notion of polarity. Polarity was discovered through linear logic, but we propose a fresh origin in Dummett’s program of justifying the logical laws through alternative verificationist or pragmatist “meaning-theories”, which include a bias towards either introduction or elimination rules. We revisit Dummett’s analysis using the tools of Martin-Löf’s judgmental method, and then show how to extend it to a unified polarized logic, with Girard’s “shift” connectives acting as intermediaries. This logic safely combines intuitionistic and dual intuitionistic reasoning principles, while simultaneously admitting a focusing interpretation for the classical sequent calculus. Then, by applying the Curry-Howard isomorphism to polarized logic, we obtain a single programming language in which evaluation order is reflected at the level of types. Different logical notions correspond directly to natural programming constructs, such as pattern-matching, explicit substitutions, values and call-by-value continuations. We give examples demonstrating the expressiveness of the language and type system, and prove a basic but modular type safety result.
We conclude with a brief discussion of extensions to the language with additional effects and types, and sketch the sort of explanation this can provide for operationally-sensitive typing phenomena.
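The failure of the distributivity law under call-by-value can be made concrete. The following Python sketch is purely illustrative (the names `flaky` and `calls` are invented here); a stateful function plays the role of an effect:

```python
# A stateful "effectful" function: successive calls return values of
# different types, so no fixed return type fits every call.
calls = {"n": 0}

def flaky(x):
    calls["n"] += 1
    # odd-numbered calls return an int, even-numbered calls a str
    return x + 1 if calls["n"] % 2 == 1 else str(x)

# Checking flaky against each conjunct of (int -> int) ∩ (int -> str)
# separately is fine: some calls return ints, others strs.
r1 = flaky(3)   # first call: an int
r2 = flaky(3)   # second call: a str

# But the distributed type int -> (int ∩ str) would promise that a
# *single* call returns a value that is both an int and a str, which is
# impossible here.  Under call-by-value each call is evaluated exactly
# once, so (A -> B) ∩ (A -> C) ≤ A -> (B ∩ C) is unsound with effects.
```

Under a call-by-name reading, by contrast, the unevaluated expression can be re-run once per conjunct, which is why the law survives for call-by-name functions.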
An Interpretation of Kleene's Slash in Type Theory
 Informal Proceedings of the Second Workshop on Logical Frameworks, pages 337–342. Esprit Basic Research Action
, 1993
Abstract

Cited by 2 (1 self)
Kleene introduced the notion of slash to investigate the disjunction and existence properties under implication for intuitionistic arithmetic. In this paper Kleene's slash is translated to type theory. Besides translations of Kleene's results, the main application of the slash in type theory is that conditions are given for a typable term, containing free variables, to have a normal form beginning with a constructor.

1 Introduction

The disjunction and existence properties, that is, ⊢ A ∨ B implies ⊢ A or ⊢ B, and ⊢ ∃x A(x) implies ⊢ A(t) for some term t, respectively, were first proved for intuitionistic arithmetic by Kleene [9] using a modification of recursive realizability. Harrop [8] extended Kleene's result by also considering derivations depending on assumptions. Harrop proved

C ⊢ A ∨ B implies C ⊢ A or C ⊢ B   (ED)
C ⊢ ∃x A(x) implies C ⊢ A(t) for some term t   (EE)

where C is a closed formula not containing any strictly positive occurrences of ∨ and ∃; such a formula is called a Harrop formula.
Revamping the Restriction Strategy
, 2007
Abstract

Cited by 2 (1 self)
This study continues the anti-realist’s quest for a principled way to avoid Fitch’s paradox. It is proposed that the Cartesian restriction on the anti-realist’s knowability principle ‘ϕ, therefore ✸Kϕ’ should be formulated as a consistency requirement not on the premise ϕ of an application of the rule, but rather on the set of assumptions on which the relevant occurrence of ϕ depends. It is stressed, by reference to illustrative proofs, how important it is to have proofs in normal form before applying the proposed restriction. A similar restriction is proposed for the converse inference, the so-called Rule of Factiveness ‘✸Kϕ, therefore ϕ’. The proposed restriction appears to block another Fitch-style derivation that uses the KK-thesis in order to get around the Cartesian restriction on applications of the knowability principle. ∗ To appear in Joseph Salerno, ed., All Truths are Known: New Essays on the Knowability Paradox, Oxford University Press. This paper would not have been written without the stimulation, encouragement and criticism that I have enjoyed from Joseph Salerno, Salvatore Florio, Christina Moisa, Nicholaos Jones, and Patrick Reeder.
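For readers unfamiliar with the paradox, the derivation that such restrictions are designed to block runs roughly as follows. This is the standard textbook sketch, not the paper's own formulation; K reads 'it is known that', ✸ is possibility:

```latex
\begin{align*}
&1.\; p \wedge \neg Kp                      &&\text{assume some truth is unknown}\\
&2.\; \Diamond K(p \wedge \neg Kp)          &&\text{knowability principle applied to 1}\\
&3.\; K(p \wedge \neg Kp) \to Kp \wedge K\neg Kp &&\text{knowledge distributes over } \wedge\\
&4.\; K\neg Kp \to \neg Kp                  &&\text{factivity of } K\\
&5.\; \neg K(p \wedge \neg Kp)              &&\text{3, 4 yield } Kp \wedge \neg Kp\text{, a contradiction}\\
&6.\; \Box \neg K(p \wedge \neg Kp)         &&\text{necessitation on the theorem 5}\\
&7.\; \neg \Diamond K(p \wedge \neg Kp)     &&\text{modal duality; contradicts 2}
\end{align*}
```

Since 2 and 7 clash, the assumption at 1 must fail for every p: if all truths are knowable, all truths are known. The Cartesian restriction is meant to bar the application of the knowability principle at step 2.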
unknown title
Abstract
It is not unreasonable to think that the dispute between classical and intuitionistic mathematics might be unresolvable or 'faultless', in the sense of there being no objective way to settle it. If so, we would have a pretty case of relativism. In this note I argue, however, that there is in fact not even disagreement in any interesting sense, let alone a faultless one, in spite of appearances and claims to the contrary. A position I call classical pluralism is sketched, intended to provide a coherent methodological stance towards the issue. Some reasons to recommend this stance are given, as well as some speculations as to why not everyone might want to follow the recommendation.
Inferentialism, Logicism, Harmony, and a Counterpoint
, 2007
Abstract
Inferentialism is explained as an attempt to provide an account of meaning that is more sensitive (than the tradition of truth-conditional theorizing deriving from Tarski and Davidson) to what is learned when one masters meanings. The logically reformist inferentialism of Dummett and Prawitz is contrasted with the more recent quietist inferentialism of Brandom. Various other issues are highlighted for inferentialism in general, by reference to which different kinds of inferentialism can be characterized. Inferentialism for the logical operators is explained, with special reference to the Principle of Harmony. The statement of that principle in the author’s book Natural Logic is fine-tuned here in the way obviously required in order to bar an interesting would-be counterexample furnished by Crispin Wright, and to stave off any more of the same.
unknown title
, 2007
Abstract
This study is in two parts. In the first part, various important principles of classical extensional mereology are derived on the basis of a nice axiomatization involving ‘part of’ and fusion. All results are proved here with full Fregean (and Gentzenian) rigor. They are chosen because they are needed for the second part. In the second part, this natural-deduction framework is used in order to regiment David Lewis’s justification of his Division Thesis, which features prominently in his combination of mereology with class theory. The Division Thesis plays a crucial role in Lewis’s informal argument for his Second Thesis in his book Parts of Classes. In order to present Lewis’s argument in rigorous detail, an elegant new principle is offered for the theory that combines class theory and mereology. The new principle is called the Canonical Decomposition Thesis. It secures Lewis’s Division Thesis on the strong construal required in order for his
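For orientation, the core principles of classical extensional mereology, in standard textbook formulations rather than the paper's own axiomatization, look like this (writing ≤ for 'part of' and defining overlap as O(x, y) := ∃z (z ≤ x ∧ z ≤ y)):

```latex
\begin{align*}
&\forall x\, (x \leq x) &&\text{reflexivity of parthood}\\
&\forall x \forall y\, (x \leq y \wedge y \leq x \to x = y) &&\text{antisymmetry}\\
&\forall x \forall y \forall z\, (x \leq y \wedge y \leq z \to x \leq z) &&\text{transitivity}\\
&\forall x \forall y\, (x \not\leq y \to \exists z\, (z \leq x \wedge \neg O(z, y))) &&\text{strong supplementation}\\
&\exists x\, \varphi(x) \to \exists y\, \forall z\, \bigl(O(z, y) \leftrightarrow \exists x\, (\varphi(x) \wedge O(z, x))\bigr) &&\text{unrestricted fusion (scheme)}
\end{align*}
```

The fusion scheme is what lets Lewis treat arbitrary pluralities of parts as single sums, the move his Division Thesis then regiments.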
Thesis Proposal: The logical basis of evaluation order
, 2007
Abstract
Most type systems are agnostic regarding the evaluation strategy for the underlying languages, a notable exception being the value restriction for ML, which is absent in Haskell. As type systems become more precise, however, detailed properties of the underlying operational semantics may become visible, because properties captured by the types may be sound under one strategy but not the other. To give an example, intersection types distinguish between call-by-name and call-by-value functions because the subtyping rule (A → B) ∩ (A → C) ≤ A → (B ∩ C) is valid for the former but not the latter in the presence of effects. I propose to develop a unified, proof-theoretic approach to analyzing the interaction of types with evaluation order, based on the notion of polarity. Polarity was discovered and developed through linear logic, but I seek a fresh origin in Dummett’s program of justifying the logical laws through alternative “meaning-theories”, essentially hypotheses as to whether the verification or use of a proposition has a canonical form. In my preliminary work, I showed how a careful judgmental analysis of Dummett’s ideas may be used to define a system of proofs and refutations, with a Curry-Howard interpretation as a single programming language in which the duality between call-by-value and call-by-name is realized as one of types. After extending its type system with (both positive and negative) union and intersection operators and a derived subtyping relationship, I found that many operationally-sensitive typing phenomena (e.g., alternative CBV/CBN subtyping distributivity principles, value and “co-value” restrictions) could be logically reconstructed. Here I give the technical details of this work, and present a plan for addressing open questions and extensions.
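The evaluation-order sensitivity at issue can be made tangible by simulating the two strategies directly. A minimal Python sketch (illustrative only, not from the proposal) models call-by-name with thunks:

```python
# Call-by-value vs call-by-name, simulated with explicit thunks.
# The same source expression has different observable effects under the
# two strategies, which is why a sufficiently precise type system can
# (and must) tell them apart.
log = []

def effectful():
    log.append("evaluated")
    return 42

def ignore_cbv(x):
    # call-by-value: the argument was already evaluated at the call site
    return 0

def ignore_cbn(thunk):
    # call-by-name: the argument is evaluated only if it is used;
    # here it never is, so its effect never fires
    return 0

ignore_cbv(effectful())          # effect happens
ignore_cbn(lambda: effectful())  # effect never happens
```

After both calls, `log` records exactly one evaluation: the call-by-value side forced the effect, the call-by-name side discarded it unevaluated.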