Results 1–10 of 11
Rules of definitional reflection
 In Symposium on Logic and Computer Science, 1993
Abstract

Cited by 57 (8 self)
This paper discusses two rules of definitional reflection: the “logical” version of definitional reflection as used in the extended logic programming language GCLA, and the “ω” version of definitional reflection as proposed by Eriksson and Girard. The logical version is a left-introduction rule completely analogous to the left-introduction rules for logical operators in Gentzen-style sequent systems, whereas the ω-version extends the logical version by a principle related to the ω-rule in arithmetic. Correspondingly, the interpretation of free variables differs between the two approaches, resulting in different principles of closure of inference rules under substitution. This difference is crucial for the computational interpretation of definitional reflection.
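The two rules can be made concrete in sequent notation. The following is a sketch in the standard presentation of definitional reflection (clause notation is generic, not taken from this paper): given a definition with clauses a ⇐ B₁, …, a ⇐ Bₙ, the right rule closes a under its defining conditions, while the left rule reflects on the totality of those conditions.

```latex
% Definition D:  a \Leftarrow B_1, \;\ldots,\; a \Leftarrow B_n
%
% Definitional closure (right rule): any defining condition proves a
\frac{\Gamma \vdash B_i}{\Gamma \vdash a}\ (\vdash a)
%
% Definitional reflection (left rule): C follows from a only if it
% follows from every defining condition of a
\frac{\Gamma, B_1 \vdash C \quad \cdots \quad \Gamma, B_n \vdash C}
     {\Gamma, a \vdash C}\ (a \vdash)
```

The ω-version discussed in the abstract strengthens the left rule by additionally closing it under substitutions for the free variables of the clauses, which is where the two computational interpretations diverge.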
A Taxonomy of C-systems
, 2002
Abstract

Cited by 41 (15 self)
The logics of formal inconsistency (LFIs) are paraconsistent logics which permit us to internalize the concepts of consistency or inconsistency inside our object language, introducing new operators to talk about them, and allowing us, in principle, to logically separate the notions of contradictoriness and of inconsistency. We present the formal definitions of these logics in the context of General Abstract Logics, argue that they in fact represent the majority of all paraconsistent logics existing up to this point, if not the most exceptional ones, and we single out a subclass of them called C-systems, as the LFIs that are built over the positive basis of some given consistent logic. Given precise characterizations of some received logical principles, we point out that the gist of paraconsistent logic lies in the Principle of Explosion, rather than in the Principle of Non-Contradiction, and we also sharply distinguish these two from the Principle of Non-Triviality, considering next various weaker formulations of explosion and investigating their interrelations. Subsequently, we present the syntactical formulations of some of the main C-systems based on classical logic, showing how several well-known logics in the literature can be recast as C-systems of this kind, and carefully study their properties and shortcomings, showing for instance how they can be used to faithfully ...
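For reference, the three principles the abstract distinguishes can be stated as follows (one common formulation; the paper's own definitions are more general):

```latex
% Principle of Explosion (rejected by paraconsistent logics):
% from a contradiction, everything follows
\forall A\,\forall B:\quad A,\, \neg A \;\vdash\; B
%
% Principle of Non-Contradiction (one common reading):
% no formula is a theorem together with its negation
\neg\exists A:\quad \vdash A \ \text{ and } \ \vdash \neg A
%
% Principle of Non-Triviality: not everything is derivable
\exists \Gamma\,\exists B:\quad \Gamma \nvdash B
```

A paraconsistent logic gives up Explosion while retaining Non-Triviality; as the abstract notes, these are independent of Non-Contradiction.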
Systems of Illative Combinatory Logic complete for first-order propositional and predicate calculus
, 1993
Abstract

Cited by 4 (1 self)
Illative combinatory logic consists of the theory of combinators or lambda calculus extended by extra constants (and corresponding axioms and rules) intended to capture inference. The paper considers systems of illative combinatory logic that are sound for first-order propositional and predicate calculus. The interpretation from ordinary logic into the illative systems can be done in two ways: following the propositions-as-types paradigm, in which derivations become combinators, or in a more direct way, in which derivations are not translated. Both translations are closely related in a canonical way. The two direct translations turn out to be complete. The paper fulfills the program of Church [1932,33] and Curry [1930] to base logic on a consistent system of terms or combinators. Hitherto this program had failed because systems of ICL were either too weak (to provide a sound interpretation) or too strong (sometimes even inconsistent).
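The propositions-as-types reading mentioned here can be illustrated with standard facts about the basic combinators (these examples are generic, not taken from the paper's illative systems): the types of K and S are exactly the two axiom schemas of positive implicational logic, and application corresponds to modus ponens, so a derivation assembles into a combinator.

```latex
% Types of the basic combinators = axioms of implicational logic
K : A \to (B \to A)
S : (A \to (B \to C)) \to ((A \to B) \to (A \to C))
%
% Application corresponds to modus ponens:
\frac{x : A \to B \qquad y : A}{x\,y : B}
```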
On the notion of assumption in logical systems
 In R..., 2004
Abstract

Cited by 2 (2 self)
When a logical system is specified and the notion of a derivation or formal proof is explained, we are told (i) which formulas can be used to start a derivation and (ii) which formulas can be derived given that certain other formulas have already been derived. Formulas of sort (i) are either assumptions or axioms; formulas of sort (ii) are conclusions of (proper) inference rules. Axioms may be viewed as conclusions of (improper) inference rules, viz. inference rules without premisses. In what follows I refer to conclusions of proper or improper inference rules as assertions. In natural deduction systems, inference rules deal both with assumptions and assertions, as the assumptions on which the conclusion of an inference rule depends are not necessarily given by the collection of all assumptions on which the premisses depend, in case the rule permits the discharging of assumptions. For example, the rule of implication introduction ...
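The discharge phenomenon described above is visible in the standard natural-deduction rule of implication introduction: the conclusion A → B no longer depends on the assumption A, which the rule discharges (bracket notation marks the discharged assumption).

```latex
\frac{\begin{array}{c}[A]\\ \vdots\\ B\end{array}}{A \to B}\ (\to\!\mathrm{I})
```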
On the Role of Implication in Formal Logic
, 1998
Abstract
Evidence is given that implication (and its special case, negation) carries the logical strength of a system of formal logic. This is done by proving normalization and cut elimination for a system based on combinatory logic or λ-calculus with logical constants for and, or, all, and exists, but with none for either implication or negation. The proof is strictly finitary, showing that this system is very weak. The results can be extended to a "classical" version of the system. They can also be extended to a system with a restricted set of rules for implication: the result is a system of intuitionistic higher-order BCK logic with unrestricted comprehension and without restriction on the rules for disjunction elimination and existential elimination. The result does not extend to the classical version of the BCK logic. 1991 AMS (MOS) Classification: 03B40, 03F05, 03B20. Key words: implication, negation, combinatory logic, lambda calculus, comprehension principle, normalization, cut-elimination.
Lambda-Calculus and Functional Programming
Abstract
This paper deals with the problem of a program that is essentially the same over any of several types but which, in the older imperative languages, must be rewritten for each separate type. For example, a sort routine may be written with essentially the same code except for the types, for integers, booleans, and strings. It is clearly desirable to have a method of writing a piece of code that can accept the specific type as an argument. Milner developed his ideas in terms of type assignment to lambda-terms. It is based on a result due originally to Curry (Curry 1969) and Hindley (Hindley 1969) known as the principal type-scheme theorem, which says that (assuming that the typing assumptions are sufficiently well-behaved) every term has a principal type-scheme, which is a type-scheme such that every other type-scheme which can be proved for the given term is obtained by a substitution of types for type variables. This use of type schemes allows a kind of generality over all types, which is known as polymorphism.
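The motivating problem of the abstract, one sort routine usable at every element type, can be sketched in any language with parametric polymorphism. A minimal illustration in Python (this example is mine, not the paper's; the type variable `T` plays the role of the type argument, and the comparison is passed in explicitly so no property of `T` is assumed):

```python
from typing import Callable, List, TypeVar

T = TypeVar("T")  # the element type is, in effect, an argument

def insertion_sort(xs: List[T], lt: Callable[[T, T], bool]) -> List[T]:
    """One routine for every element type: nothing here depends on
    what T is, only on the supplied comparison `lt`."""
    result: List[T] = []
    for x in xs:
        i = 0
        while i < len(result) and lt(result[i], x):
            i += 1
        result.insert(i, x)  # keep `result` sorted as we go
    return result

# The same code sorts integers, booleans, and strings:
print(insertion_sort([3, 1, 2], lambda a, b: a < b))      # [1, 2, 3]
print(insertion_sort([True, False], lambda a, b: a < b))  # [False, True]
print(insertion_sort(["b", "a"], lambda a, b: a < b))     # ['a', 'b']
```

In Hindley–Milner terms, the inferred principal type-scheme of such a routine would be ∀α. List α → (α → α → Bool) → List α; every concrete use above is an instance obtained by substituting a type for α.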
Dialetheic Truth Theory: Inconsistency, Non-Triviality, Soundness, Incompleteness
Abstract
Abstract. The best-known application of dialetheism is to semantic paradoxes such as the Liar. In particular, Graham Priest has advocated the adoption of an axiomatic truth theory in which contradictions arising from the Liar paradox can be accepted as theorems, while the Liar sentence itself is evaluated as being both true and false. Such eccentricities might be tolerated, in exchange for great rewards. But in this note I show that it is not possible in Priest’s truth theory to express certain semantic facts about that very theory, and thus that it enjoys no definite advantage over more orthodox approaches to semantic paradox.
Handbook of the History of Logic. Volume 6
Abstract
ABSTRACT: Here is a crude list, possibly summarizing the role of paradoxes within the framework of mathematical logic: 1. directly motivating important theories (e.g. type theory, axiomatic set theory, combinatory logic); 2. suggesting methods of proving fundamental metamathematical results (fixed point theorems, incompleteness, undecidability, undefinability); 3. applying inductive definability and generalized recursion; 4. introducing new semantical methods (e.g. revision theory, semi-inductive definitions, which require non-trivial set-theoretic results); 5. (partly) enhancing new axioms in set theory: the case of anti-foundation AFA and the mathematics of circular phenomena; 6. suggesting the investigation of non-classical logical systems, from contraction-free and many-valued logics to systems with generalized quantifiers; 7. suggesting frameworks with flexible typing for the foundations of Mathematics and Computer Science; 8. applying forms of self-referential truth in Artificial Intelligence, Theoretical Linguistics, etc. Below we attempt to shed some light on the genesis of issues 1–8 through the history of the paradoxes in the twentieth century, with a special emphasis on semantical aspects.
Russell’s 1903–1905 Anticipation of the Lambda Calculus
 History and Philosophy of Logic, 24 (2003), 15–37
, 2002
Abstract
It is well known that the circumflex notation used by Russell and Whitehead to form complex function names in Principia Mathematica played a role in inspiring Alonzo Church’s ‘Lambda Calculus’ for functional logic developed in the 1920s and 1930s. Interestingly, earlier unpublished manuscripts written by Russell between 1903 and 1905—surely unknown to Church—contain a more extensive anticipation of the essential details of the Lambda Calculus. Russell also anticipated Schönfinkel’s Combinatory Logic approach of treating multi-argument functions as functions having other functions as value. Russell’s work in this regard seems to have been largely inspired by Frege’s theory of functions and ‘value-ranges’. This system was discarded by Russell due to his abandonment of propositional functions as genuine entities, as part of a new tack for solving Russell’s paradox. In this article, I explore the genesis and demise of Russell’s early anticipation of the Lambda Calculus.