Results 1–7 of 7
Predicative Recursion and Computational Complexity, 1992
"... The purpose of this thesis is to give a "foundational" characterization of some common complexity classes. Such a characterization is distinguished by the fact that no explicit resource bounds are used. For example, we characterize the polynomial time computable functions without making any direct r ..."
Abstract

Cited by 45 (3 self)
The purpose of this thesis is to give a "foundational" characterization of some common complexity classes. Such a characterization is distinguished by the fact that no explicit resource bounds are used. For example, we characterize the polynomial time computable functions without making any direct reference to polynomials, time, or even computation. Complexity classes characterized in this way include polynomial time, the functional polytime hierarchy, the logspace decidable problems, and NC. After developing these "resource free" definitions, we apply them to redeveloping the feasible logical system of Cook and Urquhart, and show how this first-order system relates to the second-order system of Leivant. The connection is an interesting one since the systems were defined independently and have what appear to be very different rules for the principle of induction. Furthermore it is interesting to see, albeit in a very specific context, how to retract a second order statement, ("inducti...
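The "resource-free" style of definition alluded to here is in the spirit of safe (predicative) recursion, where arguments are split into normal and safe positions and recursion may only descend on a normal argument, with recursive results used only in safe positions. The following sketch is an illustration of that discipline, not the thesis's exact formalism:

```python
# Illustration of "safe recursion on notation" (an assumption-laden sketch,
# not the thesis's own definitions). Conceptually concat(x; y) has x in a
# normal position and y in a safe position: recursion descends on x only,
# and the recursive result r is used only inside the "safe" successors
# s0/s1, which is what keeps the growth rate polynomial.

def s0(y):
    return 2 * y        # append binary digit 0

def s1(y):
    return 2 * y + 1    # append binary digit 1

def concat(x, y):
    """Append the binary digits of x after those of y."""
    if x == 0:
        return y
    half, bit = divmod(x, 2)       # strip the low binary digit of x
    r = concat(half, y)            # recursion on the smaller normal argument
    return s1(r) if bit else s0(r) # result used only in a safe position

# concat(0b101, 0b11) yields 0b11101: y's digits followed by x's digits
```

Nesting such definitions can only grow outputs polynomially in the input lengths, which is the intuition behind characterizing polynomial time without mentioning polynomials.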
Effective model theory: the number of models and their complexity
MODELS AND COMPUTABILITY, 1999
"... Effective model theory studies model theoretic notions with an eye towards issues of computability and effectiveness. We consider two possible starting points. If the basic objects are taken to be theories, then the appropriate effective version investigates decidable theories (the set of theorems i ..."
Abstract

Cited by 18 (6 self)
Effective model theory studies model theoretic notions with an eye towards issues of computability and effectiveness. We consider two possible starting points. If the basic objects are taken to be theories, then the appropriate effective version investigates decidable theories (the set of theorems is computable) and decidable structures (ones with decidable theories). If the objects of initial interest are typical mathematical structures, then the starting point is computable structures. We present an introduction to both of these aspects of effective model theory organized roughly around the themes of the number and types of models of theories with particular attention to categoricity (as either a hypothesis or a conclusion) and the analysis of various computability issues in families of models.
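As a degenerate but concrete instance of a "computable structure": every finite structure is computable, and truth of a first-order sentence in it is decidable by brute-force search over the quantifiers. The checker below is a hypothetical mini-illustration (the formula encoding is invented for this sketch):

```python
# Brute-force first-order model checking over a finite structure — a trivial
# case of decidability. Formulas are nested tuples (hypothetical encoding):
#   ('rel', name, vars), ('not', f), ('and', f, g),
#   ('exists', var, f), ('forall', var, f)

def holds(structure, formula, env=None):
    dom, rels = structure          # domain (a set) and named relations
    env = env or {}                # variable assignment
    op = formula[0]
    if op == 'rel':
        _, name, args = formula
        return tuple(env[v] for v in args) in rels[name]
    if op == 'not':
        return not holds(structure, formula[1], env)
    if op == 'and':
        return holds(structure, formula[1], env) and holds(structure, formula[2], env)
    if op == 'exists':
        _, var, body = formula
        return any(holds(structure, body, {**env, var: a}) for a in dom)
    if op == 'forall':
        _, var, body = formula
        return all(holds(structure, body, {**env, var: a}) for a in dom)
    raise ValueError(op)

# The finite linear order ({0,1,2}, <):
order = ({0, 1, 2},
         {'lt': {(a, b) for a in range(3) for b in range(3) if a < b}})
# "Every element has a strict successor" — false in a finite order.
sentence = ('forall', 'x', ('exists', 'y', ('rel', 'lt', ('x', 'y'))))
```

The interesting questions of effective model theory begin exactly where this brute force stops: infinite computable structures, where decidability of the theory is a substantive hypothesis rather than a triviality.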
Kleene’s Amazing Second Recursion Theorem (Extended Abstract)
"... This little gem is stated unbilled and proved (completely) in the last two lines of §2 of the short note Kleene (1938). In modern notation, with all the hypotheses stated explicitly and in a strong form, it reads as follows: Theorem 1 (SRT). Fix a set V ⊆ N, and suppose that for each natural number ..."
Abstract

Cited by 1 (1 self)
This little gem is stated unbilled and proved (completely) in the last two lines of §2 of the short note Kleene (1938). In modern notation, with all the hypotheses stated explicitly and in a strong form, it reads as follows: Theorem 1 (SRT). Fix a set V ⊆ N, and suppose that for each natural number n ∈ N = {0, 1, 2, ...}, ϕ^n : N^(n+1) ⇀ V is a recursive partial function of (n + 1) arguments with values in V, so that the standard assumptions (1) and (2) hold with {e}(x⃗) = ϕ^n_e(x⃗) = ϕ^n(e, x⃗) (x⃗ = (x_1, ..., x_n) ∈ N^n). (1) Every n-ary recursive partial function with values in V is ϕ^n_e for some e. (2) For all m, n, there is a recursive (total) function S = S^m_n : N^(m+1) → N such that {S(e, y⃗)}(x⃗) = {e}(y⃗, x⃗) (e ∈ N, y⃗ ∈ N^m, x⃗ ∈ N^n). Then, for every recursive partial function f(e, y⃗, x⃗) of (1 + m + n) arguments with values in V, there is a total recursive function z̃(y⃗) of m arguments such that
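A standard concrete corollary of the SRT (via the S-m-n functions it assumes) is that a program can be given access to its own text; the finite, runnable illustration of this fixed-point phenomenon is a quine. A minimal Python sketch, offered as an illustration rather than anything from the abstract itself:

```python
# A self-reproducing program: the string `quine` is the source of the
# two-line program that defines it. Executing that source in a fresh
# namespace rebuilds exactly the same string — a syntactic fixed point,
# in the spirit of Kleene's Second Recursion Theorem.

src = 'src = %r\nquine = src %% src'
quine = src % src   # substitute the template into itself

# Sanity check: running the program text reproduces the program text.
ns = {}
exec(quine, ns)
assert ns['quine'] == quine
```

The theorem itself is far stronger: it delivers such fixed points uniformly for every recursive transformation of programs, not just for self-printing.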
INTERPRETABILITY IN ROBINSON’S Q
"... Abstract. Edward Nelson published in 1986 a book defending an extreme formalist view of mathematics according to which there is an impassable barrier in the totality of exponentiation. On the positive side, Nelson embarks on a program of investigating how much mathematics can be interpreted in Rapha ..."
Abstract
Abstract. Edward Nelson published in 1986 a book defending an extreme formalist view of mathematics according to which there is an impassable barrier in the totality of exponentiation. On the positive side, Nelson embarks on a program of investigating how much mathematics can be interpreted in Raphael Robinson’s theory of arithmetic Q. In the shadow of this program, some very nice logical investigations and results were produced by a number of people, not only regarding what can be interpreted in Q but also what cannot be so interpreted. We explain some of these results and rely on them to discuss Nelson’s position. §1. Introduction. Let L be the first-order language with equality whose nonlogical symbols are the constant 0, the unary function symbol S (for successor) and two binary function symbols + (for addition) and · (for multiplication). The following theory was introduced in [35] (see also the systematic [42]): Definition 1. Raphael Robinson’s theory Q is the theory in the language L
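The abstract is truncated before the definition is completed. For reference, the standardly given axioms of Robinson's Q over this language L are as follows (this is the usual axiomatization, supplied here for context rather than taken from the truncated text):

```latex
\begin{align*}
\text{Q1:}\quad & Sx \neq 0\\
\text{Q2:}\quad & Sx = Sy \rightarrow x = y\\
\text{Q3:}\quad & x \neq 0 \rightarrow \exists y\, (x = Sy)\\
\text{Q4:}\quad & x + 0 = x\\
\text{Q5:}\quad & x + Sy = S(x + y)\\
\text{Q6:}\quad & x \cdot 0 = 0\\
\text{Q7:}\quad & x \cdot Sy = x \cdot y + x
\end{align*}
```

Notably, Q has no induction scheme at all, which is why the question of how much mathematics can be interpreted in it is delicate.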
THE INEVITABILITY OF LOGICAL STRENGTH, 2007
"... Abstract. An extreme kind of logic skeptic claims that "the present formal systems used for the foundations of mathematics are artificially strong, thereby causing unnecessary headaches such as the Gödel incompleteness phenomena". The skeptic continues by claiming that "logician's sys ..."
Abstract
Abstract. An extreme kind of logic skeptic claims that "the present formal systems used for the foundations of mathematics are artificially strong, thereby causing unnecessary headaches such as the Gödel incompleteness phenomena". The skeptic continues by claiming that "logicians' systems always contain overly general assertions, and/or assertions about overly general notions, that are not used in any significant way in normal mathematics. For example, induction for all statements, or even all statements of certain restricted forms, is far too general; mathematicians only use induction for natural statements that actually arise. If logicians would tailor their formal systems to conform to the naturalness of normal mathematics, then various logical difficulties would disappear, and the story of the foundations of mathematics