Results 1–10 of 25
A New Recursion-Theoretic Characterization of the Polytime Functions
Computational Complexity, 1992
Cited by 232 (7 self)
Abstract: We give a recursion-theoretic characterization of FP which describes polynomial-time computation independently of any externally imposed resource bounds. In particular, this syntactic characterization avoids the explicit size bounds on recursion (and the initial function 2^{|x|·|y|}) of Cobham.
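The key device in this characterization is Bellantoni and Cook's split of arguments into "normal" and "safe" positions: recursion on notation may feed its recursive value only into a safe slot. A purely illustrative sketch (the combinator, tuple encoding, and the name `shift` are our own, not the authors' formalism):

```python
def safe_rec(g, h0, h1):
    """Safe recursion on notation, in the spirit of Bellantoni-Cook:
         f(0, normals; safes)    = g(normals; safes)
         f(2z+i, normals; safes) = h_i(z, normals; safes, f(z, normals; safes))
       The recursive value is passed only in a *safe* position (appended to
       the safes tuple), never as a normal argument that could drive
       further recursion."""
    def f(x, normals=(), safes=()):
        if x == 0:
            return g(normals, safes)
        z, i = x >> 1, x & 1          # peel off the low bit of the notation
        h = h1 if i else h0
        return h(z, normals, safes + (f(z, normals, safes),))
    return f

# Illustration: shift(x; y) = y * 2**|x|, where |x| is the bit length of x,
# defined by f(0; y) = y and f(2z+i; y) = 2 * f(z; y).
shift = safe_rec(lambda normals, safes: safes[0],
                 lambda z, normals, safes: 2 * safes[-1],
                 lambda z, normals, safes: 2 * safes[-1])
```

Because the recursive value never occupies a position that can control another recursion, every function definable this way stays polynomial-time, with no explicit size bound ever imposed.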
Predicative Recursion and Computational Complexity
1992
Cited by 45 (3 self)
Abstract: The purpose of this thesis is to give a "foundational" characterization of some common complexity classes. Such a characterization is distinguished by the fact that no explicit resource bounds are used. For example, we characterize the polynomial-time computable functions without making any direct reference to polynomials, time, or even computation. Complexity classes characterized in this way include polynomial time, the functional polytime hierarchy, the logspace-decidable problems, and NC. After developing these "resource-free" definitions, we apply them to redeveloping the feasible logical system of Cook and Urquhart, and show how this first-order system relates to the second-order system of Leivant. The connection is an interesting one, since the systems were defined independently and have what appear to be very different rules for the principle of induction. Furthermore, it is interesting to see, albeit in a very specific context, how to retract a second-order statement ("inducti...
A uniform approach to fundamental sequences and hierarchies
Math. Logic Quart., 1994
Cited by 28 (7 self)
Abstract: In this article we give a unifying approach to the theory of fundamental sequences and their related Hardy hierarchies of number-theoretic functions, and we show the equivalence of the new approach with the classical one.
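For ordinals below ω^ω the Hardy hierarchy and its standard fundamental sequences are simple enough to compute directly. The following sketch uses our own encoding (a non-increasing list of Cantor-normal-form exponents, not the article's formalism) to illustrate the three defining clauses H_0(n) = n, H_{α+1}(n) = H_α(n+1), and H_λ(n) = H_{λ[n]}(n):

```python
def hardy(alpha, n):
    """Hardy hierarchy H_alpha(n) for alpha < omega**omega.

    alpha encodes an ordinal as a non-increasing list of CNF exponents,
    e.g. [1, 0, 0] stands for omega + 2.  Clauses used:
        H_0(n)     = n
        H_{a+1}(n) = H_a(n + 1)
        H_lam(n)   = H_{lam[n]}(n),
    with the standard fundamental sequence
        (beta + omega**(e+1))[n] = beta + omega**e * n.
    """
    alpha = list(alpha)
    while alpha:
        e = alpha.pop()          # last (smallest) CNF term omega**e
        if e == 0:               # successor step: bump the argument
            n += 1
        else:                    # limit step: omega**e -> n copies of omega**(e-1)
            alpha.extend([e - 1] * n)
    return n
```

For instance, H_ω(n) = H_n(n) = 2n, so `hardy([1], 3)` evaluates to 6.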
Long Finite Sequences
1998
Cited by 12 (4 self)
Abstract: Let k be a positive integer. There is a longest finite sequence x_1, ..., x_n in k letters in which no consecutive block x_i, ..., x_{2i} is a subsequence of any other consecutive block x_j, ..., x_{2j}. Let n(k) be this longest length. We prove that n(1) = 3, n(2) = 11, and n(3) is incomprehensibly large. We give a lower bound for n(3) in terms of the familiar Ackermann hierarchy. We also give asymptotic upper and lower bounds for n(k). We view n(3) as a particularly elemental description of an incomprehensibly large integer. Related problems involving binary sequences (two letters) are also addressed. We also report on some recent computer explorations of R. Dougherty which we use to raise the lower bound for n(3).
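The defining condition is concrete enough that the small values can be checked by brute force. A sketch (helper names are our own; only practical for k ≤ 2, since n(3) dwarfs any computation):

```python
from itertools import product

def is_subseq(a, b):
    """True if a is a (not necessarily contiguous) subsequence of b."""
    it = iter(b)
    return all(c in it for c in a)   # membership test consumes the iterator

def has_property(x):
    """Friedman's condition: no block x_i..x_{2i} is a subsequence of a
    later block x_j..x_{2j} (1-based indices, i < j, 2j <= len(x))."""
    m = len(x)
    for j in range(2, m // 2 + 1):
        for i in range(1, j):
            if is_subseq(x[i - 1:2 * i], x[j - 1:2 * j]):
                return False
    return True

def n_of(k, cap=15):
    """Brute-force n(k) for tiny k: longest length admitting a valid
    sequence over k letters, searching lengths up to cap."""
    best = 0
    for length in range(1, cap + 1):
        if any(has_property(s) for s in product(range(k), repeat=length)):
            best = length
        else:
            break
    return best
```

Running `n_of(1)` and `n_of(2)` reproduces the values 3 and 11 quoted above; for k = 3 no such search can succeed, which is the point of the paper.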
Some Decision Problems of Enormous Complexity
In Proc. LICS ’99, 1999
Cited by 6 (0 self)
Abstract: We present some new decision and comparison problems of unusually high computational complexity. Most of the problems are strictly combinatorial in nature; others involve basic logical notions. Their complexities range from iterated exponential time completeness to ε_0 time completeness to θ(Ω^ω,0) time completeness to θ(Ω^Ω,0) time completeness. These three ordinals are well-known ordinals from proof theory, and their associated complexity classes represent new levels of computational complexity for natural decision problems. Proofs will appear in an extended version of this manuscript to be published elsewhere.

1. Iterated exponential time - universal relational sentences. Let F be a function from A* into B*, where A, B are finite alphabets. We say that F is iterated exponential time computable if and only if there is a multitape Turing machine TM (which processes inputs from A* and outputs from B*) and an integer constant c > 0 such that TM computes F(x) with run time at most...
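The (truncated) definition above appeals to an iterated-exponential run-time bound. For orientation, the tower of exponentials usually written 2_c(n), which furnishes such bounds, can be sketched as follows (a generic illustration of the iterated exponential, not the paper's exact bound, which is cut off above):

```python
def iter_exp(k, n):
    """k-fold iterated exponential (a tower of k twos topped by n):
       iter_exp(0, n) = n,  iter_exp(k + 1, n) = 2 ** iter_exp(k, n)."""
    for _ in range(k):
        n = 2 ** n
    return n
```

Already iter_exp(4, 1) = 65536 and iter_exp(5, 1) has nearly twenty thousand digits, which conveys how fast "iterated exponential time" outgrows ordinary exponential time.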
An Intensional Semantics for Elementary Program Transformations
1993
Cited by 3 (3 self)
Abstract: This paper is a contribution to the formal study and analysis of vernacular forms of program derivation. Specifically, in this paper, our vernacular derivations are elementary program transformations over the natural numbers. We provide an intensional semantics for these transformations within the derivations of the Elementary theory of Operations and Numbers, EON [Bee85]. This semantics is intensional in the sense that the computational content of a derivation associated with a transformation is equal, up to the intensional equality underlying the theory EON, to the computational content of the transformation itself. The interpretation enables us to underwrite the correctness of the program transformations and, further, provides an analysis of correctness by classifying, via schema, the operations available by these transformations.

Introduction. What is the relationship between vernacular and formalised arguments? We are interested in developing methods and results which can be...
Things that can and things that can't be done in PRA
1998
Cited by 3 (1 self)
Abstract: It is well-known by now that large parts of (non-constructive) mathematical reasoning can be carried out in systems T which are conservative over primitive recursive arithmetic PRA (and even much weaker systems). On the other hand, there are principles S of elementary analysis (like the Bolzano-Weierstraß principle, the existence of a limit superior for bounded sequences, etc.) which are known to be equivalent to arithmetical comprehension (relative to T) and therefore go far beyond the strength of PRA (when added to T). In this paper...
What Can We Gain by Integrating a Language Processor with a Theorem Prover?
2003
Cited by 2 (1 self)
Abstract: The purpose of this paper is to investigate the impact on the design of a programming language of tight integration of a language processor with a theorem prover (intelligent proof assistant). What improvements in syntax, semantics, and computation do we gain by this? We assume as self-evident the obvious gain, which is quite substantial. The language obtains a sound semantics by its interpretation into a formal logical theory. By proving theorems about our programs we can prove them correct. From the many language systems with theorem provers we mention just two: PVS and Isabelle/HOL [6, 5]. Both of them are more general theorem provers than programming languages. On the other hand, our system CL (Clausal Language) has been designed as a programming language from the start. It is not a language which can be used for industrial applications (yet, we hope), but neither is it a toy language. It has been used in our undergraduate teaching for seven years now; about four hundred students yearly actively use it in the four courses based on CL [2]. While the formal basis for PVS and Isabelle is higher-order logic with typed functionals, CL is based on the simplest non-trivial formal theory: Peano Arithmetic (PA).