Results 1–10 of 31
Light Linear Logic
Cited by 669 (3 self)
Abstract: The abuse of structural rules may have damaging complexity effects.
Light Affine Logic
ACM Transactions on Computational Logic, 1998
Cited by 67 (4 self)
Abstract: Much effort has been recently devoted to the study of polytime formal (and especially logical) systems [GSS92, LM93, Le94, Gi96]. The purpose of such systems is manifold. On the theoretical side, they provide a better understanding of what is the logical essence of polytime reduction (and other complexity classes). On the practical side, via the well-known Curry-Howard correspondence, they yield sophisticated typing systems, where types provide (statically) an accurate upper bound on the complexity of the computation. Even more, the type annotations give essential information on the "efficient way" to reduce the term. The most promising of these logical systems is Girard's Light Linear Logic [Gi96] (see the same paper for a comparison with other approaches). In this paper, we introduce a slight variation of LLL, by adding full weakening (for this reason, we call it Light Affine Logic). This modification does not alter the good complexity properties of LLL: cut-elimination is still pol...
Higher Order Logic
In Handbook of Logic in Artificial Intelligence and Logic Programming, 1994
Cited by 19 (0 self)
Abstract: Contents: 1 Introduction; 2 The expressive power of second order logic; 2.1 The language of second order logic; 2.2 Expressing size; 2.3 Defining data types; 2.4 Describing processes; 2.5 Expressing convergence using second order validity; 2.6 Truth definitions: the analytical hierarchy; 2.7 Inductive definitions; 3 Canonical semantics of higher order logic; 3.1 Tarskian semantics of second order logic; 3.2 Function and re...
Theories With Self-Application and Computational Complexity
Information and Computation, 2002
Cited by 12 (9 self)
Abstract: Applicative theories form the basis of Feferman's systems of explicit mathematics, which were introduced in the early seventies. In an applicative universe, all individuals may be thought of as operations, which can freely be applied to each other: self-application is meaningful, but not necessarily total. It has turned out that theories with self-application provide a natural setting for studying notions of abstract computability, especially from a proof-theoretic perspective.
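The applicative universe described in this abstract can be illustrated with a minimal combinatory-logic sketch (hypothetical, for illustration only; the theories in the paper are axiomatic, not programs). The combinators S and K are standard; the point demonstrated is that self-application is meaningful but not necessarily total:

```python
# Minimal sketch of an applicative universe via the S and K combinators.
# Every value is an operation that can be applied to any other, including itself.

def K(x):
    # K x y = x
    return lambda y: x

def S(x):
    # S x y z = (x z) (y z)
    return lambda y: (lambda z: x(z)(y(z)))

# The identity combinator defined purely from S and K: I = S K K
I = S(K)(K)

# Self-application is meaningful: I applied to itself returns I.
print(I(I) is I)  # True

# ...but application is not necessarily total:
# omega = S I I satisfies omega(omega) = omega(omega), so evaluating it
# never terminates (in Python, it overflows the recursion stack).
omega = S(I)(I)
# omega(omega)  # uncommenting this line loops forever
```

Here totality fails exactly where the abstract says it may: `omega(omega)` denotes no value, yet the term itself is a perfectly well-formed object of the applicative universe.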
Choice and uniformity in weak applicative theories
Logic Colloquium ’01, 2005
Cited by 11 (0 self)
Abstract: We are concerned with first order theories of operations, based on combinatory logic and extended with the type W of binary words. The theories include forms of “positive” and “bounded” induction on W and naturally characterize primitive recursive and polytime functions (respectively). We prove that the recursive content of the theories under investigation (i.e. the associated class of provably total functions on W) is invariant under addition of 1. an axiom of choice for operations and a uniformity principle, restricted to positive conditions; 2. a (form of) self-referential truth, providing a fixed point theorem for predicates. As to the proof methods, we apply a kind of internal forcing semantics, nonstandard variants of realizability and cut-elimination. §1. Introduction. In this paper, we deal with theories of abstract computable operations, underlying the so-called explicit mathematics, introduced by Feferman in the mid-seventies as a logical frame to formalize Bishop's style constructive mathematics ([18], [19]). Following a common usage, these theories ...
A Proof-Theoretic Characterization of the Basic Feasible Functionals
Theoretical Computer Science, 2002
Cited by 7 (6 self)
Abstract: We provide a natural characterization of the type two Mehlhorn-Cook-Urquhart basic feasible functionals as the provably total type two functionals of our (classical) applicative theory PT introduced in [27], thus providing a proof of a result claimed in the conclusion of [27].
Proof Theoretic Complexity
In Proof and System Reliability, H. Schwichtenberg and R. Steinbrüggen, eds., NATO Science Series, 2002
Cited by 7 (2 self)
Abstract: A weak formal theory of arithmetic is developed, entirely analogous to classical arithmetic but with two separate kinds of variables: induction variables and quantifier variables. The point is that the provably recursive functions are now more feasibly computable than in the classical case, lying between Grzegorczyk's E² and E³, and their computational complexity can be characterized in terms of the logical complexity of their termination proofs. Previous results of Leivant are reworked and extended in this new setting, with quite different proof-theoretic methods.
Linear Logic by Levels and Bounded Time Complexity
2009
Cited by 4 (1 self)
Abstract: This work deals with the characterization of elementary and deterministic polynomial time computation in linear logic through the proofs-as-programs correspondence. Girard’s seminal results, concerning elementary and light linear logic, use a principle called stratification to ensure the complexity bound on the cut-elimination procedure. Here, we propose a more flexible control principle, that of indexing, which allows us to extend Girard’s systems while keeping the same complexity properties. A consequence of the higher flexibility of indexing with respect to stratification is the absence of boxes for handling the § modality. We finally propose a variant of our polytime system in which the § modality is only allowed on atoms, and which may thus serve as a basis for developing λ-calculus type assignment systems with more efficient typing algorithms than existing ones.
Sharply bounded alternation and quasilinear time
Theory of Computing Systems, 1998
Cited by 4 (0 self)
Abstract: We define the sharply bounded hierarchy, SBH(QL), a hierarchy of classes within P, using quasilinear-time computation and quantification over strings of length log n. It generalizes the limited nondeterminism hierarchy introduced by Buss and Goldsmith, while retaining the invariance properties. The new hierarchy has several alternative characterizations. We define both SBH(QL) and its corresponding hierarchy of function classes, ql, and present a variety of problems in these classes, including m-complete problems for each class in SBH(QL). We discuss the structure of the hierarchy, and show that determining its precise relationship to deterministic time classes can imply P ≠ PSPACE. We present characterizations of SBH(QL) relations based on alternating Turing machines and on first-order definability, as well as recursion-theoretic characterizations of function classes corresponding to SBH(QL).
Why philosophers should care about computational complexity
In Computability: Gödel, Turing, Church, and beyond (eds. ...), 2012
Cited by 4 (0 self)
Abstract: One might think that, once we know something is computable, how efficiently it can be computed is a practical question with little further philosophical importance. In this essay, I offer a detailed case that one would be wrong. In particular, I argue that computational complexity theory—the field that studies the resources (such as time, space, and randomness) needed to solve computational problems—leads to new perspectives on the nature of mathematical knowledge, the strong AI debate, computationalism, the problem of logical omniscience, Hume’s problem of induction, Goodman’s grue riddle, the foundations of quantum mechanics, economic rationality, closed timelike curves, and several other topics of philosophical interest. I end by discussing ...