Results 1–10 of 18
Light Affine Set Theory: A Naive Set Theory of Polynomial Time
, 2004
Abstract

Cited by 8 (1 self)
In [7], a naive set theory is introduced based on a polynomial time logical system, Light Linear Logic (LLL). Although it is reasonably claimed that the set theory inherits the intrinsically polytime character from the underlying logic LLL, the discussion there is largely informal, and a sufficiently formal justification of the claim is not provided. Moreover, the syntax is quite complicated in that it is based on a non-traditional hybrid sequent calculus, which is required for formulating LLL. In this paper, we consider a naive set theory based on Intuitionistic Light Affine Logic (ILAL), a simplification of LLL introduced by [1], and call it Light Affine Set Theory (LAST). The simplicity of LAST allows us to rigorously verify its polytime character. In particular, we prove that a function over {0, 1}∗ is computable in polynomial time if and only if it is provably total in LAST.
A feasible algorithm for typing in elementary affine logic
 In Proceedings of the 8th International Conference on Typed Lambda Calculi and Applications
, 2005
Abstract

Cited by 8 (2 self)
We give a new type inference algorithm for typing lambda-terms in Elementary Affine Logic (EAL), which is motivated by applications to complexity and optimal reduction. Following previous references on this topic, the variant of the EAL type system we consider (denoted EAL⋆) is a variant without sharing and without polymorphism. Our algorithm improves over the ones already known in that it offers a better complexity bound: if a simple type derivation for the term t is given, our algorithm performs EAL⋆ type inference in polynomial time.
Optimizing optimal reduction. A type inference algorithm for elementary affine logic
 ACM Transactions on Computational Logic
Abstract

Cited by 6 (0 self)
We propose a type inference algorithm for lambda terms in Elementary Affine Logic (EAL). The algorithm decorates the syntax tree of a simply typed lambda term and collects a set of linear constraints. The result is a parametric elementary type that can be instantiated with any solution of the set of collected constraints. We point out that the typeability of lambda terms in EAL has a practical counterpart, since any EAL-typeable lambda term can be reduced with Lamping's abstract algorithm, obtaining a substantial performance improvement. We show how to apply the same techniques to obtain decorations of intuitionistic proofs into Linear Logic proofs.
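The constraint-collection phase described in this abstract can be pictured with a toy sketch. This is not the paper's actual algorithm: the term grammar, the single "same box depth" levelling rule, and the `collect` function below are our own illustrative assumptions; a real EAL inference also produces inequalities and turns solutions into !-decorated types.

```python
# Toy sketch of constraint collection over a lambda term's syntax tree
# (illustrative only; not the paper's actual constraint system).
from itertools import count

_fresh = count()

def collect(term):
    """Walk a lambda term, give every subterm a box-depth variable,
    and return (root_variable, equality_constraints)."""
    v = next(_fresh)
    kind = term[0]
    if kind == "var":                      # ("var", x)
        return v, []
    if kind == "lam":                      # ("lam", x, body)
        b, cs = collect(term[2])
        return v, cs + [(v, b)]            # body shares the lambda's depth
    if kind == "app":                      # ("app", fun, arg)
        f, cf = collect(term[1])
        a, ca = collect(term[2])
        # toy levelling rule: function and argument at the same depth
        return v, cf + ca + [(f, a), (v, f)]
    raise ValueError(f"unknown term: {kind}")

# (\x.x) (\y.y)
term = ("app", ("lam", "x", ("var", "x")), ("lam", "y", ("var", "y")))
root, constraints = collect(term)
print(len(constraints))  # 4 equality constraints for this term
```

Any assignment of equal integers to variables linked by a constraint is a solution here; in the actual algorithm, the solution space of the collected linear constraints instead describes all valid elementary decorations of the given simple type derivation.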
Context semantics, linear logic and computational complexity
 In Proc. 21st IEEE Symposium on Logic in Computer Science
, 2006
Abstract

Cited by 4 (3 self)
We show that context semantics can be fruitfully used to estimate the computational cost of proof normalization in linear logic. In particular, context semantics lets us define the weight of a proof-net in such a way that the time needed to normalize a given proof is deeply related to its weight: the time needed to normalize a proof-net is bounded by a polynomial in its weight, while there are strategies for which the weight is a lower bound on normalization time.
An Elementary Fragment of Second-Order Lambda Calculus
 ACM Transactions on Computational Logic
, 2005
Abstract

Cited by 1 (0 self)
A fragment of second-order lambda calculus (System F) is defined that characterizes the elementary recursive functions. Type quantification is restricted to be non-interleaved and stratified, i.e., the types are assigned levels, and a quantified variable can only be instantiated by a type of smaller level, with a slightly liberalized treatment of level zero.
Obsessional experiments for Linear Logic Proof-nets
 Mathematical Structures in Computer Science 13
, 2001
Abstract

Cited by 1 (0 self)
We address the question of injectivity of coherent semantics of linear logic proof-nets. Starting from Girard's definition of experiment, we introduce the key notion of "injective obsessional experiment", which allows us to give a positive answer to our question for certain fragments of linear logic, and to build counterexamples to the injectivity of coherent semantics in the general case.
Mathematical Logic—Proof theory
Abstract
 Add to MetaCart
This paper fits in the area of implicit polytime computational systems [Girard et al. 1998; Leivant and Marion 1993; Leivant 1994; Girard 1998]. The purpose of such systems is manifold. On the theoretical side, they provide a better understanding of the logical essence of calculating under time restrictions. Those admitting a Curry-Howard correspondence
Università di Bologna
Abstract
 Add to MetaCart
This article is a structured introduction to Intuitionistic Light Affine Logic (ILAL). ILAL has a polynomially bounded normalization cost, and it is expressive enough to encode, and simulate, all PolyTime Turing machines. The bound on the normalization cost is proved by introducing proof-nets for ILAL. The bound follows from a suitable normalization strategy that exploits structural properties of the proof-nets. This allows us to gain a good understanding of the meaning of the § modality, which is a peculiarity of light logics. The expressive power of ILAL is demonstrated in full detail. Such a proof gives a hint of the non-trivial task of programming with resource limitations, using ILAL derivations as programs.
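As a point of reference (our summary of the standard presentation of light affine logic, not part of the article's abstract), the light discipline on the exponentials can be stated as follows: contraction on !-formulas is kept, and ! can be traded for §, but dereliction and digging fail, which is what blocks exponential blow-up during normalization:

```latex
% Principles valid in (I)LAL:
!A \multimap\; !A \otimes\; !A \quad\text{(contraction)}
\qquad
!A \multimap \S A              \quad\text{(! is weaker than \S)}

% Principles NOT valid (their absence enforces the polytime bound):
!A \not\multimap A             \quad\text{(no dereliction)}
\qquad
!A \not\multimap\; !!A         \quad\text{(no digging)}
```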
Linear Logic by Levels and Bounded Time Complexity
, 2009
Abstract
 Add to MetaCart
This work deals with the characterization of elementary and deterministic polynomial time computation in linear logic through the proofs-as-programs correspondence. Girard's seminal results, concerning elementary and light linear logic, use a principle called stratification to ensure the complexity bound on the cut-elimination procedure. Here, we propose a more flexible control principle, that of indexing, which allows us to extend Girard's systems while keeping the same complexity properties. A consequence of the higher flexibility of indexing with respect to stratification is the absence of boxes for handling the § modality. We finally propose a variant of our polytime system in which the § modality is only allowed on atoms, and which may thus serve as a basis for developing λ-calculus type assignment systems with more efficient typing algorithms than existing ones.
On Elementary Linear Logic and polynomial time
Abstract
 Add to MetaCart
Linear logic (LL) [Gir87] has been used in implicit computational complexity to characterize various complexity classes within the proofs-as-programs approach. This can then be the basis of type systems to guarantee complexity properties on lambda-calculus. As duplication is controlled in LL by the connective !, the key idea of this line of work is to consider
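To make the (truncated) key idea concrete, here is our own illustration, following the standard presentation of elementary logics rather than this particular abstract: iteration is the main source of duplication, and the type of Church numerals localizes the ! accordingly:

```latex
% Church numerals in Elementary Affine Logic:
\mathbf{N} \;=\; \forall \alpha.\; !(\alpha \multimap \alpha) \multimap\; !(\alpha \multimap \alpha)
% A numeral duplicates its step function, so the step function must be
% supplied under !; duplication is thus confined inside exponential boxes,
% and box depth does not grow, which yields the elementary bound.
```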