Results 1–10 of 29
The strength of non-size increasing computation
, 2002
Abstract

Cited by 17 (0 self)
We study the expressive power of non-size increasing recursive definitions over lists. This notion of computation is such that the size of all intermediate results is automatically bounded by the size of the input, so that the interpretation in a finite model is sound with respect to the standard semantics. Many well-known algorithms with this property, such as the usual sorting algorithms, are definable in the system in the natural way. The main result is that a characteristic function is definable if and only if it is computable in time O(2^p(n)) for some polynomial p. The method used to establish the lower bound on the expressive power also shows that the complexity becomes polynomial time if we allow primitive recursion only. This settles an open question posed in [1, 7].
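The bounded-intermediate-results property is easy to see on a concrete sort. The following Python sketch (an informal illustration only, not the paper's formal system of non-size increasing definitions) shows insertion sort, where every intermediate list stays within the size of the input:

```python
def insert(x, ys):
    """Insert x into the sorted list ys; the result has exactly
    len(ys) + 1 elements, so no intermediate value grows beyond it."""
    for i, y in enumerate(ys):
        if x <= y:
            return ys[:i] + [x] + ys[i:]
    return ys + [x]

def isort(xs):
    """Insertion sort: every intermediate list is bounded by len(xs)."""
    out = []
    for x in xs:
        out = insert(x, out)
        assert len(out) <= len(xs)  # the non-size-increasing invariant
    return out
```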
A logical account of PSPACE
 In 35th ACM SIGPLAN-SIGACT Symposium on Principles of Programming Languages (POPL 2008)
Abstract

Cited by 13 (3 self)
We propose a characterization of PSPACE by means of a type assignment for an extension of lambda calculus with a conditional construction. The type assignment STAB is an extension of STA, a type assignment for lambda-calculus inspired by Lafont’s Soft Linear Logic. We extend STA by means of a ground type and terms for booleans. The key point is that the elimination rule for booleans is managed in an additive way. Thus, we are able to program polynomial-time Alternating Turing Machines. Conversely, we introduce a call-by-name evaluation machine in order to compute programs in polynomial space. As far as we know, this is the first characterization of PSPACE which is based on lambda calculus and light logics. Categories and Subject Descriptors: F.3.3 [Logics and meanings of programs]: Studies of program constructs—type structure
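The paper's machinery is type-theoretic, but the flavor of a PSPACE computation can be seen on the classic example of evaluating a quantified boolean formula (QBF, a PSPACE-complete problem) in space polynomial in the formula size. This sketch is an independent illustration, not the STAB system or its evaluation machine:

```python
def eval_qbf(prefix, matrix, env=()):
    """Evaluate a closed quantified boolean formula.  `prefix` is a list of
    (quantifier, variable) pairs, quantifier 'A' (forall) or 'E' (exists);
    `matrix` maps an assignment dict to a bool.  Recursion depth equals the
    number of quantifiers, so the space used is polynomial in the input
    even though the running time may be exponential."""
    env = dict(env)
    if not prefix:
        return matrix(env)
    q, v = prefix[0]
    results = (eval_qbf(prefix[1:], matrix, {**env, v: b})
               for b in (False, True))
    return all(results) if q == 'A' else any(results)
```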
A Simple Supercompiler Formally Verified in Coq
 Second International Workshop on Metacomputation in Russia (META 2010)
, 2010
Abstract

Cited by 6 (0 self)
We study an approach for verifying the correctness of a simplified supercompiler in Coq. While existing supercompilers are not very big in size, they combine many different program transformations in intricate ways, so checking the correctness of their implementation poses challenges. The presented method relies on two important technical features to achieve a compact and modular formalization: first, a very limited object language; second, decomposing the supercompilation process into many subtransformations, whose correctness can be checked independently. In particular, we give separate correctness proofs for two key parts of driving – normalization and positive information propagation – in the context of a non-Turing-complete expression sublanguage. Though our supercompiler is currently limited, its formal correctness proof can give guidance for verifying more realistic implementations.
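As an informal illustration of one of the two driving components named above, the following Python sketch (a toy reconstruction, not the verified Coq development) performs positive information propagation on a tiny term representation: when a case expression scrutinizes a variable, each branch learns that the variable equals its pattern, so occurrences of the variable in the branch body are replaced by the pattern:

```python
# Terms: ('var', name) | ('ctor', name, [args]) | ('case', scrut, [(pat, body)])
# A pattern is a shallow constructor term such as ('ctor', 'S', [('var', 'y')]).

def subst(term, name, repl):
    """Capture-naive substitution of repl for variable `name` in term."""
    tag = term[0]
    if tag == 'var':
        return repl if term[1] == name else term
    if tag == 'ctor':
        return ('ctor', term[1], [subst(a, name, repl) for a in term[2]])
    scrut, branches = term[1], term[2]
    return ('case', subst(scrut, name, repl),
            [(pat, subst(body, name, repl)) for pat, body in branches])

def propagate(term):
    """Positive information propagation: if the scrutinee is a variable x,
    replace x by the branch's pattern inside that branch's body."""
    if term[0] == 'case' and term[1][0] == 'var':
        x = term[1][1]
        return ('case', term[1],
                [(pat, subst(body, x, pat)) for pat, body in term[2]])
    return term
```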
A Flow Calculus of mwp-Bounds for Complexity Analysis
Abstract

Cited by 6 (3 self)
We present a method for certifying that the values computed by an imperative program will be bounded by polynomials in the program’s inputs. To this end, we introduce mwp-matrices and define a semantic relation ⊨ C : M, where C is a program and M is an mwp-matrix. It follows straightforwardly from our definitions that there exists M such that ⊨ C : M holds iff every value computed by C is bounded by a polynomial in the inputs. Furthermore, we provide a syntactic proof calculus and define the relation ⊢ C : M to hold iff there exists a derivation in the calculus where C : M is the bottom line. We prove that ⊢ C : M implies ⊨ C : M. By means of exhaustive proof search, an algorithm can decide if there exists M such that the relation ⊢ C : M holds; thus, our results yield a computational method. Categories and Subject Descriptors: D.2.4 [Software engineering]: Software/Program Verification; F.2.0 [Analysis of algorithms and problem complexity]: General; F.3.1 [Logics and meanings of programs]: Specifying and Verifying and Reasoning about Programs
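The property an mwp-derivation certifies statically can be probed dynamically. The Python sketch below is only an empirical runtime check of polynomial-boundedness on sample inputs, not the paper's proof calculus; the programs `square_sum` and `blow_up` are hypothetical examples of a bounded and an unbounded computation:

```python
def poly_bounded(program, degree, coeff, inputs_list):
    """Runtime sanity check of the property an mwp-derivation certifies
    statically: every value returned by `program` stays below
    coeff * (max input) ** degree on the sampled inputs."""
    for inputs in inputs_list:
        bound = coeff * max(inputs) ** degree
        if any(v > bound for v in program(*inputs)):
            return False
    return True

def square_sum(x, y):
    """Hypothetical example: the final z equals x * y <= max(x, y) ** 2."""
    z = 0
    for _ in range(x):
        z = z + y
    return (x, y, z)

def blow_up(x, y):
    """Hypothetical counterexample: repeated squaring grows exponentially,
    so no polynomial in the inputs bounds z."""
    z = 2
    for _ in range(x):
        z = z * z
    return (x, y, z)
```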
A Functional Language for Logarithmic Space
 In APLAS
, 2004
Abstract

Cited by 6 (0 self)
More than being just a tool for expressing algorithms, a well-designed programming language allows the user to express her ideas efficiently. The design choices, however, affect the efficiency of the algorithms written in the language. It is therefore important to understand how such choices affect the expressibility of programming languages. This paper pursues very-low-complexity programs by presenting a first-order function algebra BC# that captures exactly LF, the functions computable in logarithmic space. This gives insights into the expressiveness of recursion. Moreover, it can be useful for the automatic analysis of programs' resource usage and the separation of complexity classes. The important technical features of BC# are (1) a separation of variables into safe and normal variables, where recursion can only be done over the latter; (2) linearity of the recursive call; and (3) recursion with a variable step length (course-of-value recursion). Unlike formulations of LF via Turing machines, BC# makes no reference to outside resource measures, e.g., the size of the memory used. This appears to be the first such characterization of the LF-computable functions (not just predicates). The proof that all BC# programs can be evaluated in LF is of separate interest to programmers: it trades space for time and evaluates recursion with at most one recursive call without a call stack.
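BC# refines the safe/normal discipline of Bellantoni-Cook-style function algebras. The Python sketch below illustrates that discipline informally with plain safe recursion on notation; it does not model BC#'s variable-step (course-of-value) refinement or its linearity constraint:

```python
# Convention: f(normal; safe) -- recursion may only decompose the normal
# argument, and the recursive result may only be used in safe positions.

def rec(g, h):
    """Safe recursion on notation over a bit string n (the normal argument).
    g gives the base case on the safe arguments; h receives the low bit of n,
    the remaining bits, and the recursive result (in safe position only)."""
    def f(n, *safe):
        if n == "":
            return g(*safe)
        return h(n[-1], n[:-1], f(n[:-1], *safe), *safe)
    return f

# Example: count the 1-bits of a binary string, in unary.
count_ones = rec(lambda: "",
                 lambda bit, rest, r: r + "1" if bit == "1" else r)
```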
Elementary linear logic revisited for polynomial time and an exponential time hierarchy
, 2011
Constructing Programs From Metasystem Transition Proofs
 First International Workshop on Metacomputation in Russia (META 2008)
, 2008
Abstract

Cited by 3 (0 self)
It has previously been shown by Turchin, in the context of supercompilation, how metasystem transitions can be used in the proof of universally and existentially quantified conjectures. Positive supercompilation is a variant of Turchin’s supercompilation which was introduced in an attempt to study and explain the essentials of Turchin’s supercompiler. In our own previous work, we have proposed a program transformation algorithm called distillation, which is more powerful than positive supercompilation, and have shown how this can be used to prove a wider range of universally and existentially quantified conjectures in our theorem prover Poitín. In this paper we show how a wide range of programs can be constructed fully automatically from first-order specifications through the use of metasystem transitions, and we prove that the constructed programs are totally correct with respect to their specifications. To our knowledge, this is the first technique which has been developed for the automatic construction of programs from their specifications using metasystem transitions.
Implicit complexity in recursive analysis
 Tenth International Workshop on Logic and Computational Complexity (LCC'09)
, 2009
Abstract

Cited by 1 (0 self)
Recursive analysis is a model of analog computation based on type-2 Turing machines. Various classes of functions computable in recursive analysis have recently been characterized in a machine-independent, algebraic context. In particular, nice connections have been obtained between the class of computable functions over the reals (and some of its sub- and superclasses) and algebraically defined (sub- and super-) classes of R-recursive functions à la Moore. In this paper we provide a framework that allows one to dive into complexity for functions over the reals: it relates classical computability and complexity classes to the corresponding classes in recursive analysis. This framework opens the field of implicit complexity of functions over the reals. While our setting provides a new reading of some of the existing characterizations, it also provides new results: inspired by Bellantoni and Cook’s characterization of the polynomial-time computable functions, we provide the first algebraic characterization of the polynomial-time computable functions over the reals.
Algebraic Characterizations of Complexity-Theoretic Classes of Real Functions, Laboratoire d’informatique de l’École polytechnique
 Alexandria University
, 2009
Abstract

Cited by 1 (0 self)
Recursive analysis is the most classical approach to model and discuss computations over the real numbers. Recently, it has been shown that computability classes of functions in the sense of recursive analysis can be defined (or characterized) in an algebraic, machine-independent way, without resorting to Turing machines. In particular, nice connections have been obtained between the class of computable functions over the reals (and some of its sub- and superclasses) and algebraically defined (sub- and super-) classes of R-recursive functions à la Moore '96. However, until now this has been done only at the computability level, not at the complexity level. In this paper we provide a framework that allows us to dive into the complexity level of real functions. In particular, we provide the first algebraic characterization of the polynomial-time computable functions over the reals. This framework opens the field of implicit complexity of analog functions, and also provides a new reading of some of the existing characterizations at the computability level.