Results 11-20 of 164
Higher Type Recursion, Ramification and Polynomial Time
 Annals of Pure and Applied Logic
, 1999
Abstract

Cited by 25 (3 self)
It is shown how to restrict recursion on notation in all finite types so as to characterize the polynomial-time computable functions. The restrictions are obtained by enriching the type structure with the formation of types !σ, and by adding linear concepts to the lambda calculus. Recursion in all finite types was introduced by Hilbert [9] and later became known as the essential part of Gödel's system T [8]. This system has long been viewed as a powerful scheme unsuitable for describing small complexity classes such as polynomial time. Simmons [16] showed that ramification can be used to characterize the primitive recursive functions by higher type recursion, and Leivant and Marion [14] showed that another form of ramification can be used to restrict higher type recursion to PSPACE. However, to characterize the much smaller class of polynomial-time computable functions by higher type recursion, it seems that an additional principle is required. By introducing linear...
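To give a flavor of the scheme being restricted: recursion on notation recurses on the binary digits of its argument rather than on its value, so the recursion depth is the bit length of the input. A minimal untyped sketch (our own illustration; the paper's system is a ramified typed lambda calculus, not Python):

```python
# Recursion on notation: f(x) calls f(x // 2), i.e. f on x with its last
# binary digit removed, so the recursion depth is the bit length of x.
# Illustration only, not the paper's typed system.

def bits(x: int) -> int:
    """Binary length of x, defined by recursion on notation."""
    if x == 0:
        return 0
    return 1 + bits(x // 2)

def binary_add(x: int, y: int) -> int:
    """Addition by simultaneous recursion on notation:
    x + y = 2 * ((x >> 1) + (y >> 1)) + last bit of x + last bit of y."""
    if x == 0:
        return y
    return 2 * binary_add(x // 2, y // 2) + (x % 2) + (y % 2)

print(bits(13))           # 13 = 0b1101, four digits
print(binary_add(13, 29))
```

Bounding the recursion depth by the bit length is the reason such schemes stay within feasible growth rates, which typed ramification then enforces structurally.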
Correspondence between Operational and Denotational Semantics
 Handbook of Logic in Computer Science
, 1995
Abstract

Cited by 23 (0 self)
This course introduces the operational and denotational semantics of PCF and examines the relationship between the two. Topics: syntax and operational semantics of PCF, Activity Lemma, undefinability of parallel or; Context Lemma (first-principles proof and proof by logical relations); denotational semantics of PCF induced by an interpretation; the (standard) Scott model, adequacy, weak adequacy and its proof (by a computability predicate); domain theory up to SFP and Scott domains; non-full-abstraction of the standard model, definability of compact elements and full abstraction for PCFP (PCF + parallel or); properties of order-extensional (continuous) models of PCF, Milner's model and Mulmuley's construction (excluding proofs). Additional topics (time permitting): results on the pure simply-typed lambda calculus, Friedman's Completeness Theorem, the minimal model, logical relations and definability, undecidability of lambda definability (excluding proof), dI-domains and stable functions.
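The operational side of PCF fits in a few lines of code. Here is a big-step, call-by-name evaluator for a small PCF fragment; the term encoding and names are our own illustration, not the course's formulation, and substitution assumes closed arguments (so no capture handling is needed):

```python
# Terms: ('num', n) | ('var', x) | ('lam', x, body) | ('app', f, a)
#      | ('suc', t) | ('pred', t) | ('ifz', t, a, b) | ('fix', t)

def subst(t, x, s):
    """Substitute the closed term s for variable x in t."""
    tag = t[0]
    if tag == 'num':
        return t
    if tag == 'var':
        return s if t[1] == x else t
    if tag == 'lam':
        return t if t[1] == x else ('lam', t[1], subst(t[2], x, s))
    if tag in ('suc', 'pred', 'fix'):
        return (tag, subst(t[1], x, s))
    if tag == 'ifz':
        return ('ifz', subst(t[1], x, s), subst(t[2], x, s), subst(t[3], x, s))
    return ('app', subst(t[1], x, s), subst(t[2], x, s))

def ev(t):
    """Big-step evaluation to a value ('num', n) or ('lam', x, body)."""
    tag = t[0]
    if tag in ('num', 'lam'):
        return t
    if tag == 'suc':
        return ('num', ev(t[1])[1] + 1)
    if tag == 'pred':
        return ('num', max(0, ev(t[1])[1] - 1))
    if tag == 'ifz':
        return ev(t[2]) if ev(t[1])[1] == 0 else ev(t[3])
    if tag == 'app':
        f = ev(t[1])                        # must evaluate to a lambda
        return ev(subst(f[2], f[1], t[2]))  # call-by-name: argument passed unevaluated
    # tag == 'fix': unfold once, fix f -> f (fix f)
    return ev(('app', t[1], t))

# addition via fix: add m n = ifz m then n else suc (add (pred m) n)
add = ('fix', ('lam', 'a', ('lam', 'm', ('lam', 'n',
      ('ifz', ('var', 'm'), ('var', 'n'),
       ('suc', ('app', ('app', ('var', 'a'), ('pred', ('var', 'm'))), ('var', 'n'))))))))

print(ev(('app', ('app', add, ('num', 2)), ('num', 3))))  # ('num', 5)
```

Lazy unfolding of `fix` is what lets the least-fixed-point reading in the Scott model line up with the operational behavior studied by adequacy.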
When Physical Systems Realize Functions...
 MINDS AND MACHINES
, 1999
Abstract

Cited by 18 (5 self)
After briefly discussing the relevance of the notions "computation" and "implementation" for cognitive science, I summarize some of the problems that have been found in their most common interpretations. In particular, I argue that standard notions of computation together with a "state-to-state correspondence view of implementation" cannot overcome difficulties posed by Putnam's Realization Theorem and that, therefore, a different approach to implementation is required. The notion "realization of a function", developed out of physical theories, is then introduced as a replacement for the notional pair "computation-implementation". After gradual refinement, taking practical constraints into account, this notion gives rise to the notion "digital system", which singles out physical systems that could actually be used, and possibly even built.
Short Proofs of Normalization for the simply-typed λ-calculus, permutative conversions and Gödel's T
 TO APPEAR: ARCHIVE FOR MATHEMATICAL LOGIC
, 1998
Abstract

Cited by 16 (1 self)
Inductive characterizations of the sets of terms, the subset of strongly normalizing terms and normal forms are studied in order to reprove weak and strong normalization for the simply-typed λ-calculus and for an extension by sum types with permutative conversions. The analogous treatment of a new system with generalized applications inspired by von Plato's generalized elimination rules in natural deduction shows the flexibility of the approach, which does not use the strong computability/candidate style à la Tait and Girard. It is also shown that the extension of the system with permutative conversions by η-rules is still strongly normalizing, and likewise for an extension of the system of generalized applications by a rule of "immediate simplification". By introducing an infinitely branching inductive rule the method even extends to Gödel's T.
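The inductive characterization of normal forms mentioned above is short to state: a term is normal if it is an abstraction of a normal term or a "neutral" term, i.e. a variable applied to normal arguments. A sketch over an untyped term encoding of our own (not the paper's notation):

```python
# t ::= ('var', x) | ('lam', x, t) | ('app', t, t)

def is_neutral(t):
    """Neutral terms: x N1 ... Nk, a variable applied to normal terms."""
    if t[0] == 'var':
        return True
    return t[0] == 'app' and is_neutral(t[1]) and is_normal(t[2])

def is_normal(t):
    """Normal forms: lam x. N with N normal, or a neutral term."""
    if t[0] == 'lam':
        return is_normal(t[2])
    return is_neutral(t)

identity = ('lam', 'x', ('var', 'x'))
redex = ('app', identity, ('var', 'y'))   # (λx.x) y contains a β-redex
print(is_normal(identity), is_normal(redex))  # True False
```

Because the head of an application must be neutral (never a lambda), no β-redex can occur in a term accepted by `is_normal`; the paper's proofs exploit exactly this kind of inductive generation of term classes.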
Proof Interpretations and the Computational Content of Proofs. Draft of book in preparation
, 2007
Abstract

Cited by 14 (1 self)
This survey reports on some recent developments in the project of applying proof theory to proofs in core mathematics. The historical roots, however, go back to Hilbert’s central theme in the foundations of mathematics which can be paraphrased by the following question
Uniform Heyting arithmetic
 Annals of Pure and Applied Logic
, 2005
Abstract

Cited by 13 (0 self)
We present an extension of Heyting Arithmetic in finite types called Uniform Heyting Arithmetic (HA^u) that allows for the extraction of optimized programs from constructive and classical proofs. The system HA^u has two sorts of first-order quantifiers: ordinary quantifiers governed by the usual rules, and uniform quantifiers subject to stronger variable conditions expressing roughly that the quantified object is not computationally used in the proof. We combine a Kripke-style Friedman/Dragalin translation, which is inspired by work of Coquand and Hofmann, with a variant of the refined A-translation due to Buchholz, Schwichtenberg and the author to extract programs from a rather large class of classical first-order proofs, while keeping explicit control over the levels of recursion and the decision procedures for predicates used in the extracted program. According to the Brouwer-Heyting-Kolmogorov interpretation of constructive logic, a proof is a construction providing evidence for the proven formula [20]. Viewing this interpretation from a data-oriented perspective, one arrives at the so-called proofs-as-programs paradigm, associating a constructive proof with a program 'realizing' the proven formula. This paradigm has been
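The proofs-as-programs reading can be illustrated with a classical example: Euclid's constructive proof that for every n there is a prime p > n yields, when read as a program, a procedure computing the witness. A sketch of that idea only, not of this paper's extraction mechanism:

```python
from math import factorial

def prime_above(n: int) -> int:
    """Witness from Euclid's proof: every divisor of n! + 1 in the
    range 2..n leaves remainder 1, so the smallest divisor d > 1 of
    n! + 1 is a prime exceeding n."""
    c = factorial(n) + 1
    d = 2
    while c % d != 0:
        d += 1
    return d  # smallest nontrivial divisor, hence prime, hence > n

print(prime_above(4))  # 4! + 1 = 25, smallest prime factor is 5
```

Refined translations such as the one in this paper aim at extracting programs like this from classical proofs as well, while controlling how much recursion and case-distinction machinery the extracted term actually uses.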
Foundational and mathematical uses of higher types
 REFLECTIONS ON THE FOUNDATIONS OF MATHEMATICS: ESSAY IN HONOR OF SOLOMON FEFERMAN
, 1999
Abstract

Cited by 13 (4 self)
In this paper we develop mathematically strong systems of analysis in higher types which, nevertheless, are proof-theoretically weak, i.e. conservative over elementary resp. primitive recursive arithmetic. These systems are based on non-collapsing hierarchies (n WKL+; n WKL+) of principles which generalize (and for n = 0 coincide with) the so-called 'weak' König's lemma WKL (which has been studied extensively in the context of second-order arithmetic) to logically more complex tree predicates. Whereas the second-order context used in the program of reverse mathematics requires an encoding of higher analytical concepts like continuous functions F : X → Y between Polish spaces X, Y, the more flexible language of our systems allows such objects to be treated directly. This is of relevance as the encoding of F used in reverse mathematics tacitly yields a constructively enriched notion of continuous function which, e.g. for F : ℕ → ℕ, can be seen (in our higher order context)