Results 1–9 of 9
Foundational and mathematical uses of higher types
REFLECTIONS ON THE FOUNDATIONS OF MATHEMATICS: ESSAYS IN HONOR OF SOLOMON FEFERMAN, 1999
Abstract:
In this paper we develop mathematically strong systems of analysis in higher types which, nevertheless, are proof-theoretically weak, i.e. conservative over elementary resp. primitive recursive arithmetic. These systems are based on non-collapsing hierarchies (n-WKL+; n-WKL+) of principles which generalize (and for n = 0 coincide with) the so-called `weak' König's lemma WKL (which has been studied extensively in the context of second-order arithmetic) to logically more complex tree predicates. Whereas the second-order context used in the program of reverse mathematics requires an encoding of higher analytical concepts like continuous functions F : X → Y between Polish spaces X, Y, the more flexible language of our systems allows us to treat such objects directly. This is of relevance as the encoding of F used in reverse mathematics tacitly yields a constructively enriched notion of continuous functions which e.g. for F : ℕ → ℕ can be seen (in our higher order context) …
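For orientation, the `weak' König's lemma that these hierarchies generalize has a standard second-order formulation (a textbook statement, not quoted from the paper itself):

```latex
\[
\mathrm{WKL}:\quad
\forall T \subseteq 2^{<\mathbb{N}}
\bigl(\text{$T$ is an infinite binary tree}
\;\rightarrow\;
\exists f \in 2^{\mathbb{N}}\ \forall n\ (\bar{f}(n) \in T)\bigr),
\]
```

where $\bar{f}(n)$ denotes the length-$n$ initial segment of the branch $f$. The paper's hierarchies replace the tree predicate by logically more complex ones.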
How to Convert the Flavor of a Quantum Bit Commitment
Eurocrypt 2001, Lecture Notes in Computer Science, 2001
Abstract:
In this paper we show how to convert a statistically binding but computationally concealing quantum bit commitment scheme into a computationally binding but statistically concealing qbc scheme. For a security parameter n, the construction of the statistically concealing scheme requires O(n²) executions of the statistically binding scheme. As a consequence, statistically concealing but computationally binding quantum bit commitments can be based upon any family of quantum one-way functions. Such a construction is not known to exist in the classical world.
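As a purely classical, illustrative analogue of the commit/open interface behind the two "flavors" discussed here (this is not the paper's quantum construction; all names are hypothetical, and the hiding and binding of this sketch are only computational, under idealized assumptions on the hash):

```python
import hashlib
import os

def commit(bit: int, nonce: bytes = None):
    """Commit to a bit; returns (commitment, opening).

    A random nonce masks the bit, so the commitment reveals nothing
    (heuristically); changing the bit later requires a hash collision.
    """
    if nonce is None:
        nonce = os.urandom(32)
    digest = hashlib.sha256(nonce + bytes([bit])).hexdigest()
    return digest, (bit, nonce)

def verify(commitment: str, opening) -> bool:
    """Check that an opening (bit, nonce) matches the commitment."""
    bit, nonce = opening
    return hashlib.sha256(nonce + bytes([bit])).hexdigest() == commitment

c, op = commit(1)
assert verify(c, op)            # honest opening succeeds
assert not verify(c, (0, op[1]))  # flipping the bit fails
```

The paper's point is that in the quantum setting the statistical/computational roles of concealing and binding can be swapped by a generic transformation, something with no known classical counterpart.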
Proof mining in L_1-approximation, 2001
Abstract:
In this paper we present another case study in the general project of proof mining, which means the logical analysis of prima facie non-effective proofs with the aim of extracting new computationally relevant data. We use techniques based on monotone functional interpretation (developed in [17]) to analyze Cheney's simplification [6] of Jackson's original proof [10] from 1921 of the uniqueness of the best L_1-approximation of continuous functions f ∈ C[0,1] by polynomials p ∈ P_n of degree ≤ n. Cheney's proof is non-effective in the sense that it is based on classical logic and on the non-computational principle WKL (binary König's lemma). The result of our analysis provides the first effective (in all parameters f, n and ε) uniform modulus of uniqueness (a concept which generalizes `strong uniqueness', studied extensively in approximation theory). Moreover, the extracted modulus has the optimal ε-dependency, as follows from Kroó [21]. The paper also describes how the uniform modulus of uniqueness can be used to compute the best L_1-approximations of a fixed f ∈ C[0,1] with arbitrary precision. We use this result to give a complexity upper bound on the computation of the best L_1-approximation in [24].
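As a minimal illustration of best L_1-approximation (only the degenerate case n = 0, on a discretized grid rather than all of C[0,1], so far simpler than the setting analyzed in the paper): the constant minimizing the discretized L_1 error is a median of the sampled values.

```python
def best_constant_L1(values):
    """Best degree-0 L1 approximation of sampled values: a median.

    Minimizes sum(|v - c| for v in values) over constants c.
    """
    s = sorted(values)
    return s[len(s) // 2]  # a minimizer (any median works)

# Sample f(x) = x^2 on a grid over [0, 1]
grid = [i / 100 for i in range(101)]
samples = [x * x for x in grid]
c = best_constant_L1(samples)

def l1_error(const):
    """Discretized L1 distance between the samples and a constant."""
    return sum(abs(v - const) for v in samples)

# The median is no worse than nearby constants
assert l1_error(c) <= l1_error(c + 0.01)
assert l1_error(c) <= l1_error(c - 0.01)
```

For degree ≥ 1 the best approximation is characterized by sign-change conditions and is what the paper's modulus of uniqueness quantifies; this sketch only shows the flavor of the optimization problem.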
Computational interpretations of analysis via products of selection functions
CiE 2010, invited talk on special session “Proof Theory and Computation”, 2010
Abstract:
We show that the computational interpretation of full comprehension via two well-known functional interpretations (Gödel's Dialectica interpretation and modified realizability) corresponds to two closely related infinite products of selection functions.
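The infinite products in question iterate a binary product of selection functions. A sketch of that binary case, following the standard Escardó–Oliva definition (the finite argmax domains are an added illustration, not from the paper):

```python
def argmax_selection(domain):
    """Selection function over a finite domain: picks a point maximizing q."""
    def eps(q):
        return max(domain, key=q)
    return eps

def product(eps, delta):
    """Binary product of selection functions.

    Given eps : (X -> R) -> X and delta : (Y -> R) -> Y, build a
    selection function for X x Y:
        b(x) = delta(lambda y: p((x, y)))   # best second move given x
        a    = eps(lambda x: p((x, b(x))))  # best first move, anticipating b
        result = (a, b(a))
    """
    def combined(p):
        def b(x):
            return delta(lambda y: p((x, y)))
        a = eps(lambda x: p((x, b(x))))
        return (a, b(a))
    return combined

eps = argmax_selection([0, 1])
delta = argmax_selection([0, 1, 2])
pair_selection = product(eps, delta)

# The product selects a pair maximizing p over the product space
best = pair_selection(lambda t: t[0] * 10 - (t[1] - 1) ** 2)
assert best == (1, 1)
```

Iterating this product infinitely often (with suitable continuity assumptions) is what yields the computational interpretations of comprehension discussed in the paper.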
OPEN QUESTIONS IN REVERSE MATHEMATICS, 2010
Abstract:
The objective of this paper is to provide a source of open questions in reverse mathematics and to point to areas where there could be interesting developments. The questions I discuss are mostly known and come from somewhere in the literature. My objective was to compile them in one place and discuss them in the context of related work. The list is definitely not comprehensive, and my …
TABLE OF CONTENTS, 2005
Abstract:
Thesis under the direction of Professor Mitch Wilkes. Object recognition and learning algorithms are huge areas of robotics research, with many different methods in use by various researchers. A common result of using complex recognition methods is the loss of meaning (for humans) in the subsequent processing of the data: when programs incorrectly identify objects, the reason why is often lost in the data analysis. If researchers can understand what the robot sees, they are better able to develop a system that has limited image understanding. Fuzzy models in object recognition are one of the better methods for achieving such a learning system. Our desire is to develop a system that is quickly and easily trained, that can relate its decisions about objects to the feature vector (the vector of measured characteristics of the object), and that is relatively simple in its calculation of results. The research was applied in conjunction with an experiment done by the Psychology Department; the system was applied to recorded videos of tasks performed by their subjects. The system proves to be quite effective in object recognition and provides …
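The abstract gives no details of the thesis's fuzzy model. As a generic, hypothetical sketch of the kind of approach described (the feature names, ranges, and object classes below are all invented for illustration), a fuzzy classifier over a feature vector can be as simple as:

```python
def triangular(a, b, c):
    """Triangular fuzzy membership function: 0 outside (a, c), peak 1 at b."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        if x <= b:
            return (x - a) / (b - a)
        return (c - x) / (c - b)
    return mu

# Hypothetical object classes, one membership function per feature
# (feature vector: [width_cm, height_cm])
classes = {
    "cup":  [triangular(5, 8, 12),   triangular(7, 10, 14)],
    "book": [triangular(12, 18, 25), triangular(20, 26, 32)],
}

def classify(feature_vector):
    """Score each class by the minimum membership across features."""
    scores = {
        name: min(mu(x) for mu, x in zip(mus, feature_vector))
        for name, mus in classes.items()
    }
    return max(scores, key=scores.get), scores

label, scores = classify([8, 10])
assert label == "cup"
```

Because each decision is just a vector of per-feature membership degrees, a human can inspect exactly which features drove (or blocked) a classification, which matches the interpretability goal stated in the abstract.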