Results 1 – 7 of 7
On the Impossibility of Constructing Non-Interactive Statistically-Secret Protocols from any Trapdoor One-Way Function
 In Topics in Cryptology – The Cryptographers' Track at the RSA Conference
, 2002
"... We show that noninteractive statisticallysecret bit commitment cannot be constructed from arbitrary blackbox onetoone trapdoor functions and thus from general publickey cryptosystems. Reducing the problems of noninteractive cryptocomputing, rerandomizable encryption, and noninteractive stat ..."
Abstract

Cited by 21 (0 self)
We show that non-interactive statistically-secret bit commitment cannot be constructed from arbitrary black-box one-to-one trapdoor functions, and thus from general public-key cryptosystems. By reducing the problems of non-interactive crypto-computing, rerandomizable encryption, non-interactive statistically-sender-private oblivious transfer, and low-communication private information retrieval to such commitment schemes, it follows that these primitives are likewise not constructible from one-to-one trapdoor functions and public-key encryption in general. Furthermore, our...
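To make the object of this impossibility result concrete, the following is a minimal sketch of the *syntax* of a non-interactive bit commitment: the sender emits a single commit message and later a single opening. The hash-based instantiation below is only *computationally* hiding (it rests on SHA-256, a choice made here for illustration); the paper concerns *statistically*-secret commitments, whose hiding holds against unbounded receivers, and shows those cannot be built in a black-box way from trapdoor one-way functions.

```python
import os
import hashlib

def commit(bit: int) -> tuple[bytes, bytes]:
    """Commit to a bit with a single message (non-interactive).
    Returns (commitment, opening). Hiding here is only computational,
    unlike the statistically-secret schemes the paper discusses."""
    r = os.urandom(32)                                # fresh randomness
    c = hashlib.sha256(r + bytes([bit])).digest()     # the one commit message
    return c, r

def verify(c: bytes, bit: int, r: bytes) -> bool:
    """Check that (bit, r) opens the commitment c."""
    return hashlib.sha256(r + bytes([bit])).digest() == c

c, r = commit(1)
assert verify(c, 1, r)       # honest opening accepted
assert not verify(c, 0, r)   # opening to the other bit rejected (binding)
```

The single-message structure is what "non-interactive" means here; the impossibility result says no such scheme with statistical secrecy arises from black-box use of one-to-one trapdoor functions.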
Recursive computational depth
 Information and Computation
, 1999
"... In the 1980's, Bennett introduced computational depth as a formal measure of the amount of computational history that is evident in an object's structure. In particular, Bennett identi ed the classes of weakly deep and strongly deep sequences, and showed that the halting problem is strongly deep. Ju ..."
Abstract

Cited by 18 (2 self)
In the 1980's, Bennett introduced computational depth as a formal measure of the amount of computational history that is evident in an object's structure. In particular, Bennett identified the classes of weakly deep and strongly deep sequences, and showed that the halting problem is strongly deep. Juedes, Lathrop, and Lutz subsequently extended this result by defining the class of weakly useful sequences, and proving that every weakly useful sequence is strongly deep. The present paper investigates refinements of Bennett's notions of weak and strong depth, called recursively weak depth (introduced by Fenner, Lutz and Mayordomo) and recursively strong depth (introduced here). It is argued that these refinements naturally capture Bennett's idea that deep objects are those which "contain internal evidence of a nontrivial causal history." The fundamental properties of recursive computational depth are developed, and it is shown that the recursively weakly (respectively, strongly) deep sequences form a proper subclass of the class of weakly (respectively, strongly) deep sequences. The above-mentioned theorem of Juedes, Lathrop, and Lutz is then strengthened by proving that every weakly useful sequence is recursively strongly deep. It follows from these results that not every strongly deep sequence is weakly useful, thereby answering a question posed by Juedes.
Prediction and Dimension
 Journal of Computer and System Sciences
, 2002
"... Given a set X of sequences over a nite alphabet, we investigate the following three quantities. (i) The feasible predictability of X is the highest success ratio that a polynomialtime randomized predictor can achieve on all sequences in X. ..."
Abstract

Cited by 17 (3 self)
Given a set X of sequences over a finite alphabet, we investigate the following three quantities. (i) The feasible predictability of X is the highest success ratio that a polynomial-time randomized predictor can achieve on all sequences in X.
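The "success ratio" in (i) is the fraction of positions at which a predictor, shown a prefix, correctly guesses the next symbol. The following is a small sketch of that notion; the names `success_ratio` and `majority_predictor` are illustrative choices, not from the paper, and the toy predictor is deterministic rather than randomized.

```python
from typing import Callable

def success_ratio(predict: Callable[[str], str], seq: str) -> float:
    """Fraction of positions where predict(prefix) equals the next symbol."""
    hits = sum(predict(seq[:i]) == seq[i] for i in range(len(seq)))
    return hits / len(seq)

def majority_predictor(prefix: str) -> str:
    """Toy predictor: guess the symbol seen most often so far (default '0')."""
    return max("01", key=prefix.count) if prefix else "0"

# On a highly regular sequence this simple predictor scores 99/100;
# feasible predictability asks for the best ratio a polynomial-time
# randomized predictor can guarantee on *every* sequence in X.
ratio = success_ratio(majority_predictor, "0" * 99 + "1")  # 0.99
```

The paper's quantity is a worst-case supremum over the whole class X, which is what makes it a dimension-like measure rather than a per-sequence statistic.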
Why Computational Complexity Requires Stricter Martingales
"... The word "martingale " has related, but different, meanings in probability theory and theoretical computer science. In computational complexity and algorithmic information theory, a martingale is typically a function d on strings such that E(d(wb)w) = d(w) for all strings w, where the c ..."
Abstract

Cited by 6 (0 self)
The word "martingale" has related, but different, meanings in probability theory and theoretical computer science. In computational complexity and algorithmic information theory, a martingale is typically a function d on strings such that E[d(wb) | w] = d(w) for all strings w, where the conditional expectation is computed over all possible values of the next symbol b. In modern probability theory a martingale is typically a sequence ξ0, ξ1, ξ2, ... of random variables such that E[ξn+1 | ξ0, ..., ξn] = ξn for all n.
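For a binary alphabet with uniform next-bit distribution, the complexity-theoretic condition reads (d(w0) + d(w1)) / 2 = d(w): the bettor's expected capital after the next bit equals its current capital. A minimal sketch, assuming a fixed betting fraction (the 3/4 split below is an arbitrary illustrative choice):

```python
def d(w: str) -> float:
    """Martingale that always stakes 3/4 of its capital on the next bit
    being '0' and 1/4 on '1', starting with capital 1. A correct guess
    doubles the staked fraction."""
    cap = 1.0
    for b in w:
        cap *= 2 * (0.75 if b == "0" else 0.25)
    return cap

# Fairness (averaging) condition: E[d(wb) | w] = (d(w0) + d(w1)) / 2 = d(w)
for w in ("", "0", "1", "01", "110"):
    assert abs((d(w + "0") + d(w + "1")) / 2 - d(w)) < 1e-12
```

Any such fixed-fraction strategy satisfies the averaging condition exactly, since 2·(3/4) + 2·(1/4) = 2; the paper's point is about which *stricter* conditions this string-indexed formulation must satisfy to match the probability-theoretic one.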
Computational complexity via programming languages: constant factors do matter
 Acta Informatica
"... ..."
The Emergent Computational Potential of Evolving Artificial Living Systems
, 2002
"... The computational potential of artificial living systems can be studied without knowing the algorithms that govern their behavior. Modeling single organisms by means of socalled cognitive transducers, we will estimate the computational power of AL systems by viewing them as conglomerates of such org ..."
Abstract

Cited by 5 (0 self)
The computational potential of artificial living systems can be studied without knowing the algorithms that govern their behavior. Modeling single organisms by means of so-called cognitive transducers, we estimate the computational power of AL systems by viewing them as conglomerates of such organisms. We describe a scenario in which an artificial living (AL) system is involved in a potentially infinite, unpredictable interaction with an active or passive environment, to which it can react by learning and adjusting its behavior. By making use of sequences of cognitive transducers one can also model the evolution of AL systems caused by 'architectural' changes. Among the examples are 'communities of agents', i.e. communities of mobile, interactive cognitive transducers.
P versus NP
 Monografías de la Real Academia de Ciencias de Zaragoza 26: 57–68
, 2004
"... ..."