Results 1 – 10 of 10
On the Impossibility of Constructing Non-Interactive Statistically-Secret Protocols from any Trapdoor One-Way Function
 In Topics in Cryptology – The Cryptographers' Track at the RSA Conference
, 2002
Abstract

Cited by 22 (0 self)
We show that non-interactive statistically-secret bit commitment cannot be constructed from arbitrary black-box one-to-one trapdoor functions, and thus from general public-key cryptosystems. Reducing the problems of non-interactive crypto-computing, rerandomizable encryption, non-interactive statistically-sender-private oblivious transfer, and low-communication private information retrieval to such commitment schemes, it follows that none of these primitives can be constructed from one-to-one trapdoor functions and public-key encryption in general. Furthermore, our...
Recursive computational depth
 Information and Computation
, 1999
Abstract

Cited by 18 (2 self)
In the 1980's, Bennett introduced computational depth as a formal measure of the amount of computational history that is evident in an object's structure. In particular, Bennett identified the classes of weakly deep and strongly deep sequences, and showed that the halting problem is strongly deep. Juedes, Lathrop, and Lutz subsequently extended this result by defining the class of weakly useful sequences, and proving that every weakly useful sequence is strongly deep. The present paper investigates refinements of Bennett's notions of weak and strong depth, called recursively weak depth (introduced by Fenner, Lutz and Mayordomo) and recursively strong depth (introduced here). It is argued that these refinements naturally capture Bennett's idea that deep objects are those which "contain internal evidence of a nontrivial causal history." The fundamental properties of recursive computational depth are developed, and it is shown that the recursively weakly (respectively, strongly) deep sequences form a proper subclass of the class of weakly (respectively, strongly) deep sequences. The above-mentioned theorem of Juedes, Lathrop, and Lutz is then strengthened by proving that every weakly useful sequence is recursively strongly deep. It follows from these results that not every strongly deep sequence is weakly useful, thereby answering a question posed by Juedes.
Prediction and Dimension
 Journal of Computer and System Sciences
, 2002
Abstract

Cited by 18 (3 self)
Given a set X of sequences over a finite alphabet, we investigate the following three quantities. (i) The feasible predictability of X is the highest success ratio that a polynomial-time randomized predictor can achieve on all sequences in X.
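To make the notion of a predictor's success ratio concrete, here is a minimal sketch. It is not the paper's construction; the majority-of-history rule and the binary alphabet are illustrative assumptions. A predictor reads a sequence one symbol at a time, guesses the next symbol from the prefix seen so far, and its success ratio is the fraction of correct guesses.

```python
def success_ratio(seq: str) -> float:
    """Success ratio of a toy predictor on a binary sequence.

    The predictor guesses the majority symbol of the prefix seen
    so far (ties default to '0'), then observes the true symbol.
    """
    correct = 0
    for i in range(len(seq)):
        history = seq[:i]
        guess = "1" if history.count("1") > history.count("0") else "0"
        if guess == seq[i]:
            correct += 1
    return correct / len(seq)

print(success_ratio("0000000001"))  # 0.9: only the final '1' is missed
```

On highly regular sequences such a simple rule already achieves a high ratio; the feasible predictability of a class X is the best ratio any polynomial-time randomized predictor can guarantee across all of X.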
Why Computational Complexity Requires Stricter Martingales
Abstract

Cited by 7 (0 self)
The word "martingale" has related, but different, meanings in probability theory and theoretical computer science. In computational complexity and algorithmic information theory, a martingale is typically a function d on strings such that E(d(wb) | w) = d(w) for all strings w, where the conditional expectation is computed over all possible values of the next symbol b. In modern probability theory a martingale is typically a sequence ξ0, ξ1, ξ2, ... of random variables such that E(ξn+1 | ξ0, ..., ξn) = ξn for all n.
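For a uniformly random next bit b ∈ {0, 1}, the complexity-theoretic condition E(d(wb) | w) = d(w) reduces to d(w) = (d(w0) + d(w1)) / 2. The following sketch (a toy betting strategy of my own choosing, not from the paper) checks that averaging condition numerically:

```python
def d(w: str) -> float:
    """A toy martingale on binary strings.

    Start with capital 1; on each bit, bet 3/4 of the probability
    mass on '1', so capital is multiplied by 2 * 0.75 for a '1'
    and 2 * 0.25 for a '0'. Fairness of the bet gives
    d(w) = (d(w0) + d(w1)) / 2 for every string w.
    """
    capital = 1.0
    for b in w:
        capital *= 2 * (0.75 if b == "1" else 0.25)
    return capital

# Verify the averaging condition on a few prefixes.
for w in ["", "0", "1", "01", "110"]:
    assert abs(d(w) - (d(w + "0") + d(w + "1")) / 2) < 1e-12
print("martingale condition holds")
```

The probability-theory notion instead constrains a sequence of random variables ξ0, ξ1, ξ2, ...; the paper's point is that these two readings of "martingale" do not coincide, and complexity theory needs the stricter one.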
Computational complexity via programming languages: constant factors do matter
 Acta Informat
The Emergent Computational Potential of Evolving Artificial Living Systems
, 2002
Abstract

Cited by 5 (0 self)
The computational potential of artificial living systems can be studied without knowing the algorithms that govern their behavior. Modeling single organisms by means of so-called cognitive transducers, we will estimate the computational power of AL systems by viewing them as conglomerates of such organisms. We describe a scenario in which an artificial living (AL) system is involved in a potentially infinite, unpredictable interaction with an active or passive environment, to which it can react by learning and adjusting its behaviour. By making use of sequences of cognitive transducers one can also model the evolution of AL systems caused by 'architectural' changes. Among the examples are 'communities of agents', i.e. communities of mobile, interactive cognitive transducers.
An efficient simulation of polynomialspace Turing machines by P systems with active membranes
 in: G. Păun, M.J. Pérez Jiménez, A. RiscosNúñez (Eds.), Preproceedings of the Tenth Workshop on Membrane Computing, 2009
Abstract

Cited by 1 (1 self)
Summary. We show that a deterministic single-tape Turing machine, operating in polynomial space with respect to the input length, can be efficiently simulated (both in terms of time and space) by a semi-uniform family of P systems with active membranes and three polarizations, using only communication rules. Then, building upon this simulation, we prove that a result similar to the space hierarchy theorem can be obtained for P systems with active membranes: the larger the amount of space we can use during the computations, the harder the problems we are able to solve.
P versus NP
 MONOGRAFÍAS DE LA REAL ACADEMIA DE CIENCIAS DE ZARAGOZA 26: 57–68
, 2004
The Emergent Computational Potential of Evolving Artificial Living Systems ∗
Abstract
The computational potential of artificial living systems can be studied without knowing the algorithms that govern their behavior. Modeling single organisms by means of so-called cognitive transducers, we will estimate the computational power of AL systems by viewing them as conglomerates of such organisms. We describe a scenario in which an artificial living (AL) system is involved in a potentially infinite, unpredictable interaction with an active or passive environment, to which it can react by learning and adjusting its behaviour. By making use of sequences of cognitive transducers one can also model the evolution of AL systems caused by 'architectural' changes. Among the examples are 'communities of agents', i.e. communities of mobile, interactive cognitive transducers. Most AL systems show the emergence of a computational power that is not present at the level of the individual organisms. Indeed, in all but trivial cases the resulting systems possess a super-Turing computing power. This means that the systems cannot be simulated by traditional computational models like Turing machines and may in principle solve non-computable tasks. The results are derived using non-uniform complexity theory. "What we can do is understand some of the general principles of how living things work, and why they exist at all." From: R. Dawkins, The Blind Watchmaker, 1986.
Why Computational Complexity Requires Stricter Martingales
Abstract
The word "martingale" has related, but different, meanings in probability theory and theoretical computer science. In computational complexity and algorithmic information theory, a martingale is typically a function d on strings such that E(d(wb) | w) = d(w) for all strings w, where the conditional expectation is computed over all possible values of the next symbol b. In modern probability theory a martingale is typically a sequence ξ0, ξ1, ξ2, ... of random variables such that E(ξn+1 | ξ0, ..., ξn) = ξn for all n.