Results 1–10 of 52
Secret-Key Reconciliation by Public Discussion
, 1994
"... . Assuming that Alice and Bob use a secret noisy channel (modelled by a binary symmetric channel) to send a key, reconciliation is the process of correcting errors between Alice's and Bob's version of the key. This is done by public discussion, which leaks some information about the secret key to an ..."
Abstract

Cited by 93 (3 self)
Assuming that Alice and Bob use a secret noisy channel (modelled by a binary symmetric channel) to send a key, reconciliation is the process of correcting errors between Alice's and Bob's version of the key. This is done by public discussion, which leaks some information about the secret key to an eavesdropper. We show how to construct protocols that leak a minimum amount of information. However, this construction cannot be implemented efficiently. If Alice and Bob are willing to reveal an arbitrarily small amount of additional information (beyond the minimum) then they can implement polynomial-time protocols. We also present a more efficient protocol, which leaks an amount of information acceptably close to the minimum possible for sufficiently reliable secret channels (those with probability of any symbol being transmitted incorrectly as large as 15%). This work improves on earlier reconciliation approaches [R, BBR, BBBSS]. 1 Introduction Unlike public key cryptosystems, the securi...
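To make the idea of reconciliation by public discussion concrete, here is a hypothetical, simplified sketch in the spirit of interactive parity protocols (our own illustration, not the minimum-leakage construction of the paper): Alice and Bob compare block parities over the public channel and binary-search for an error inside each disagreeing block; every disclosed parity is one bit leaked to the eavesdropper.

```python
def parity(bits):
    """Parity (XOR) of a list of 0/1 values."""
    return sum(bits) % 2

def binary_locate(alice, bob, lo, hi):
    """Binary search for one error position in [lo, hi), a block whose
    parities differ.  Each parity comparison costs one publicly
    disclosed bit."""
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if parity(alice[lo:mid]) != parity(bob[lo:mid]):
            hi = mid
        else:
            lo = mid
    return lo

def reconcile(alice, bob, block_size):
    """One pass of a simple interactive reconciliation: split the keys
    into blocks and, in every block whose parities disagree, locate and
    flip one of Bob's bits.  A single pass corrects one error per
    disagreeing block (blocks with an even number of errors pass
    unnoticed, which is why practical protocols run several passes)."""
    bob = list(bob)
    for start in range(0, len(alice), block_size):
        end = min(start + block_size, len(alice))
        if parity(alice[start:end]) != parity(bob[start:end]):
            pos = binary_locate(alice, bob, start, end)
            bob[pos] ^= 1
    return bob
```

The parities revealed during this exchange are exactly the "leaked information" the abstract refers to; the paper's contribution is protocols whose leakage approaches the information-theoretic minimum.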
Universality in quantum computation
 Proc. R. Soc. London A
, 1995
"... We show that in quantum computation almost every gate that operates on two or more bits is a universal gate. We discuss various physical considerations bearing on the proper definition of universality for computational components such as logic gates. ..."
Abstract

Cited by 69 (3 self)
We show that in quantum computation almost every gate that operates on two or more bits is a universal gate. We discuss various physical considerations bearing on the proper definition of universality for computational components such as logic gates.
On Some Methods for Unconditionally Secure Key Distribution and Broadcast Encryption
 Designs, Codes and Cryptography
, 1996
"... This paper provides an exposition of methods by which a trusted authority can distribute keys and/or broadcast a message over a network, so that each member of a privileged subset of users can compute a specified key or decrypt the broadcast message. Moreover, this is done in such a way that no coal ..."
Abstract

Cited by 50 (8 self)
This paper provides an exposition of methods by which a trusted authority can distribute keys and/or broadcast a message over a network, so that each member of a privileged subset of users can compute a specified key or decrypt the broadcast message. Moreover, this is done in such a way that no coalition is able to recover any information on a key or broadcast message they are not supposed to know. The problems are studied using the tools of information theory, so the security provided is unconditional (i.e., not based on any computational assumption). We begin by surveying some useful schemes for key distribution that have been presented in the literature, giving background and examples (but not too many proofs). In particular, we look more closely at the attractive concept of key distribution patterns, and present a new method for making these schemes more efficient through the use of resilient functions. Then we present a general approach to the construction of broadcast sch...
Dynamical Sources in Information Theory: Fundamental intervals and Word Prefixes.
, 1998
"... A quite general model of source that comes from dynamical systems theory is introduced. Within this model, some important problems about prefixes that intervene in algorithmic information theory contexts are analysed. The main tool is a new object, the generalized Ruelle operator, which can be viewe ..."
Abstract

Cited by 28 (7 self)
A quite general model of source that comes from dynamical systems theory is introduced. Within this model, some important problems about prefixes that intervene in algorithmic information theory contexts are analysed. The main tool is a new object, the generalized Ruelle operator, which can be viewed as a "generating" operator. Its dominant spectral objects are linked with important parameters of the source, such as the entropy, and play a central role in all the results. 1 Introduction. In information theory contexts, data items are (infinite) words that are produced by a common mechanism, called a source. Realistic sources are often complex objects. We work here inside a quite general framework of sources related to dynamical systems theory which goes beyond the cases of memoryless and Markov sources. This model can describe non-Markovian processes, where the dependence on past history is unbounded, and as such it attains a high level of generality. A probabilistic dynamical source ...
Unconditional security of practical quantum key distribution, arXiv:quant-ph/0107017
, 2001
"... We present a proof of unconditional security of a practical quantum key distribution protocol. It is an extension of a previous result obtained by Mayers [1, 2], which proves unconditional security provided that a perfect single photon source is used. In present days, perfect single photon sources a ..."
Abstract

Cited by 20 (1 self)
We present a proof of unconditional security of a practical quantum key distribution protocol. It is an extension of a previous result obtained by Mayers [1, 2], which proves unconditional security provided that a perfect single-photon source is used. At present, perfect single-photon sources are not available and, therefore, practical implementations use either dim laser pulses or post-selected states from parametric
Quantum physics and computers
 Contemporary Physics 38
, 1996
"... Recent theoretical results confirm that quantum theory provides the possibility of new ways of performing efficient calculations. The most striking example is the factoring problem. It has recently been shown that computers that exploit quantum features could factor large composite integers. This ta ..."
Abstract

Cited by 18 (0 self)
Recent theoretical results confirm that quantum theory provides the possibility of new ways of performing efficient calculations. The most striking example is the factoring problem. It has recently been shown that computers that exploit quantum features could factor large composite integers. This task is believed to be out of reach of classical computers as soon as the number of digits in the number to factor exceeds a certain limit. The additional power of quantum computers comes from the possibility of employing a superposition of states, of following many distinct computation paths and of producing a final output that depends on the interference of all of them. This “quantum parallelism” outstrips by far any parallelism that can be thought of in classical computation and is responsible for the “exponential” speedup of computation. Experimentally, however, it will be extremely difficult to “decouple” a quantum computer from its environment. Noise fluctuations due to the outside world, no matter how little, are sufficient to drastically reduce the performance of these
Transparent Proofs and Limits to Approximation
, 1994
"... We survey a major collective accomplishment of the theoretical computer science community on efficiently verifiable proofs. Informally, a formal proof is transparent (or holographic) if it can be verified with large confidence by a small number of spotchecks. Recent work by a large group of researc ..."
Abstract

Cited by 17 (0 self)
We survey a major collective accomplishment of the theoretical computer science community on efficiently verifiable proofs. Informally, a formal proof is transparent (or holographic) if it can be verified with large confidence by a small number of spot-checks. Recent work by a large group of researchers has shown that this seemingly paradoxical concept can be formalized and is feasible in a remarkably strong sense: every formal proof in ZF, say, can be rewritten in transparent format (proving the same theorem in a different proof system) without increasing the length of the proof by too much. This result in turn has surprising implications for the intractability of approximate solutions of a wide range of discrete optimization problems, extending the pessimistic predictions of the P ≠ NP theory to approximate solvability. We discuss the main results on transparent proofs and their implications to discrete optimization. We give an account of several links between the two subjects as well ...
Basic theorems about security
 Journal of Computer Security
, 1992
"... We build a mathematical structure in which we can ask questions about the methods for achieving security properties, such as confidentiality and integrity, and functionality properties, such as safety and liveness. The structure allows us to consider many different choices for the meaning of “confid ..."
Abstract

Cited by 14 (1 self)
We build a mathematical structure in which we can ask questions about the methods for achieving security properties, such as confidentiality and integrity, and functionality properties, such as safety and liveness. The structure allows us to consider many different choices for the meaning of “confidentiality” and “integrity” and so on, and to compare and contrast security properties with functionality properties.
Language evolution and information theory
 J. Theor. Biol.
, 2000
"... This paper places models of language evolution within the framework of information theory. We study how signals become associated with meaning. If there is a probability of mistaking signals for each other, then evolution leads to an error limit: increasing the number of signals does not increase th ..."
Abstract

Cited by 13 (2 self)
This paper places models of language evolution within the framework of information theory. We study how signals become associated with meaning. If there is a probability of mistaking signals for each other, then evolution leads to an error limit: increasing the number of signals does not increase the fitness of a language beyond a certain limit. This error limit can be overcome by word formation: a linear increase of the word length leads to an exponential increase of the maximum fitness. We develop a general model of word formation and demonstrate the connection between the error limit and Shannon's noisy coding theorem.
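The connection to Shannon's theorem can be made concrete with a small computation (our own illustrative sketch, not the paper's model): over a binary symmetric channel with crossover probability p, the capacity is C = 1 − H(p), and roughly 2^(L·C) words of length L can be kept reliably distinguishable, so a linear increase in word length yields an exponential increase in the number of usable words.

```python
import math

def bsc_capacity(p):
    """Capacity C = 1 - H(p), in bits per symbol, of a binary symmetric
    channel with crossover probability p."""
    if p in (0.0, 1.0):
        return 1.0
    h = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return 1.0 - h

def max_reliable_words(word_length, p):
    """By the noisy coding theorem, about 2^(L*C) words of length L can
    be transmitted with vanishing error probability: exponential growth
    in L, mirroring the word-formation argument."""
    return 2 ** (word_length * bsc_capacity(p))
```

At p = 0.5 the capacity vanishes (no word length helps), while for any p < 0.5 the number of reliable words grows exponentially in L, which is the error limit being overcome by word formation.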
Fast String Matching using an n-gram Algorithm
 Software: Practice and Experience
, 1994
"... This paper investigates the performance of the new algorithm on such data and compares it with other known algorithms. The results are very encouraging, suggesting that the expected running time theoretical results for ergodic sequences provide a very good estimate of the algorithm's performance on ..."
Abstract

Cited by 12 (0 self)
This paper investigates the performance of the new algorithm on such data and compares it with other known algorithms. The results are very encouraging, suggesting that the theoretical expected-running-time results for ergodic sequences provide a very good estimate of the algorithm's performance on DNA sequences. In fact the algorithm provides its most striking performance on longer pattern strings and small alphabet sizes.
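A minimal sketch of the n-gram idea (our own illustrative version, not necessarily the paper's exact algorithm): index the positions of every n-gram of the text, use the pattern's first n-gram as a filter for candidate alignments, then verify each candidate by direct comparison.

```python
from collections import defaultdict

def ngram_search(text, pattern, n=3):
    """Find all occurrences of pattern in text via an n-gram index.

    Build a position index of the text's n-grams, look up the pattern's
    first n-gram to get candidate starting positions, and verify each
    candidate with a direct slice comparison."""
    if len(pattern) < n or len(text) < len(pattern):
        return []
    index = defaultdict(list)
    for i in range(len(text) - n + 1):
        index[text[i:i + n]].append(i)
    return [start
            for start in index.get(pattern[:n], [])
            if text[start:start + len(pattern)] == pattern]
```

How well the filter pays off depends on how selective the n-grams are for the given alphabet size and pattern length, which is the trade-off the expected-running-time analysis for ergodic sequences captures.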