Results 1 - 5 of 5
Secret Key Agreement by Public Discussion From Common Information
 IEEE Transactions on Information Theory
, 1993
Abstract

Cited by 255 (18 self)
The problem of generating a shared secret key S by two parties knowing dependent random variables X and Y, respectively, but not sharing a secret key initially, is considered. An enemy who knows the random variable Z, jointly distributed with X and Y according to some probability distribution P_XYZ, can also receive all messages exchanged by the two parties over a public channel. The goal of a protocol is that the enemy obtains at most a negligible amount of information about S. Upper bounds on H(S) as a function of P_XYZ are presented. Lower bounds on the rate H(S)/N (as N → ∞) are derived for the case where X = [X_1, ..., X_N], Y = [Y_1, ..., Y_N], and Z = [Z_1, ..., Z_N] result from N independent executions of a random experiment generating X_i, Y_i, and Z_i, for i = 1, ..., N. In particular it is shown that such secret key agreement is possible for a scenario where all three parties receive the output of a binary symmetric source over independent binary symmetr...
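As a toy numerical check of the satellite scenario the abstract mentions, the following sketch (helper names and crossover probabilities eA, eB, eE are our own illustration, not the paper's) evaluates Maurer's lower bound I(X;Y) − min(I(X;Z), I(Y;Z)) on the achievable key rate when a uniform bit R reaches Alice, Bob, and Eve over independent binary symmetric channels:

```python
import itertools
import math

def bsc(bit, eps, out):
    # Probability that a BSC with crossover probability eps maps `bit` to `out`.
    return 1 - eps if out == bit else eps

def joint_pxyz(eA, eB, eE):
    # Satellite scenario: a uniform bit R is sent over three
    # independent BSCs to Alice (X), Bob (Y), and Eve (Z).
    p = {}
    for r, x, y, z in itertools.product((0, 1), repeat=4):
        p[(x, y, z)] = p.get((x, y, z), 0.0) + \
            0.5 * bsc(r, eA, x) * bsc(r, eB, y) * bsc(r, eE, z)
    return p

def mutual_info(p, i, j):
    # I(A;B) for coordinates i and j of a joint distribution over triples.
    pij, pi, pj = {}, {}, {}
    for k, v in p.items():
        pij[(k[i], k[j])] = pij.get((k[i], k[j]), 0.0) + v
        pi[k[i]] = pi.get(k[i], 0.0) + v
        pj[k[j]] = pj.get(k[j], 0.0) + v
    return sum(v * math.log2(v / (pi[a] * pj[b]))
               for (a, b), v in pij.items() if v > 0)

# Eve's channel is noisier than Alice's and Bob's, so a positive rate is expected.
p = joint_pxyz(eA=0.1, eB=0.1, eE=0.2)
ixy = mutual_info(p, 0, 1)
ixz = mutual_info(p, 0, 2)
iyz = mutual_info(p, 1, 2)
lower_bound = ixy - min(ixz, iyz)   # lower bound on the secret key rate
```

With Eve's channel strictly noisier, `lower_bound` comes out positive, consistent with the abstract's claim that key agreement is possible in this scenario.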
Unconditionally Secure Key Agreement and the Intrinsic Conditional Information
, 1999
Abstract

Cited by 36 (7 self)
This paper is concerned with secret-key agreement by public discussion. Assume that two parties Alice and Bob and an adversary Eve have access to independent realizations of random variables X, Y, and Z, respectively, with joint distribution P_XYZ. The secret key rate S(X;Y||Z) has been defined as the maximal rate at which Alice and Bob can generate a secret key by communication over an insecure, but authenticated channel such that Eve's information about this key is arbitrarily small. We define a new conditional mutual information measure, the intrinsic conditional mutual information between X and Y when given Z, denoted by I(X;Y↓Z), which is an upper bound on S(X;Y||Z). The special scenarios are analyzed where X, Y, and Z are generated by sending a binary random variable R, for example a signal broadcast by a satellite, over independent channels, or two scenarios in which Z is generated by sending X and Y over erasure channels. In the first two scenarios it can be sho...
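The intrinsic information I(X;Y↓Z) is the infimum of I(X;Y|Z̄) over channels mapping Z to Z̄. A brute-force sketch of the idea (function names are ours; restricting the search to deterministic maps only approximates the infimum from above) on the classic example Z = X XOR Y, where I(X;Y|Z) is large but no key can be distilled:

```python
import itertools
import math

def cond_mutual_info(p):
    # I(X;Y|Z) for a dict p[(x, y, z)] -> probability.
    pz, pxz, pyz = {}, {}, {}
    for (x, y, z), v in p.items():
        pz[z] = pz.get(z, 0.0) + v
        pxz[(x, z)] = pxz.get((x, z), 0.0) + v
        pyz[(y, z)] = pyz.get((y, z), 0.0) + v
    return sum(v * math.log2(pz[z] * v / (pxz[(x, z)] * pyz[(y, z)]))
               for (x, y, z), v in p.items() if v > 0)

def intrinsic_info_det(p):
    # Approximate I(X;Y↓Z) by minimizing I(X;Y|f(Z)) over all deterministic
    # maps f on Z's alphabet -- an upper bound on the true infimum, which
    # ranges over all stochastic channels P(Z̄|Z).
    zs = sorted({z for (_, _, z) in p})
    best = float("inf")
    for images in itertools.product(zs, repeat=len(zs)):
        fmap = dict(zip(zs, images))
        q = {}
        for (x, y, z), v in p.items():
            key = (x, y, fmap[z])
            q[key] = q.get(key, 0.0) + v
        best = min(best, cond_mutual_info(q))
    return best

# X, Y independent uniform bits; Eve learns their parity Z = X XOR Y.
p = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}
cmi = cond_mutual_info(p)          # 1 bit: conditioning on Z couples X and Y
intrinsic = intrinsic_info_det(p)  # 0: a constant map f discards Z entirely
```

Here the constant map drives I(X;Y|f(Z)) down to I(X;Y) = 0, so the intrinsic information (and hence S(X;Y||Z)) vanishes even though I(X;Y|Z) = 1.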
Towards Characterizing when Information-Theoretic Secret Key Agreement is Possible
 Advances in Cryptology - ASIACRYPT '96, K. Kim and T. Matsumoto (Eds.), Lecture Notes in Computer Science
, 1996
Abstract

Cited by 9 (5 self)
This paper is concerned with information-theoretically secure secret key agreement in the general scenario where three parties, Alice, Bob, and Eve, know random variables X, Y, and Z, respectively, with joint distribution P_XYZ, for instance resulting from receiving a binary sequence of random bits broadcast by a satellite. We consider the problem of determining for a given distribution P_XYZ whether Alice and Bob can in principle, by communicating over an insecure channel accessible to Eve, generate a secret key about which Eve's information is arbitrarily small. The emphasis of this paper is on the possibility or impossibility of such key agreement for a large class of distributions P_XYZ more than on the efficiency of the protocols. When X, Y, and Z are arbitrary random variables that result from a binary random variable being sent through three independent channels, it is shown that secret key agreement is possible if and only if I(X;Y|Z) > 0, i.e., under the sole condition ...
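The criterion I(X;Y|Z) > 0 for the three-independent-channel case can be probed numerically. A sketch (the matrix encoding and function names are our own, with channels given as row-stochastic matrices W[r][output]), contrasting a scenario where all three channels are informative with one where Alice's channel is useless:

```python
import math

def cmi(p):
    # I(X;Y|Z) for a dict p[(x, y, z)] -> probability.
    pz, pxz, pyz = {}, {}, {}
    for (x, y, z), v in p.items():
        pz[z] = pz.get(z, 0.0) + v
        pxz[(x, z)] = pxz.get((x, z), 0.0) + v
        pyz[(y, z)] = pyz.get((y, z), 0.0) + v
    return sum(v * math.log2(pz[z] * v / (pxz[(x, z)] * pyz[(y, z)]))
               for (x, y, z), v in p.items() if v > 0)

def from_channels(WA, WB, WE):
    # A uniform bit R is sent through three independent channels to
    # Alice (X), Bob (Y), and Eve (Z); marginalize out R.
    p = {}
    for r in (0, 1):
        for x, px in enumerate(WA[r]):
            for y, py in enumerate(WB[r]):
                for z, pe in enumerate(WE[r]):
                    p[(x, y, z)] = p.get((x, y, z), 0.0) + 0.5 * px * py * pe
    return p

noisy = [[0.9, 0.1], [0.1, 0.9]]    # informative channel
useless = [[0.5, 0.5], [0.5, 0.5]]  # output independent of R

p_good = from_channels(noisy, noisy, noisy)   # I(X;Y|Z) > 0: key agreement possible
p_bad = from_channels(useless, noisy, noisy)  # X carries nothing: I(X;Y|Z) = 0
```

In the second case X is independent of (Y, Z), so the conditional mutual information vanishes and, per the paper's characterization, no secret key agreement is possible.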
Discriminatory Source Coding for a Noiseless Broadcast Channel
, 2005
Abstract

Cited by 7 (0 self)
We introduce a new problem of broadcast source coding with a discrimination requirement: there is an eavesdropping user from whom we wish to withhold the true message in an entropic sense. Binning can achieve the Slepian-Wolf rate, but at the cost of full information leakage to the eavesdropper. Our main result is a lower bound that implies that any entropically efficient broadcast scheme must be "like binning" in that it also must leak significant information to eavesdroppers.
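A toy random-binning sketch (block length n, index length r, and all names are our own illustration, not the paper's scheme) showing both sides of the trade-off the abstract describes: the broadcast bin index lets a receiver with correlated side information decode, while the very same index narrows the eavesdropper's candidate set:

```python
import itertools
import random

random.seed(0)
n, r = 10, 6   # toy block length and bin-index length (assumed values)

# Random binning: assign every length-n binary sequence a random r-bit bin index.
bins = {x: random.randrange(2 ** r)
        for x in itertools.product((0, 1), repeat=n)}

y = tuple(random.randrange(2) for _ in range(n))               # Bob's side information
x = tuple(b ^ int(i in (2, 7)) for i, b in enumerate(y))       # Alice's source: y with two flips

idx = bins[x]                                     # Alice broadcasts only the bin index
candidates = [c for c in bins if bins[c] == idx]  # any listener can enumerate the bin
xhat = min(candidates,
           key=lambda c: sum(a != b for a, b in zip(c, y)))    # Bob: closest bin member to y

# Leakage: the eavesdropper, lacking y, still learns that x lies in `candidates`,
# shrinking her candidate set from 2**n sequences to roughly 2**(n - r).
```

Bob's minimum-distance rule recovers a sequence at most as far from his side information as the true source, while the eavesdropper gains roughly r bits about x, which is the "like binning" leakage the lower bound formalizes.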
The Intrinsic Conditional Mutual Information and Perfect Secrecy
 In Proc. 1997 IEEE Symposium on Information Theory (Abstracts)
, 1997
Abstract

Cited by 5 (3 self)
This paper is concerned with secret key agreement by public discussion: two parties Alice and Bob and an adversary Eve have access to independent realizations of random variables X, Y, and Z, respectively, with joint distribution P_XYZ. The secret key rate S(X;Y||Z) has been defined as the maximal rate at which Alice and Bob can generate a secret key by communication over an insecure, but authenticated channel such that Eve's information about this key is arbitrarily small. We define a new conditional mutual information measure, the intrinsic conditional mutual information, denoted by I(X;Y↓Z), and show that it is an upper bound on S(X;Y||Z). The special scenarios where X, Y, and Z are generated by sending a binary random variable R, for example a signal broadcast by a satellite, over independent channels, or where Z is generated by sending X and Y over erasure channels, are analyzed. In the first scenario it can be shown, even for continuous random variables, that the s...