Results 1 - 10 of 38
Coding for Computing
 IEEE Transactions on Information Theory
, 1998
"... A sender communicates with a receiver who wishes to reliably evaluate a function of their combined data. We show that if only the sender can transmit, the number of bits required is a conditional entropy of a naturally defined graph. We also determine the number of bits needed when the communicators ..."
Abstract

Cited by 70 (0 self)
A sender communicates with a receiver who wishes to reliably evaluate a function of their combined data. We show that if only the sender can transmit, the number of bits required is a conditional entropy of a naturally defined graph. We also determine the number of bits needed when the communicators exchange two messages. 1 Introduction Let f be a function of two random variables X and Y. A sender P_X knows X, a receiver P_Y knows Y, and both want P_Y to reliably determine f(X, Y). How many bits must P_X transmit? Embedding this communication-complexity scenario (Yao [22]) in the standard information-theoretic setting (Shannon [17]), we assume that (1) f(X, Y) must be determined for a block of many independent (X, Y) instances, (2) P_X transmits after observing the whole block of X instances, (3) a vanishing block error probability is allowed, and (4) the problem's rate L_f(X|Y) is the number of bits transmitted for the block, normalized by the number of instances. Two simple bou...
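The graph in question can be illustrated concretely: two sender values are joined by an edge when some receiver value is jointly possible with both, yet the function distinguishes them. The sketch below builds such a characteristic graph for a toy function and support set invented for illustration (not taken from the paper):

```python
from itertools import combinations

def characteristic_graph(xs, ys, support, f):
    """Edge (x1, x2): some y is jointly possible with both x-values, yet
    f(x1, y) != f(x2, y), so the receiver must be able to tell them apart."""
    edges = set()
    for x1, x2 in combinations(xs, 2):
        for y in ys:
            if (x1, y) in support and (x2, y) in support and f(x1, y) != f(x2, y):
                edges.add((x1, x2))
                break
    return edges

# Toy instance (invented): f(x, y) = [x > y], full support on {0, 1, 2}^2.
xs = ys = [0, 1, 2]
support = {(x, y) for x in xs for y in ys}
edges = characteristic_graph(xs, ys, support, lambda x, y: x > y)
```

Here every pair of x-values is confusable, so the graph is complete and the sender can save nothing; sparser supports give sparser graphs and smaller rates.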
Visual Cryptography for General Access Structures
, 1996
"... A visual cryptography scheme for a set P of n participants is a method to encode a secret image SI into n shadow images called shares, where each participant in P receives one share. Certain qualified subsets of participants can "visually" recover the secret image, but other, forbidden, sets of part ..."
Abstract

Cited by 70 (8 self)
A visual cryptography scheme for a set P of n participants is a method to encode a secret image SI into n shadow images called shares, where each participant in P receives one share. Certain qualified subsets of participants can "visually" recover the secret image, but other, forbidden, sets of participants have no information (in an information-theoretic sense) on SI. A "visual" recovery for a set X ⊆ P consists of xeroxing the shares given to the participants in X onto transparencies, and then stacking them. The participants in a qualified set X will be able to see the secret image without any knowledge of cryptography and without performing any cryptographic computation. In this paper we propose two techniques to construct visual cryptography schemes for general access structures. We analyze the structure of visual cryptography schemes and we prove bounds on the size of the shares distributed to the participants in the scheme. We provide a novel technique to realize k out of n thre...
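The flavor of "stacking transparencies" is easiest to see in the classic 2-out-of-2 scheme: each secret pixel expands to two subpixels per share, identical patterns for a white pixel and complementary patterns for a black one. A minimal sketch of one standard variant (not the paper's general construction):

```python
import random

PATTERNS = [(0, 1), (1, 0)]  # 1 = dark subpixel

def make_shares(secret, rng=random):
    """2-out-of-2 shares: same pattern for a white pixel (0),
    complementary patterns for a black pixel (1)."""
    share1, share2 = [], []
    for pixel in secret:
        p = rng.choice(PATTERNS)
        share1.append(p)
        share2.append(p if pixel == 0 else tuple(1 - b for b in p))
    return share1, share2

def stack(s1, s2):
    """Stacking transparencies = pixelwise OR of dark subpixels."""
    return [tuple(a | b for a, b in zip(p1, p2)) for p1, p2 in zip(s1, s2)]

secret = [0, 1, 1, 0]
s1, s2 = make_shares(secret)
recovered = [sum(p) for p in stack(s1, s2)]  # 1 dark subpixel -> white, 2 -> black
```

Each share on its own is a uniformly random pattern and so reveals nothing about the secret pixel, while the stacked pair shows black pixels as fully dark blocks.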
Derandomization, witnesses for Boolean matrix multiplication and construction of perfect hash functions
 Algorithmica
, 1996
"... Small sample spaces with almost independent random variables are applied to design efficient sequential deterministic algorithms for two problems. The first algorithm, motivated by the attempt to design efficient algorithms for the All Pairs Shortest Path problem using fast matrix multiplication, so ..."
Abstract

Cited by 62 (6 self)
Small sample spaces with almost independent random variables are applied to design efficient sequential deterministic algorithms for two problems. The first algorithm, motivated by the attempt to design efficient algorithms for the All Pairs Shortest Path problem using fast matrix multiplication, solves the problem of computing witnesses for the Boolean product of two matrices. That is, if A and B are two n by n matrices, and C = AB is their Boolean product, the algorithm finds for every entry C_ij = 1 a witness: an index k so that A_ik = B_kj = 1. Its running time exceeds that of computing the product of two n by n matrices with small integer entries by a polylogarithmic factor. The second algorithm is a nearly linear time deterministic procedure for constructing a perfect hash function for a given n-subset of {1, ..., m}.
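What a witness matrix is can be shown by the naive cubic scan below; the paper's contribution is computing the same object nearly as fast as matrix multiplication, which this illustrative sketch does not attempt:

```python
def boolean_product_witnesses(A, B):
    """For each (i, j) with (AB)[i][j] == 1 in the Boolean product, record a
    witness k with A[i][k] == B[k][j] == 1. Stored as k + 1 so that 0 can
    serve as the sentinel for 'no witness' (i.e. C[i][j] == 0)."""
    n = len(A)
    W = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                if A[i][k] and B[k][j]:
                    W[i][j] = k + 1
                    break
    return W

# Tiny example: A is the identity, B swaps the two columns.
A = [[1, 0], [0, 1]]
B = [[0, 1], [1, 0]]
W = boolean_product_witnesses(A, B)
```

For entries where C_ij = 1 there may be many valid witnesses; any one suffices for the shortest-path application.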
On Some Methods for Unconditionally Secure Key Distribution and Broadcast Encryption
 Designs, Codes and Cryptography
, 1996
"... This paper provides an exposition of methods by which a trusted authority can distribute keys and/or broadcast a message over a network, so that each member of a privileged subset of users can compute a specified key or decrypt the broadcast message. Moreover, this is done in such a way that no coal ..."
Abstract

Cited by 50 (8 self)
This paper provides an exposition of methods by which a trusted authority can distribute keys and/or broadcast a message over a network, so that each member of a privileged subset of users can compute a specified key or decrypt the broadcast message. Moreover, this is done in such a way that no coalition is able to recover any information on a key or broadcast message they are not supposed to know. The problems are studied using the tools of information theory, so the security provided is unconditional (i.e., not based on any computational assumption). We begin by surveying some useful schemes for key distribution that have been presented in the literature, giving background and examples (but not too many proofs). In particular, we look more closely at the attractive concept of key distribution patterns, and present a new method for making these schemes more efficient through the use of resilient functions. Then we present a general approach to the construction of broadcast sch...
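The defining property of a key distribution pattern can be checked by brute force on small instances: every privileged t-set of users must share some key block that contains all of them and avoids every disjoint coalition of size w. A sketch using a hypothetical toy pattern (one key per pair of users), not a scheme from the paper:

```python
from itertools import combinations

def is_kdp(blocks, users, t, w):
    """Key-distribution-pattern check: every t-subset P of users must share
    a key block containing all of P and disjoint from each w-coalition F
    drawn from the remaining users, so P's common key stays secret from F."""
    for P in combinations(users, t):
        for F in combinations(set(users) - set(P), w):
            if not any(set(P) <= B and not (set(F) & B) for B in blocks):
                return False
    return True

# Hypothetical toy pattern: one key per pair of users. This is trivially a
# KDP for privileged pairs (t = 2) against single outsiders (w = 1).
users = range(4)
blocks = [set(p) for p in combinations(users, 2)]
```

By contrast, a single key known to everyone fails the property, since any outside coalition holds it too.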
Secure Frameproof Codes, Key Distribution Patterns, Group Testing Algorithms and Related Structures
 Journal of Statistical Planning and Inference
, 1997
"... Frameproof codes were introduced by Boneh and Shaw as a method of "digital fingerprinting" which prevents a coalition of a specified size c from framing a user not in the coalition. Stinson and Wei then gave a combinatorial formulation of the problem in terms of certain types of extremal set sytems. ..."
Abstract

Cited by 46 (11 self)
Frameproof codes were introduced by Boneh and Shaw as a method of "digital fingerprinting" which prevents a coalition of a specified size c from framing a user not in the coalition. Stinson and Wei then gave a combinatorial formulation of the problem in terms of certain types of extremal set systems. In this paper, we study frameproof codes that provide a certain (weak) form of traceability. We extend our combinatorial formulation to address this stronger requirement, and show that the problem is solved by using (i, j)-separating systems, as defined by Friedman, Graham and Ullman. Using constructions based on perfect hash families, we give the first efficient explicit constructions for these objects for general values of i and j. We also review nonconstructive existence results that are based on probabilistic arguments. Then we look at two other, related concepts, namely key distribution patterns and nonadaptive group testing algorithms. We again approach these problems from the point...
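The frameproof condition itself is simple to test exhaustively on tiny codes: no coalition of at most c codewords may be able to assemble, position by position, the codeword of a user outside the coalition. A sketch (the example codes are illustrative only):

```python
from itertools import combinations

def is_frameproof(code, c):
    """A code is c-frameproof if no coalition of at most c codewords can
    jointly produce (coordinate by coordinate) another user's codeword."""
    for size in range(1, c + 1):
        for coalition in combinations(code, size):
            cols = [set(col) for col in zip(*coalition)]
            for w in code:
                if w not in coalition and all(b in col for b, col in zip(w, cols)):
                    return False
    return True

# The 'identity' code {100, 010, 001} is 2-frameproof: any two codewords
# still lack the third user's distinguishing 1-coordinate.
code = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
```

The full binary code of length 2 is not 2-frameproof: the coalition {00, 11} sees both symbols in both positions and can frame anyone.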
Interactive Communication of Balanced Distributions and of Correlated Files
, 1993
"... (X; Y ) is a pair of random variables distributed over a support set S. Person PX knows X, Person P Y knows Y , and both know S. Using a predetermined protocol, they exchange binary messages in order for P Y to learn X. PX may or may not learn Y . The mmessage complexity, Cm , is the number of ..."
Abstract

Cited by 40 (1 self)
(X, Y) is a pair of random variables distributed over a support set S. Person P_X knows X, Person P_Y knows Y, and both know S. Using a predetermined protocol, they exchange binary messages in order for P_Y to learn X. P_X may or may not learn Y. The m-message complexity, C_m, is the number of information bits that must be transmitted (by both persons) in the worst case if only m messages are allowed. C_∞ is the number of bits required when there is no restriction on the number of messages exchanged. We consider a natural class of random pairs. μ is the maximum number of X values possible with a given Y value. η is the maximum number of Y values possible with a given X value. The random pair (X, Y) is balanced if μ = η. The following hold for all balanced random pairs. One-way communication requires at most twice the minimum number of bits: C_1 ≤ 2C_∞ + 1. This bound is almost tight: for every α, there is a balanced random pair for which C_1 ≥ 2C_∞ − 6α. Three...
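The quantities μ and η can be read directly off a support set, which makes the "balanced" condition easy to check. A small sketch with an invented balanced support (values differing by at most 1 mod 5):

```python
from collections import defaultdict

def ambiguities(support):
    """mu = max number of X-values compatible with a single Y-value;
    eta = max number of Y-values compatible with a single X-value.
    The pair is balanced when mu == eta."""
    per_y, per_x = defaultdict(set), defaultdict(set)
    for x, y in support:
        per_y[y].add(x)
        per_x[x].add(y)
    mu = max(len(s) for s in per_y.values())
    eta = max(len(s) for s in per_x.values())
    return mu, eta

# Invented support: x and y differ by at most 1 (mod 5) -- symmetric,
# hence balanced with mu = eta = 3.
S = {(x, y) for x in range(5) for y in range(5) if (x - y) % 5 in (0, 1, 4)}
mu, eta = ambiguities(S)
```

Any symmetric support like this one is automatically balanced; asymmetric supports (say, y determined by x but not conversely) give η = 1 < μ.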
Splitters and NearOptimal Derandomization
, 1995
"... We present a fairly general method for finding deterministic constructions obeying what we call k restrictions; this yields structures of size not much larger than the probabilistic bound. The structures constructed by our method include (n; k)universal sets (a collection of binary vectors of leng ..."
Abstract

Cited by 39 (2 self)
We present a fairly general method for finding deterministic constructions obeying what we call k-restrictions; this yields structures of size not much larger than the probabilistic bound. The structures constructed by our method include (n, k)-universal sets (a collection of binary vectors of length n such that for any subset of size k of the indices, all 2^k configurations appear) and families of perfect hash functions. The near-optimal constructions of these objects imply the very efficient derandomization of algorithms in learning, of fixed-subgraph finding algorithms, and of near-optimal ΣΠΣ threshold formulae. In addition, they derandomize the reduction showing the hardness of approximation of set cover. They also yield deterministic constructions for a local-coloring protocol, and for exhaustive testing of circuits.
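The (n, k)-universality property can be verified by brute force on small instances, which makes the definition concrete; the examples below are trivial illustrations, not the paper's constructions:

```python
from itertools import combinations, product

def is_universal(vectors, n, k):
    """Check that a family of binary n-vectors is (n, k)-universal: every
    choice of k coordinate positions, restricted to the family, must
    exhibit all 2**k binary patterns."""
    for idx in combinations(range(n), k):
        seen = {tuple(v[i] for i in idx) for v in vectors}
        if len(seen) < 2 ** k:
            return False
    return True

# All 2**n vectors are trivially (n, k)-universal; a single vector is not.
all_vecs = list(product((0, 1), repeat=3))
```

The point of the paper is that families of size close to the probabilistic bound (roughly 2^k · k · log n rather than 2^n) exist and can be built deterministically.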
Source Coding and Graph Entropies
 IEEE Trans. Inform. Theory
, 1995
"... A sender wants to accurately convey information to a receiver who has some, possibly related, data. We study the expected number of bits the sender must transmit for one and for multiple instances in two communication scenarios and relate this number to the chromatic and Korner entropies of a natura ..."
Abstract

Cited by 38 (0 self)
A sender wants to accurately convey information to a receiver who has some, possibly related, data. We study the expected number of bits the sender must transmit for one and for multiple instances in two communication scenarios and relate this number to the chromatic and Körner entropies of a naturally defined graph. 1 Introduction We study the expected number of bits a sender must transmit to convey information to a receiver who has some, possibly related, data. We consider single and multiple instances of two related scenarios. This section describes the two scenarios and the results obtained. We begin with the familiar, standard source-coding scenario, dubbed restricted inputs because the inputs are restricted to belong to a given support set. 1.1 Restricted inputs (X, Y) is a pair of random variables distributed over a countable product set X × Y according to a probability distribution p(x, y). A sender P_X knows X while a receiver P_Y knows Y and wants to learn X without e...
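The coloring connection behind chromatic entropy can be sketched as follows: build the graph whose edges join X-values confusable under some common Y-value, properly color it, and transmit colors instead of symbols; the receiver recovers X by intersecting the announced color class with the X-values possible given its Y. The greedy coloring below is a generic stand-in for illustration, not the paper's optimal scheme:

```python
from itertools import combinations

def confusability_graph(support):
    """Two X-values are confusable if some Y-value is possible with both;
    the sender's code must separate exactly the confusable pairs."""
    per_y = {}
    for x, y in support:
        per_y.setdefault(y, set()).add(x)
    edges = set()
    for xs in per_y.values():
        edges |= set(combinations(sorted(xs), 2))
    return edges

def greedy_coloring(vertices, edges):
    """Proper coloring by first-fit; transmitting a color suffices because
    each color class contains at most one X-value per possible Y."""
    color = {}
    for v in sorted(vertices):
        taken = {color[u] for u in color
                 if (u, v) in edges or (v, u) in edges}
        color[v] = min(c for c in range(len(vertices) + 1) if c not in taken)
    return color

# Invented support: y = x // 2, so y narrows x down to a pair.
support = {(x, x // 2) for x in range(4)}
color = greedy_coloring({x for x, _ in support}, confusability_graph(support))
```

Here two colors suffice, so the sender spends 1 bit per instance instead of the 2 bits needed to name x outright.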
WorstCase Interactive Communication I: Two Messages are Almost Optimal
 IEEE Transactions on Information Theory
, 1990
"... X and Y are random variables. Person PX knows X, Person P Y knows Y , and both know the joint probability distribution of the pair (X; Y ). Using a predetermined protocol, they communicate over a binary, errorfree, channel in order for P Y to learn X. PX may or may not learn Y . How many informatio ..."
Abstract

Cited by 34 (6 self)
X and Y are random variables. Person P_X knows X, Person P_Y knows Y, and both know the joint probability distribution of the pair (X, Y). Using a predetermined protocol, they communicate over a binary, error-free channel in order for P_Y to learn X. P_X may or may not learn Y. How many information bits must be transmitted (by both persons) in the worst case if only m messages are allowed? C_1(X|Y) is the number of bits required when at most one message is allowed, necessarily from P_X to P_Y. C_2(X|Y) is the number of bits required when at most two messages are permitted: P_Y transmits a message to P_X, then P_X responds with a message to P_Y. C_∞(X|Y) is the number of bits required when communication is unrestricted: P_X and P_Y can communicate back and forth. The maximum reduction in communication achievable via interaction is almost logarithmic. For all (X, Y) pairs, C_∞(X|Y) ≥ ⌈log C_1(X|Y)⌉ + 1, whereas, for a class of (X, Y) pairs, C_∞(X|Y) = ⌈log C_1(...
A hypergraph approach to the identifying parent property: the case of multiple parents
 SIAM J. Disc. Math
, 2001
"... GREGORY KABATIANSKY ¶ , AND GILLES ZÉMOR � Abstract. Let C be a code of length n over analphabet of q letters. An nword y is called a descendant of a set of t codewords x1,...,xt if yi ∈{x1 i,...,xt i} for all i =1,...,n. A code is said to have the tidentifying parent property if for any nword th ..."
Abstract

Cited by 31 (5 self)
Gregory Kabatiansky, and Gilles Zémor. Abstract. Let C be a code of length n over an alphabet of q letters. An n-word y is called a descendant of a set of t codewords x^1, ..., x^t if y_i ∈ {x^1_i, ..., x^t_i} for all i = 1, ..., n. A code is said to have the t-identifying parent property if for any n-word that is a descendant of at most t parents it is possible to identify at least one of them. We prove that for any t ≤ q − 1 there exist sequences of such codes with asymptotically nonvanishing rate. Key words: Helly property, error-correcting codes, identifying parent property. AMS subject classifications: 94A60, 05C65. PII: S0895480100376848
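The t-identifying-parent property can be tested exhaustively on very small codes: for every word that is a descendant of some coalition of size at most t, all coalitions that could have produced it must share a common codeword, which is then an identifiable parent. A sketch with examples invented for illustration:

```python
from itertools import combinations, product

def descendants(coalition):
    """All n-words obtainable coordinate-wise from the coalition:
    position i may take any symbol appearing in position i of a parent."""
    cols = [set(c) for c in zip(*coalition)]
    return set(product(*cols))

def has_ipp(code, t):
    """t-IPP: every descendant of at most t parents pins down at least one
    parent, i.e. all suspect coalitions intersect in a common codeword."""
    coalitions = [set(c) for s in range(1, t + 1)
                  for c in combinations(code, s)]
    for y in {w for c in coalitions for w in descendants(c)}:
        suspects = [c for c in coalitions if y in descendants(c)]
        if not set.intersection(*suspects):
            return False
    return True

# The ternary repetition code {00, 11, 22} has the 2-IPP: any mixed
# descendant such as 01 betrays both of its parents.
good = [(0, 0), (1, 1), (2, 2)]
```

Over a binary alphabet the property fails already for t = 2: the word 00 can be explained by the parent 00 alone or by the disjoint coalition {01, 10}, so no parent is identifiable, consistent with the paper's requirement t ≤ q − 1.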