Results 1 – 10 of 23
Multiparty Communication Complexity
, 1989
Abstract

Cited by 621 (20 self)
A given Boolean function has its input distributed among many parties. The aim is to determine which parties to talk to and what information to exchange with each of them in order to evaluate the function while minimizing the total communication. This paper shows that it is possible to obtain the Boolean answer deterministically with only a polynomial increase in communication with respect to the information lower bound given by the nondeterministic communication complexity of the function.
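The model described above can be made concrete with a toy simulation. The sketch below is hypothetical (not from the paper): it evaluates the OR of bits held by the parties while counting the total bits exchanged, and the coordinator stops polling as soon as the answer is determined.

```python
def evaluate_or(party_bits):
    """Evaluate the OR of bits held by the parties, counting bits exchanged.

    Each queried party sends its single bit (one bit of communication);
    the coordinator stops as soon as the OR is determined.
    """
    bits_exchanged = 0
    for bit in party_bits:       # poll the parties one by one
        bits_exchanged += 1      # the queried party sends its bit
        if bit == 1:             # the OR is now determined; stop talking
            return 1, bits_exchanged
    return 0, bits_exchanged

result, cost = evaluate_or([0, 0, 1, 0])  # result == 1, cost == 3
```

Deciding which parties to talk to, and in what order, is exactly the kind of choice the paper's protocols optimize.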
The NP-completeness column: an ongoing guide
 Journal of Algorithms
, 1985
Abstract

Cited by 189 (0 self)
This is the nineteenth edition of a (usually) quarterly column that covers new developments in the theory of NP-completeness. The presentation is modeled on that used by M. R. Garey and myself in our book “Computers and Intractability: A Guide to the Theory of NP-Completeness,” W. H. Freeman & Co., New York, 1979 (hereinafter referred to as “[G&J]”; previous columns will be referred to by their dates). A background equivalent to that provided by [G&J] is assumed, and, when appropriate, cross-references will be given to that book and the list of problems (NP-complete and harder) presented there. Readers who have results they would like mentioned (NP-hardness, PSPACE-hardness, polynomial-time solvability, etc.) or open problems they would like publicized should ...
On Data Structures and Asymmetric Communication Complexity
 Journal of Computer and System Sciences
, 1994
Abstract

Cited by 84 (9 self)
In this paper we consider two-party communication complexity when the input sizes of the two players differ significantly, the "asymmetric" case. Most previous work on communication complexity considers only the total number of bits sent, but we study tradeoffs between the number of bits the first player sends and the number of bits the second sends. These ...
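A textbook illustration of such a tradeoff (the standard INDEX problem, not an example taken from this paper): Alice holds an n-bit string and Bob an index into it. If only Alice may speak she must send all n bits, but if Bob speaks first, ceil(log2 n) bits from him plus a single bit from Alice suffice.

```python
import math

def index_protocol(alice_string, bob_index):
    """Bob -> Alice: his index; Alice -> Bob: the requested bit."""
    bits_from_bob = math.ceil(math.log2(len(alice_string)))
    answer = alice_string[bob_index]   # Alice replies with one bit
    bits_from_alice = 1
    return answer, bits_from_bob, bits_from_alice

ans, b_bits, a_bits = index_protocol("01101100", 4)
# ans == "1", b_bits == 3, a_bits == 1 (versus 8 bits if only Alice speaks)
```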
Interactive Communication of Balanced Distributions and of Correlated Files
, 1993
Abstract

Cited by 40 (1 self)
(X, Y) is a pair of random variables distributed over a support set S. Person P_X knows X, Person P_Y knows Y, and both know S. Using a predetermined protocol, they exchange binary messages in order for P_Y to learn X. P_X may or may not learn Y. The m-message complexity, C_m, is the number of information bits that must be transmitted (by both persons) in the worst case if only m messages are allowed. C_∞ is the number of bits required when there is no restriction on the number of messages exchanged. We consider a natural class of random pairs. μ is the maximum number of X values possible with a given Y value; η is the maximum number of Y values possible with a given X value. The random pair (X, Y) is balanced if μ = η. The following hold for all balanced random pairs. One-way communication requires at most twice the minimum number of bits: C_1 ≤ 2·C_∞ + 1. This bound is almost tight: for every α, there is a balanced random pair for which C_1 ≥ 2·C_∞ − α. Three...
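The two ambiguity measures can be computed directly from a support set. The helper below is a hypothetical illustration (not from the paper): it finds the maximum number of X values consistent with any single Y value, the maximum number of Y values consistent with any single X value, and hence whether the pair is balanced (the two maxima coincide).

```python
from collections import defaultdict

def ambiguities(support):
    """Return (max #x per y, max #y per x) for a support set of (x, y) pairs."""
    xs_for_y = defaultdict(set)
    ys_for_x = defaultdict(set)
    for x, y in support:
        xs_for_y[y].add(x)
        ys_for_x[x].add(y)
    mu = max(len(v) for v in xs_for_y.values())   # ambiguity about X given Y
    eta = max(len(v) for v in ys_for_x.values())  # ambiguity about Y given X
    return mu, eta

# "Correlated files": x and y are equal-length strings differing in one spot.
S = [("aa", "ab"), ("aa", "ba"), ("bb", "ab"), ("bb", "ba")]
mu, eta = ambiguities(S)   # mu == 2 and eta == 2, so the pair is balanced
```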
Interaction in Quantum Communication and the Complexity of Set Disjointness
, 2001
Abstract

Cited by 35 (8 self)
One of the most intriguing facts about communication using quantum states is that these states cannot be used to transmit more classical bits than the number of qubits used, yet in some scenarios there are ways of conveying information with far fewer, even exponentially fewer, qubits than is possible classically [1], [2], [3]. Moreover, some of these methods have a very simple structure: they involve only a few message exchanges between the communicating parties. We consider the question of whether every classical protocol may be transformed into a "simpler" quantum protocol, one that has similar efficiency but uses fewer message exchanges.
Worst-Case Interactive Communication I: Two Messages are Almost Optimal
 IEEE Transactions on Information Theory
, 1990
Abstract

Cited by 33 (6 self)
X and Y are random variables. Person P_X knows X, Person P_Y knows Y, and both know the joint probability distribution of the pair (X, Y). Using a predetermined protocol, they communicate over a binary, error-free channel in order for P_Y to learn X. P_X may or may not learn Y. How many information bits must be transmitted (by both persons) in the worst case if only m messages are allowed? C_1(X|Y) is the number of bits required when at most one message is allowed, necessarily from P_X to P_Y. C_2(X|Y) is the number of bits required when at most two messages are permitted: P_Y transmits a message to P_X, then P_X responds with a message to P_Y. C_∞(X|Y) is the number of bits required when communication is unrestricted: P_X and P_Y can communicate back and forth. The maximum reduction in communication achievable via interaction is almost logarithmic. For all (X, Y) pairs, C_∞(X|Y) ≥ ⌈log C_1(X|Y)⌉ + 1, whereas, for a class of (X, Y) pairs, C_∞(X|Y) = ⌈log C_1(...
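As a quick arithmetic illustration (the numbers are assumed, not taken from the paper), the bound C_∞(X|Y) ≥ ⌈log C_1(X|Y)⌉ + 1 caps how much interaction can help:

```python
import math

def min_interactive_bits(c1):
    """Lower bound on unrestricted complexity from one-way complexity c1."""
    return math.ceil(math.log2(c1)) + 1

# Even if one-way communication needs 2**20 bits, no amount of back-and-forth
# interaction can push the worst-case cost below 21 bits:
print(min_interactive_bits(2**20))   # -> 21
```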
Lower Bounds for One-way Probabilistic Communication Complexity
, 1992
Abstract

Cited by 31 (2 self)
... this paper can be generalized to the optimal model? Acknowledgment: I wish to thank L. Hemachandra for inviting me to spend the spring semester at the University of Rochester and for his constant attention to my research and his help with all my problems, and J. Seiferas for extensive comments on an earlier draft of this paper. The results of Section 4.1 of the paper are the realization of J. Seiferas's advice to investigate the probabilistic complexity properties of almost all functions in comparison with Yao's [Y1] results. I also wish to thank P. Dietz for his comments, which helped to simplify the proof of Lemma 4.1.
The Communication Complexity of Threshold Gates
 In Proceedings of “Combinatorics, Paul Erdős is Eighty”
, 1994
Abstract

Cited by 29 (1 self)
We prove upper bounds on the randomized communication complexity of evaluating a threshold gate (with arbitrary weights). For linear threshold gates this is done in the usual 2-party communication model, and for degree-d threshold gates this is done in the multiparty model. We then use these upper bounds together with known lower bounds for communication complexity to give very easy proofs of lower bounds in various models of computation involving threshold gates. This generalizes several known bounds and answers several open problems.
Tight lower bounds for query processing on streaming and external memory data
 ICALP
, 2005
Abstract

Cited by 26 (12 self)
We study a clean machine model for external memory and stream processing. We show that the number of scans of the external data induces a strict hierarchy (as long as the work space is sufficiently small, e.g., polylogarithmic in the size of the input). We also show that neither joins nor sorting is feasible if the product of the number r(n) of scans of the external memory and the size s(n) of the internal memory buffers is sufficiently small, e.g., of size o(n^(1/5)). We also establish tight bounds for the complexity of XPath evaluation and filtering.
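To make the r(n)·s(n) tradeoff concrete, here is a toy multi-pass sorter (an illustration under assumed semantics, not the paper's machine model): each scan of the external list moves the s smallest remaining items to the output, so sorting n items takes about n/s scans. That puts r(n)·s(n) near n, far above the o(n^(1/5)) threshold below which the paper shows sorting is infeasible.

```python
import heapq

def scan_sort(external, s):
    """Sort by repeated scans, keeping at most s items in internal memory."""
    remaining = list(external)
    output, scans = [], 0
    while remaining:
        scans += 1                              # one full pass over the data
        batch = heapq.nsmallest(s, remaining)   # fits in the size-s buffer
        for item in batch:
            remaining.remove(item)
        output.extend(batch)
    return output, scans

ordered, r = scan_sort([5, 3, 8, 1, 9, 2, 7, 4, 6, 0], s=2)
# ordered is fully sorted; r == 5 scans with buffer size s == 2
```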
Worst-Case Interactive Communication II: Two Messages are Not Optimal
 IEEE Transactions on Information Theory
, 1991
Abstract

Cited by 16 (3 self)
X and Y are random variables. Person P_X knows X, Person P_Y knows Y, and both know the joint probability distribution of the pair (X, Y). Using a predetermined protocol, they communicate over a binary, error-free channel in order for P_Y to learn X. P_X may or may not learn Y. How many information bits must be transmitted (by both persons) in the worst case if only m messages are allowed? C_1(X|Y) is the number of bits required when at most one message is allowed, necessarily from P_X to P_Y. C_2(X|Y) is the number of bits required when at most two messages are permitted: P_Y transmits a message to P_X, then P_X responds with a message to P_Y. C_∞(X|Y) is the number of bits required when communication is unrestricted: P_X and P_Y can communicate back and forth. It is known that one-message communication may require exponentially more bits than the minimum necessary: for some (X, Y) pairs, C_1(X|Y) = 2^(C_∞(X|Y) − 1). Yet just two messages suffice to reduce com...