Results 1–10 of 59
Interactive information complexity
In Proceedings of the 44th Annual ACM Symposium on Theory of Computing (STOC '12), 2012
Abstract

Cited by 35 (5 self)
The primary goal of this paper is to define and study the interactive information complexity of functions. Let f(x, y) be a function, and suppose Alice is given x and Bob is given y. Informally, the interactive information complexity IC(f) of f is the least amount of information Alice and Bob need to reveal to each other to compute f. Previously, information complexity has been defined with respect to a prior distribution on the input pairs (x, y). Our first goal is to give a definition that is independent of the prior distribution. We show that several possible definitions are essentially equivalent. We establish some basic properties of the interactive information complexity IC(f). In particular, we show that IC(f) is equal to the amortized (randomized) communication complexity of f. We also show a direct sum theorem for IC(f) and give the first general connection between information complexity and (non-amortized) communication complexity. We explore the information complexity of two specific problems – Equality and Disjointness. We conclude with a list of open problems and research directions.
Variations on the Sensitivity Conjecture
Theory of Computing Library Graduate Surveys 4, pp. 1–27, 2011
Abstract

Cited by 6 (0 self)
The sensitivity of a Boolean function f of n Boolean variables is the maximum over all inputs x of the number of positions i such that flipping the ith bit of x changes the value of f(x). Allowing disjoint blocks of bits to be flipped leads to the notion of block sensitivity, known to be polynomially related to a number of other complexity measures of f, including the decision-tree complexity, the polynomial degree, and the certificate complexity. A longstanding open question is whether sensitivity also belongs to this equivalence class. A positive answer to this question is known as the Sensitivity Conjecture. We present a selection of known as well as new variants of the Sensitivity Conjecture and point out some weaker versions that are also open. Among other things, we relate the problem to Communication Complexity via recent results by Sherstov (QIC 2010). We also indicate new connections to Fourier analysis.
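As a concrete illustration (not part of the survey itself), the sensitivity measure defined above can be computed by brute force for small n; a minimal sketch:

```python
from itertools import product

def sensitivity(f, n):
    """Brute-force sensitivity: the max over all inputs x of the number
    of bit positions whose flip changes f(x). Exponential in n, so this
    is only suitable for small examples."""
    def flip(x, i):
        y = list(x)
        y[i] ^= 1
        return tuple(y)
    return max(
        sum(1 for i in range(n) if f(flip(x, i)) != f(x))
        for x in product((0, 1), repeat=n)
    )

# Parity is fully sensitive: every single-bit flip changes the output,
# so its sensitivity equals n.
parity = lambda x: sum(x) % 2
```

For parity on n bits, sensitivity, block sensitivity, degree, and decision-tree complexity all equal n, which is why the interesting cases for the conjecture are functions where these measures diverge.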
Calibree⋆: Calibration-free Localization using Relative Distance Estimations
Abstract

Cited by 8 (0 self)
Existing localization algorithms, such as centroid or fingerprinting, compute the location of a mobile device based on measurements of signal strengths from radio base stations. Unfortunately, these algorithms require tedious and expensive offline calibration in the target deployment area before they can be used for localization. In this paper, we present Calibree, a novel localization algorithm that does not require offline calibration. The algorithm starts by computing relative distances between pairs of mobile phones based on signatures of their radio environment. It then combines these distances with the known locations of a small number of GPS-equipped phones to estimate absolute locations of all phones, effectively spreading location measurements from phones with GPS to those without. Our evaluation results show that Calibree performs better than the conventional centroid algorithm and only slightly worse than fingerprinting, without requiring offline calibration. Moreover, when no phones report their absolute locations, Calibree can be used to estimate relative distances between phones.
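For context on the baseline the abstract compares against: the conventional centroid algorithm simply averages the known positions of the base stations a device can hear. A minimal sketch (the 2-D coordinate representation is an assumption for illustration, not from the paper):

```python
def centroid(beacon_positions):
    """Estimate a device's location as the mean of the (x, y)
    positions of the radio base stations it can hear."""
    xs, ys = zip(*beacon_positions)
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```

Its appeal is that it needs no offline calibration either, which is why it is the natural lower baseline for a calibration-free method like Calibree.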
Tel-Aviv University, Israel. Thesis: “Deriving Specialized Heap Analyses for Verifying Component-Client Conformance”
Abstract
RESEARCH INTERESTS I am interested in developing software infrastructure solutions for mobile and ubiquitous computing applications. My current
On the Relative Merits of Simple Local Search Methods for the Max Sat Problem
Abstract

Cited by 2 (0 self)
Algorithms based on local search are popular for solving many optimization problems, including the maximum satisfiability problem (MAX-SAT). With regard to MAX-SAT, the state of the art in performance for universal (i.e., non-specialized) solvers seems to be variants of Simulated Annealing (SA) and MaxWalkSat (MWS), both stochastic local search methods. Local search methods are conceptually simple, and they often provide near-optimal solutions. In contrast, it is relatively rare that local search algorithms are analyzed with respect to worst-case approximation ratios. In the first part of the paper, we build on Mastrolilli and Gambardella’s work [14] and present a worst-case analysis of tabu search for the MAX-k-SAT problem. In the second part of the paper, we examine the experimental performance of deterministic local search algorithms (oblivious and non-oblivious local search, tabu search) in comparison to stochastic methods (SA and MWS) on random 3-CNF and random k-CNF formulas and on benchmarks from MAX-SAT competitions. For random MAX-3-SAT, tabu search consistently outperforms both oblivious and non-oblivious local search, but does not match the performance of SA and MWS. Initializing with non-oblivious local search improves both the performance and the running time of tabu search. The better performance of the various methods that escape local optima, in comparison to the more basic oblivious and non-oblivious local search algorithms (which stop at the first local optimum encountered), comes at a cost: a significant increase in complexity (which we measure in terms of variable flips). The performance results observed for the unweighted MAX-3-SAT problem carry over to the weighted version of the problem, but now the better performance of MWS is more pronounced. In contrast, as k is increased in MAX-k-SAT, MWS loses its advantage. Finally, on benchmark instances, it appears that simulated annealing and tabu search initialized with non-oblivious local search outperform the other methods on most instances.
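To make the "oblivious local search" baseline discussed above concrete: it greedily flips any single variable that increases the number of satisfied clauses, stopping at the first local optimum. A minimal sketch, with clauses encoded as tuples of signed DIMACS-style literals (an encoding assumption for illustration, not taken from the paper):

```python
import random

def num_satisfied(clauses, assign):
    """Count clauses containing at least one satisfied literal.
    A literal l > 0 means variable |l| is True; l < 0 means False."""
    return sum(
        any((assign[abs(l) - 1] == 1) == (l > 0) for l in clause)
        for clause in clauses
    )

def oblivious_local_search(clauses, n, seed=0):
    """Flip single variables greedily until no single flip increases
    the satisfied-clause count, i.e., until a local optimum."""
    rng = random.Random(seed)
    assign = [rng.randint(0, 1) for _ in range(n)]
    improved = True
    while improved:
        improved = False
        current = num_satisfied(clauses, assign)
        for i in range(n):
            assign[i] ^= 1
            if num_satisfied(clauses, assign) > current:
                improved = True
                break  # accept the improving flip and rescan
            assign[i] ^= 1  # undo a non-improving flip
    return assign, num_satisfied(clauses, assign)
```

Methods such as tabu search, SA, and MWS differ precisely in what they do once this loop would stop: they accept some non-improving flips to escape the local optimum, at the cost of many more variable flips.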
Towards a Reverse Newman’s Theorem in Interactive Information Complexity
2013
Abstract

Cited by 1 (0 self)
Newman’s theorem states that we can take any public-coin communication protocol and convert it into one that uses only private randomness with only a little increase in communication complexity. We consider a reversed scenario in the context of information complexity: can we take a protocol that uses private randomness and convert it into one that only uses public randomness while preserving the information revealed to each player? We prove that the answer is yes, at least for protocols that use a bounded number of rounds. As an application, we prove new direct sum theorems through the compression of interactive communication in the bounded-round setting. Furthermore, we show that if a Reverse Newman’s Theorem can be proven in full generality, then full compression of interactive communication and fully general direct-sum theorems will result.
Interactive information and coding theory
Abstract
We give a high-level overview of recent developments in interactive information and coding theory. These include developments involving interactive noiseless coding and interactive error-correction. The overview is primarily focused on developments related to complexity-theoretic applications, although the broader context and agenda are also set out. As the present paper is an extended abstract, the vast majority of proofs and technical details are omitted, and can be found in the respective publications and preprints.
2016 President, First Vice President, and Second Vice President. The Board chose
Abstract
It is the middle of July as I sit down to write this column. With one school year over and another not yet begun, it is a good time to reflect on recent events and look forward to those to come. The Board of Governors held its annual meeting in early June. Nominations made at that meeting led to elections for the
Quantum Computing
Abstract
1 Classical and Quantum computation: circuit model . . . . . 3
 1.1 Reversible Computation . . . . . 3
 1.2 Probabilistic Computation . . . . . 7