Results 1–10 of 282
Generalized Privacy Amplification
IEEE Transactions on Information Theory, 1995
Cited by 212 (18 self)
This paper provides a general treatment of privacy amplification by public discussion, a concept introduced by Bennett, Brassard and Robert [1] for a special scenario. The results have applications to unconditionally-secure secret-key agreement protocols, quantum cryptography and to a non-asymptotic and constructive treatment of the secrecy capacity of wiretap and broadcast channels, even for a considerably strengthened definition of secrecy capacity.
I. Introduction
This paper is concerned with unconditionally-secure secret-key agreement by two communicating parties Alice and Bob who both know a random variable W, for instance a random n-bit string, about which an eavesdropper Eve has incomplete information characterized by the random variable V jointly distributed with W according to P_VW. This distribution may partially be under Eve's control. Alice and Bob know nothing about P_VW, except that it satisfies a certain constraint. We present protocols by which Alice and Bob can us...
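As a hedged illustration of the technique this abstract describes, the sketch below performs privacy amplification with a simple two-universal-style hash over a prime field. The hash family, the prime, and all parameters are illustrative choices of this listing, not the paper's construction.

```python
import secrets

# Illustrative sketch only: Alice and Bob share a partially secret value w;
# they publicly agree on a random hash (a, b) and both compute a shorter,
# more secret key. Security rests on Eve's partial ignorance of w, not on
# hiding (a, b), which may be announced over the public channel.

P = (1 << 61) - 1  # a Mersenne prime defining the field GF(P)

def two_universal_hash(w: int, a: int, b: int, out_bits: int) -> int:
    """Compress w to out_bits bits via an affine hash over GF(P)."""
    return ((a * w + b) % P) & ((1 << out_bits) - 1)

def privacy_amplify(w: int, out_bits: int):
    a = secrets.randbelow(P - 1) + 1  # publicly announced hash description
    b = secrets.randbelow(P)
    return two_universal_hash(w, a, b, out_bits), (a, b)

key, (a, b) = privacy_amplify(w=0xDEADBEEF, out_bits=16)
# Bob, knowing w and the announced (a, b), derives the identical key.
assert key == two_universal_hash(0xDEADBEEF, a, b, 16)
```

How many bits can safely be extracted depends on how much Eve knows about w, which is exactly the quantity the paper analyzes.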
An Information Statistics Approach to Data Stream and Communication Complexity
2003
Cited by 156 (8 self)
We present a new method for proving strong lower bounds in communication complexity.
On the Foundations of Quantitative Information Flow
Cited by 51 (6 self)
There is growing interest in quantitative theories of information flow in a variety of contexts, such as secure information flow, anonymity protocols, and side-channel analysis. Such theories offer an attractive way to relax the standard noninterference properties, letting us tolerate “small” leaks that are necessary in practice. The emerging consensus is that quantitative information flow should be founded on the concepts of Shannon entropy and mutual information. But a useful theory of quantitative information flow must provide appropriate security guarantees: if the theory says that an attack leaks x bits of secret information, then x should be useful in calculating bounds on the resulting threat. In this paper, we focus on the threat that an attack will allow the secret to be guessed correctly in one try. With respect to this threat model, we argue that the consensus definitions actually fail to give good security guarantees: the problem is that a random variable can have arbitrarily large Shannon entropy even if it is highly vulnerable to being guessed. We then explore an alternative foundation based on a concept of vulnerability (closely related to Bayes risk), which measures uncertainty using Rényi's min-entropy rather than Shannon entropy.
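The abstract's key point, that Shannon entropy can be large while one-try guessing succeeds often, is easy to reproduce numerically; the distribution below is a toy example of this listing, not from the paper.

```python
import math

# Vulnerability V(X) = max_x P[X = x]; min-entropy H_min(X) = -log2 V(X).

def shannon_entropy(p):
    return -sum(q * math.log2(q) for q in p if q > 0)

def min_entropy(p):
    return -math.log2(max(p))

# A secret guessed correctly in one try with probability 1/2, with the
# remaining mass spread over 2**16 equally likely values.
n = 2 ** 16
p = [0.5] + [0.5 / n] * n

print(shannon_entropy(p))  # 9.0 bits: looks hard by the Shannon measure
print(min_entropy(p))      # 1.0 bit: one guess succeeds half the time
```

The Shannon measure suggests roughly 9 bits of uncertainty, yet the one-try guessing attack succeeds with probability 1/2, which is the gap the paper's min-entropy foundation closes.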
AlphaDivergence for Classification, Indexing and Retrieval
University of Michigan, 2001
Cited by 45 (4 self)
Motivated by Chernoff's bound on the asymptotic probability of error, we propose the alpha-divergence measure and a surrogate, the alpha-Jensen difference, for feature classification, indexing and retrieval in image and other databases. The alpha ...
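As a hedged sketch, one common discrete form of the Rényi alpha-divergence is shown below; the exact definition and normalization used in the report may differ.

```python
import math

# D_alpha(p || q) = (1/(alpha - 1)) * log sum_i p_i**alpha * q_i**(1 - alpha),
# one standard form for discrete distributions; shown for illustration only.

def alpha_divergence(p, q, alpha: float) -> float:
    assert alpha > 0 and alpha != 1
    s = sum(pi ** alpha * qi ** (1 - alpha) for pi, qi in zip(p, q) if qi > 0)
    return math.log(s) / (alpha - 1)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(alpha_divergence(p, q, alpha=0.5))  # positive: the distributions differ
print(alpha_divergence(p, p, alpha=0.5))  # ~0.0 for identical distributions
```

For alpha = 0.5 this is twice the negative log of the Bhattacharyya coefficient, and as alpha tends to 1 the family recovers the Kullback-Leibler divergence.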
Asymptotic Theory of Greedy Approximations to Minimal KPoint Random Graphs
Cited by 42 (16 self)
Let X_n = {x_1, ..., x_n} be an i.i.d. sample having multivariate distribution P. We derive a.s. limits for the power-weighted edge weight function of greedy approximations to a class of minimal graphs spanning k of the n samples. The class includes minimal k-point graphs constructed by the partitioning method of Ravi, Sundaram, Marathe, Rosenkrantz and Ravi [43], where the edge weight function satisfies the quasi-additive property of Redmond and Yukich [45]. In particular this includes greedy approximations to the k-point minimal spanning tree (k-MST), Steiner tree (k-ST), and the traveling salesman problem (k-TSP). An expression for the influence function of the minimal weight function is given which characterizes the asymptotic sensitivity of the graph weight to perturbations in the underlying distribution. The influence function takes a form which indicates that the k-point minimal graph in d > 1 dimensions has robustness properties in R^d which are analogous to those of rank order statistics in one dimension. A direct result of our theory is that the log-weight of the k-point minimal graph is a consistent nonparametric estimate of the Rényi entropy of the distribution P. Possible applications of this work include: analysis of random communication network topologies, estimation of the mixing coefficient in contaminated mixture models, outlier discrimination and rejection, clustering and pattern recognition, robust nonparametric regression, two-sample matching and image registration.
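A hedged skeleton of the log-weight statistic mentioned above: a pure-Python Prim MST with power-weighted edges, followed by the log-weight form of the entropy estimate. The additive bias constant (which depends only on gamma and d) and the greedy k-point pruning step are omitted, so this shows only the shape of the estimator, not a faithful implementation of the paper's construction.

```python
import math

def mst_power_weight(points, gamma=1.0):
    """Sum of Euclidean-MST edge lengths raised to the power gamma (Prim)."""
    n = len(points)
    in_tree = [False] * n
    dist = [math.inf] * n
    dist[0] = 0.0
    total = 0.0
    for _ in range(n):
        u = min((i for i in range(n) if not in_tree[i]), key=dist.__getitem__)
        in_tree[u] = True
        if dist[u] > 0.0:
            total += dist[u] ** gamma
        for v in range(n):
            if not in_tree[v]:
                dist[v] = min(dist[v], math.dist(points[u], points[v]))
    return total

def log_weight_statistic(points, d=2, gamma=1.0):
    """log(L_gamma / n**alpha) / (1 - alpha) with alpha = (d - gamma) / d;
    consistent for the order-alpha Rényi entropy up to an omitted constant."""
    n = len(points)
    alpha = (d - gamma) / d
    return math.log(mst_power_weight(points, gamma) / n ** alpha) / (1 - alpha)

pts = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (2.0, 1.0)]
print(mst_power_weight(pts))      # 3.0: three unit-length edges
print(log_weight_statistic(pts))  # the (uncentered) entropy statistic
```

Prim's O(n^2) loop is fine at this scale; a k-point variant would additionally choose which n - k points to discard, which is where the greedy approximations analyzed in the paper come in.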
Unconditional Security Against MemoryBounded Adversaries
In Advances in Cryptology – CRYPTO ’97, Lecture Notes in Computer Science, 1997
Cited by 40 (3 self)
We propose a private-key cryptosystem and a protocol for key agreement by public discussion that are unconditionally secure based on the sole assumption that an adversary's memory capacity is limited. No assumption about her computing power is made. The scenario assumes that a random bit string of length slightly larger than the adversary's memory capacity can be received by all parties. The random bit string can for instance be broadcast by a satellite or over an optical network, or transmitted over an insecure channel between the communicating parties. The proposed schemes require very high bandwidth but can nevertheless be practical.
1 Introduction
One of the most important properties of a cryptographic system is a proof of its security under reasonable and general assumptions. However, every design involves a trade-off between the strength of the security and further important qualities of a cryptosystem, such as efficiency and practicality. The security of all currently used cryp...
Mutual Information, Metric Entropy, and Cumulative Relative Entropy Risk
Annals of Statistics, 1996
Cited by 40 (2 self)
Assume {P_θ : θ ∈ Θ} is a set of probability distributions with a common dominating measure on a complete separable metric space Y. A state θ ∈ Θ is chosen by Nature. A statistician gets n independent observations Y_1, ..., Y_n from Y distributed according to P_θ. For each time t between 1 and n, based on the observations Y_1, ..., Y_{t−1}, the statistician produces an estimated distribution P_t for P_θ, and suffers a loss L(P_θ, P_t). The cumulative risk for the statistician is the average total loss up to time n. Of special interest in information theory, data compression, mathematical finance, computational learning theory and statistical mechanics is the special case when the loss L(P_θ, P_t) is the relative entropy between the true distribution P_θ and the estimated distribution P_t. Here the cumulative Bayes risk from time 1 to n is the mutual information between the random parameter Θ and the observations Y_1, ...
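A hedged toy instance of this setting for a Bernoulli(theta) source, using the Laplace add-one rule as the sequential estimator; the rule and the data below are illustrative choices of this listing, not the paper's.

```python
import math

def kl_bernoulli(p, q):
    """Relative entropy between Bernoulli(p) and Bernoulli(q), in nats."""
    def term(a, b):
        return a * math.log(a / b) if a > 0 else 0.0
    return term(p, q) + term(1 - p, 1 - q)

def cumulative_kl_risk(bits, theta):
    """Sum over t of L(P_theta, P_t) with relative-entropy loss, where P_t
    is the Laplace (add-one) estimate built from the first t - 1 bits."""
    risk, ones = 0.0, 0
    for t, b in enumerate(bits):
        p_t = (ones + 1) / (t + 2)  # estimate formed before seeing bits[t]
        risk += kl_bernoulli(theta, p_t)
        ones += b
    return risk

print(cumulative_kl_risk([1, 0, 1, 1, 0, 1, 1, 1], theta=0.7))
```

The abstract's observation is that for the Bayes-optimal predictor this cumulative risk equals the mutual information between the random parameter and the observations; the add-one rule only approximates that behavior.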
A class of Rényi information estimators for multidimensional densities
Annals of Statistics, 2008
Cited by 34 (3 self)
A class of estimators of the Rényi and Tsallis entropies of an unknown distribution f in R^m is presented. These estimators are based on the k-th nearest-neighbor distances computed from a sample of N i.i.d. vectors with distribution f. We show that entropies of any order q, including Shannon's entropy, can be estimated consistently with minimal assumptions on f. Moreover, we show that it is straightforward to extend the nearest-neighbor method to estimate the statistical distance between two distributions using one i.i.d. sample from each.
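As a hedged one-dimensional illustration of the nearest-neighbor approach: the paper's estimators cover R^m, k-th neighbors, and general orders q, while the sketch below is only the classical k = 1 Shannon special case (Kozachenko-Leonenko style), with the sorted-sample shortcut valid only in one dimension.

```python
import math
import random

EULER_GAMMA = 0.5772156649015329

def nn_entropy_1d(xs):
    """k = 1 nearest-neighbor Shannon entropy estimate (nats) in 1-d."""
    n = len(xs)
    xs = sorted(xs)
    total = 0.0
    for i, x in enumerate(xs):
        gaps = []
        if i > 0:
            gaps.append(x - xs[i - 1])
        if i < n - 1:
            gaps.append(xs[i + 1] - x)
        total += math.log(min(gaps))  # distance to the nearest neighbor
    # V_1 = 2 is the volume of the unit ball in one dimension.
    return total / n + math.log(2 * (n - 1)) + EULER_GAMMA

random.seed(0)
sample = [random.random() for _ in range(2000)]
print(nn_entropy_1d(sample))  # close to 0.0, the entropy of Uniform(0, 1)
```

Scaling the sample by a factor c shifts the estimate by exactly log c, matching how differential entropy transforms under scaling.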
Universally Composable Privacy Amplification against Quantum Adversaries
2004
Cited by 33 (5 self)
Privacy amplification is the art of shrinking a partially secret string Z to a highly secret key S. We introduce a universally composable security definition for secret keys in a context where an adversary holds quantum information and show that privacy amplification by two-universal hashing is secure with respect to this definition. Additionally, we give an asymptotically optimal lower bound on the length of the extractable key S in terms of the adversary's (quantum) knowledge about Z.
A Generalized Divergence Measure for Robust Image Registration
IEEE Transactions on Signal Processing, 2003
Cited by 32 (4 self)
Entropy-based divergence measures have shown promising results in many areas of engineering and image processing. In this paper, we define a new generalized divergence measure, namely, the Jensen-Rényi divergence. Some properties such as convexity and its upper bound are derived. Based on the Jensen-Rényi divergence, we propose a new approach to the problem of image registration. Some appealing advantages of registration by Jensen-Rényi divergence are illustrated, and its connections to mutual information-based registration techniques are analyzed. As the key focus of this paper, we apply the Jensen-Rényi divergence to inverse synthetic aperture radar (ISAR) image registration. The goal is to estimate the target motion during the imaging time. Our approach applies the Jensen-Rényi divergence to measure the statistical dependence between consecutive ISAR image frames, which would be maximal if the images are geometrically aligned. Simulation results demonstrate that the proposed method is efficient and effective.
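A hedged sketch of the divergence computation for two normalized histograms with equal weights; the paper's definition allows arbitrary weights and more than two inputs, and applies the measure to dependence between ISAR frames rather than to raw toy histograms as done here.

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (alpha > 0, alpha != 1), in nats."""
    return math.log(sum(pi ** alpha for pi in p if pi > 0)) / (1 - alpha)

def jensen_renyi(p, q, alpha=0.5):
    """JR_alpha(p, q) = H_alpha((p + q)/2) - (H_alpha(p) + H_alpha(q))/2."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return renyi_entropy(m, alpha) - (renyi_entropy(p, alpha) + renyi_entropy(q, alpha)) / 2

p = [0.7, 0.2, 0.1]  # histogram of one image frame (toy values)
q = [0.1, 0.2, 0.7]  # histogram of a misaligned frame
print(jensen_renyi(p, q))  # positive: the histograms differ
print(jensen_renyi(p, p))  # 0.0: identical histograms
```

Because the Rényi entropy is concave for alpha in (0, 1), the Jensen gap is nonnegative and vanishes exactly when the inputs coincide, which is what makes it usable as a registration criterion.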