Results 1–10 of 458
Generalized privacy amplification
IEEE Transactions on Information Theory, 1995
Cited by 271 (19 self)
Abstract This paper provides a general treatment of privacy amplification by public discussion, a concept introduced by Bennett, Brassard, and Robert for a special scenario. Privacy amplification is a process that allows two parties to distill a secret key from a common random variable about which an eavesdropper has partial information. The two parties generally know nothing about the eavesdropper’s information except that it satisfies a certain constraint. The results have applications to unconditionally secure secret-key agreement protocols and quantum cryptography, and they yield results on wiretap and broadcast channels for a considerably strengthened definition of secrecy capacity. Index Terms: Cryptography, secret-key agreement, unconditional security, privacy amplification, wiretap channel, secrecy capacity, Rényi entropy, universal hashing, quantum cryptography.
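The universal-hashing step at the heart of privacy amplification can be sketched as a toy (the prime P, the secret domain, and the helper names are illustrative assumptions, not the paper's exact construction):

```python
import secrets

P = (1 << 61) - 1  # a Mersenne prime; assumed to exceed the shared-secret domain

def universal_hash(x: int, a: int, b: int, m: int) -> int:
    # 2-universal family h_{a,b}(x) = ((a*x + b) mod P) mod 2^m
    return ((a * x + b) % P) % (1 << m)

def privacy_amplify(shared_secret: int, m: int):
    # Both parties apply a publicly announced random hash to the shared secret;
    # universality makes the m-bit output nearly uniform from the eavesdropper's
    # view whenever her uncertainty about the secret sufficiently exceeds m bits.
    a = secrets.randbelow(P - 1) + 1
    b = secrets.randbelow(P)
    return universal_hash(shared_secret, a, b, m), (a, b)

key, seed = privacy_amplify(0xDEADBEEF, 16)
```

Because the hash seed (a, b) is announced over the public channel, both parties compute the same key without further interaction.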
An Information Statistics Approach to Data Stream and Communication Complexity
2003
Cited by 185 (8 self)
We present a new method for proving strong lower bounds in communication complexity.
On the Foundations of Quantitative Information Flow
Cited by 97 (10 self)
Abstract. There is growing interest in quantitative theories of information flow in a variety of contexts, such as secure information flow, anonymity protocols, and side-channel analysis. Such theories offer an attractive way to relax the standard noninterference properties, letting us tolerate “small” leaks that are necessary in practice. The emerging consensus is that quantitative information flow should be founded on the concepts of Shannon entropy and mutual information. But a useful theory of quantitative information flow must provide appropriate security guarantees: if the theory says that an attack leaks x bits of secret information, then x should be useful in calculating bounds on the resulting threat. In this paper, we focus on the threat that an attack will allow the secret to be guessed correctly in one try. With respect to this threat model, we argue that the consensus definitions actually fail to give good security guarantees; the problem is that a random variable can have arbitrarily large Shannon entropy even if it is highly vulnerable to being guessed. We then explore an alternative foundation based on a concept of vulnerability (closely related to Bayes risk) which measures uncertainty using Rényi’s min-entropy, rather than Shannon entropy.
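The paper's central observation is easy to reproduce numerically: a distribution can have large Shannon entropy yet be guessed in one try with probability 1/2 (a minimal sketch; the parameter k and the particular distribution are illustrative):

```python
from math import log2

def shannon_entropy(p):
    return -sum(q * log2(q) for q in p if q > 0)

def min_entropy(p):
    # Renyi min-entropy: -log2 of the best single guess's success probability
    return -log2(max(p))

k = 16
# Half the mass on one value, the rest spread uniformly over 2**k values.
dist = [0.5] + [0.5 / 2**k] * (2**k)

vulnerability = max(dist)    # one-try guessing probability stays at 0.5
H = shannon_entropy(dist)    # grows like k/2 + 1 bits (here 9.0)
H_inf = min_entropy(dist)    # stays at 1 bit regardless of k
```

Growing k makes the Shannon entropy arbitrarily large while the vulnerability and min-entropy do not move, which is exactly the gap the abstract describes.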
A Correlation-Based Approach to Robust Point Set Registration
In ECCV, 2004
Cited by 55 (1 self)
Abstract. Correlation is a very effective way to align intensity images. We extend the correlation technique to point set registration using a method we call kernel correlation. Kernel correlation is an affinity measure, and it is also a function of the point set entropy. We define the point set registration problem as finding the maximum kernel correlation configuration of the two point sets to be registered. The new registration method has intuitive interpretations, a simple-to-implement algorithm, and an easily proven convergence property. Our method shows favorable performance when compared with the iterative closest point (ICP) and EM-ICP methods.
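The kernel-correlation affinity described above can be sketched for 2-D point sets (a minimal Gaussian-kernel version; the bandwidth sigma and the helper name are assumptions):

```python
from math import exp

def kernel_correlation(xs, ys, sigma=1.0):
    # Sum of Gaussian affinities over all cross pairs of two 2-D point sets;
    # the score is maximized when the sets are geometrically aligned.
    return sum(
        exp(-((xa - ya) ** 2 + (xb - yb) ** 2) / (2 * sigma ** 2))
        for (xa, xb) in xs
        for (ya, yb) in ys
    )

model = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
shifted = [(x + 3.0, y) for (x, y) in model]

aligned_score = kernel_correlation(model, model)
shifted_score = kernel_correlation(model, shifted)
```

Registration then amounts to searching over transformations of one set for the configuration that maximizes this score; the aligned configuration scores strictly higher than any displaced one.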
Asymptotic Theory of Greedy Approximations to Minimal K-Point Random Graphs
Cited by 50 (18 self)
Let Xn = fx 1 ; : : : ; xn g, be an i.i.d. sample having multivariate distribution P . We derive a.s. limits for the power weighted edge weight function of greedy approximations to a class of minimal graphs spanning k of the n samples. The class includes minimal kpoint graphs constructed by the partitioning method of Ravi, Sundaram, Marathe, Rosenkrantz and Ravi [43] where the edge weight function satises the quasiadditive property of Redmond and Yukich [45]. In particular this includes greedy approximations to the kpoint minimal spanning tree (kMST), Steiner tree (kST), and the traveling salesman problem (kTSP). An expression for the inuence function of the minimal weight function is given which characterizes the asymptotic sensitivity of the graph weight to perturbations in the underlying distribution. The inuence function takes a form which indicates that the kpoint minimal graph in d > 1 dimensions has robustness properties in IR d which are analogous to those of rank order statistics in one dimension. A direct result of our theory is that the logweight of the kpoint minimal graph is a consistent nonparametric estimate of the Renyi entropy of the distribution P . Possible applications of this work include: analysis of random communication network topologies, estimation of the mixing coecient in contaminated mixture models, outlier discrimination and rejection, clustering and pattern recognition, robust nonparametric regression, two sample matching and image registration.
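The entropy-estimation consequence mentioned at the end can be stated explicitly. The following is a hedged sketch of the standard form of this result (notation assumed: L_gamma is the power-weighted edge functional, f the density of P, and beta the quasi-additive constant of Redmond and Yukich):

```latex
% a.s. limit for a quasi-additive power-weighted edge functional $L_\gamma$:
%   L_\gamma(X_n) / n^{(d-\gamma)/d} \;\longrightarrow\;
%     \beta_{L,\gamma} \int f^{(d-\gamma)/d}(x)\, dx ,
% which yields a consistent Renyi-entropy estimate of order
% $\alpha = (d-\gamma)/d$:
\hat{H}_\alpha(X_n)
  = \frac{1}{1-\alpha}
    \left( \log \frac{L_\gamma(X_n)}{n^{\alpha}} - \log \beta_{L,\gamma} \right),
\qquad \alpha = \frac{d-\gamma}{d}.
```

The log-weight of the graph thus estimates the Rényi entropy directly, without ever estimating the density f.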
Unconditional Security Against Memory-Bounded Adversaries
In Advances in Cryptology – CRYPTO ’97, Lecture Notes in Computer Science, 1997
Cited by 47 (3 self)
We propose a private-key cryptosystem and a protocol for key agreement by public discussion that are unconditionally secure based on the sole assumption that an adversary's memory capacity is limited. No assumption about her computing power is made. The scenario assumes that a random bit string of length slightly larger than the adversary's memory capacity can be received by all parties. The random bit string can for instance be broadcast by a satellite or over an optical network, or transmitted over an insecure channel between the communicating parties. The proposed schemes require very high bandwidth but can nevertheless be practical.
1 Introduction
One of the most important properties of a cryptographic system is a proof of its security under reasonable and general assumptions. However, every design involves a tradeoff between the strength of the security and further important qualities of a cryptosystem, such as efficiency and practicality. The security of all currently used cryp...
Mutual Information, Metric Entropy, and Cumulative Relative Entropy Risk
Annals of Statistics, 1996
Cited by 46 (2 self)
Assume {P_θ : θ ∈ Θ} is a set of probability distributions with a common dominating measure on a complete separable metric space Y. A state θ ∈ Θ is chosen by Nature. A statistician gets n independent observations Y_1, ..., Y_n from Y distributed according to P_θ. For each time t between 1 and n, based on the observations Y_1, ..., Y_{t-1}, the statistician produces an estimated distribution P_t for P_θ, and suffers a loss L(P_θ, P_t). The cumulative risk for the statistician is the average total loss up to time n. Of special interest in information theory, data compression, mathematical finance, computational learning theory and statistical mechanics is the special case when the loss L(P_θ, P_t) is the relative entropy between the true distribution P_θ and the estimated distribution P_t. Here the cumulative Bayes risk from time 1 to n is the mutual information between the random parameter Θ and the observations Y_1, ...
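The identity behind the final sentence can be written out as a one-line derivation (a sketch; here P_t denotes the Bayes predictive distribution of Y_t given Y_1, ..., Y_{t-1}):

```latex
% Chain rule for mutual information, term by term:
%   I(\Theta; Y_1,\dots,Y_n) = \sum_{t=1}^{n} I(\Theta; Y_t \mid Y_1,\dots,Y_{t-1}),
% and each conditional term equals the expected relative-entropy loss at time t
% when $P_t$ is the Bayes predictive distribution, so the cumulative Bayes risk is
\sum_{t=1}^{n} \mathbb{E}\, D\!\left(P_\theta \,\middle\|\, P_t\right)
  = I(\Theta;\, Y_1, \dots, Y_n).
```

Each term of the chain rule is matched against one round of prediction, which is why the cumulative relative-entropy risk telescopes into a single mutual information.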
Alpha-Divergence for Classification, Indexing and Retrieval
University of Michigan, 2001
Cited by 45 (4 self)
Motivated by Chernoff's bound on asymptotic probability of error, we propose the alpha-divergence measure and a surrogate, the alpha-Jensen difference, for feature classification, indexing and retrieval in image and other databases. The alpha...
Simple and tight bounds for information reconciliation and privacy amplification
In Advances in Cryptology – ASIACRYPT 2005, Lecture Notes in Computer Science, 2005
Cited by 43 (6 self)
Abstract. Shannon entropy is a useful and important measure in information processing, for instance, data compression or randomness extraction, under the assumption (which can typically safely be made in communication theory) that a certain random experiment is independently repeated many times. In cryptography, however, where a system's working has to be proven with respect to a malicious adversary, this assumption usually translates to a restriction on the latter's knowledge or behavior and is generally not satisfied. An example is quantum key agreement, where the adversary can attack each particle sent through the quantum channel differently or even carry out coherent attacks, combining a number of particles together. In information-theoretic key agreement, the central functionalities of information reconciliation and privacy amplification have, therefore, been extensively studied in the scenario of general distributions: partial solutions have been given, but the obtained bounds are arbitrarily far from tight, and a full analysis appeared...
A Generalized Divergence Measure for Robust Image Registration
IEEE Transactions on Signal Processing, 2003
Cited by 43 (4 self)
Entropy-based divergence measures have shown promising results in many areas of engineering and image processing. In this paper, we define a new generalized divergence measure, namely, the Jensen-Rényi divergence. Some properties, such as convexity and its upper bound, are derived. Based on the Jensen-Rényi divergence, we propose a new approach to the problem of image registration. Some appealing advantages of registration by Jensen-Rényi divergence are illustrated, and its connections to mutual information-based registration techniques are analyzed. As the key focus of this paper, we apply Jensen-Rényi divergence to inverse synthetic aperture radar (ISAR) image registration. The goal is to estimate the target motion during the imaging time. Our approach applies Jensen-Rényi divergence to measure the statistical dependence between consecutive ISAR image frames, which would be maximal if the images are geometrically aligned. Simulation results demonstrate that the proposed method is efficient and effective.
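The Jensen-Rényi divergence for discrete distributions can be sketched directly from its definition (a minimal version; the function names and the choice alpha = 0.5 are illustrative):

```python
from math import log2

def renyi_entropy(p, alpha):
    # Renyi entropy of order alpha (alpha > 0, alpha != 1)
    return log2(sum(q ** alpha for q in p if q > 0)) / (1 - alpha)

def jensen_renyi(dists, weights, alpha=0.5):
    # JR divergence: Renyi entropy of the weighted mixture minus the weighted
    # average of the individual Renyi entropies; non-negative for alpha in
    # (0, 1), where Renyi entropy is concave, and zero iff the inputs coincide.
    mixture = [sum(w * p[i] for w, p in zip(weights, dists))
               for i in range(len(dists[0]))]
    return renyi_entropy(mixture, alpha) - sum(
        w * renyi_entropy(p, alpha) for w, p in zip(weights, dists))

p = [0.5, 0.5, 0.0, 0.0]
q = [0.0, 0.0, 0.5, 0.5]
same = jensen_renyi([p, p], [0.5, 0.5])      # identical inputs -> 0
disjoint = jensen_renyi([p, q], [0.5, 0.5])  # disjoint supports -> maximal
```

In the registration setting, one distribution per image frame is formed from local features, and the transformation minimizing the divergence (equivalently, maximizing the statistical dependence) aligns the frames.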