Results 1–10 of 10
Quantification of Integrity
Abstract

Cited by 10 (0 self)
Two information-flow integrity measures are introduced: contamination and suppression. The former is dual to information-flow confidentiality, and the latter is analogous to the standard model of channel reliability from information theory. The relationship between quantitative integrity, confidentiality, and database privacy is examined.
Differential Privacy: on the trade-off between Utility and Information Leakage
, 2011
Abstract

Cited by 7 (1 self)
Abstract. Differential privacy is a notion of privacy that has become very popular in the database community. Roughly, the idea is that a randomized query mechanism provides sufficient privacy protection if the ratio between the probabilities that two adjacent datasets give the same answer is bounded by e^ε. In the field of information flow there is a similar concern for controlling information leakage, i.e. limiting the possibility of inferring the secret information from the observables. In recent years, researchers have proposed to quantify the leakage in terms of min-entropy leakage, a concept strictly related to the Bayes risk. In this paper, we show how to model the query system in terms of an information-theoretic channel, and we compare the notion of differential privacy with that of min-entropy leakage. We show that differential privacy implies a bound on the min-entropy leakage, but not vice versa. Furthermore, we show that our bound is tight. Then, we consider the utility of the randomization mechanism, which represents how close the randomized answers are to the real ones, on average. We show that the notion of differential privacy implies a bound on utility, also tight, and we propose a method that under certain conditions builds an optimal randomization mechanism, i.e. a mechanism which provides the best utility while guaranteeing ε-differential privacy.
On the relation between Differential Privacy and Quantitative Information Flow
, 2011
Abstract

Cited by 5 (2 self)
Abstract. Differential privacy is a notion that has emerged in the community of statistical databases, as a response to the problem of protecting the privacy of the database’s participants when performing statistical queries. The idea is that a randomized query satisfies differential privacy if the likelihood of obtaining a certain answer for a database x is not too different from the likelihood of obtaining the same answer on adjacent databases, i.e. databases which differ from x in only one individual. Information flow is an area of security concerned with the problem of controlling the leakage of confidential information in programs and protocols. Nowadays, one of the most established approaches to quantify and to reason about leakage is based on the Rényi min-entropy version of information theory. In this paper, we critically analyze the notion of differential privacy in light of the conceptual framework provided by Rényi min-entropy information theory. We show that there is a close relation between differential privacy and leakage, due to the graph symmetries induced by the adjacency relation. Furthermore, we consider the utility of the randomized answer, which measures its expected degree of accuracy. We focus on certain kinds of utility functions called “binary”, which have a close correspondence with the Rényi min mutual information. Again, it turns out that there can be a tight correspondence between differential privacy and utility, depending on the symmetries induced by the adjacency relation and by the query. Depending on these symmetries we can also build an optimal-utility randomization mechanism while preserving the required level of differential privacy. Our main contribution is a study of the kind of structures that can be induced by the adjacency relation and the query, and how to use them to derive bounds on the leakage and achieve the optimal utility.
Compositional Methods for Information-Hiding
Abstract

Cited by 3 (1 self)
Protocols for information-hiding often use randomized primitives to obfuscate the link between the observables and the information to be protected. The degree of protection provided by a protocol can be expressed in terms of the probability of error associated with the inference of the secret information. We consider a probabilistic process calculus approach to the specification of such protocols, and we study how the operators affect the probability of error. In particular, we characterize constructs that have the property of not decreasing the degree of protection, and that can therefore be considered safe in the modular construction of protocols. As a case study, we apply these techniques to the Dining Cryptographers, and we are able to derive a generalization of Chaum’s strong anonymity result.
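The "probability of error" the abstract uses as its protection measure is the Bayes risk: how often the adversary's best guess about the secret is wrong after seeing the observable. A minimal sketch on a hypothetical two-secret, two-observable channel (not the paper's process-calculus model):

```python
# Bayes risk of a toy information-hiding channel.
prior = [0.5, 0.5]                 # uniform prior over two secrets
C = [[0.75, 0.25], [0.25, 0.75]]   # p(observable | secret), hypothetical values

# The adversary observes y and guesses the secret x maximizing the joint
# probability prior[x] * C[x][y]; the Bayes risk is the residual error.
bayes_risk = 1 - sum(max(prior[x] * C[x][y] for x in range(2))
                     for y in range(2))
print(bayes_risk)  # 0.25: the best guess is wrong a quarter of the time
```

A perfectly anonymizing channel (identical rows) would push the risk up to 0.5, the error of guessing blindly from the prior.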
Precise Quantitative Information Flow Analysis Using Symbolic Model Counting
Abstract

Cited by 2 (1 self)
Abstract. Quantitative information flow analyses (QIF) are a class of techniques for measuring the amount of confidential information leaked by a program to its public outputs. QIF analyses can be approximate or precise, offering different trade-offs. In this paper, we lift a particular limitation of precise QIF. We show how symbolic model counting replaces explicit leak enumeration with symbolic computation, thus eliminating the associated bottleneck.
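For a deterministic program, the maximum leakage (channel capacity) is log2 of the number of feasible outputs, so the "explicit leak enumeration" baseline the abstract mentions amounts to walking the secret space and collecting outputs. A brute-force sketch on a hypothetical toy program (the paper's point is that symbolic model counting obtains the same count without this enumeration):

```python
import math

def program(secret):
    # Toy deterministic program over a 16-bit secret:
    # only the low 4 bits reach the public output.
    return secret & 0x000F

# Explicit enumeration -- the bottleneck symbolic model counting avoids --
# walks the entire secret space to collect the feasible outputs.
outputs = {program(s) for s in range(2 ** 16)}

# Channel capacity of a deterministic program: log2 |feasible outputs|.
print(math.log2(len(outputs)))  # 4.0 bits
```

The enumeration cost here is 2^16 evaluations for a 4-bit answer; a model counter derives the count 16 directly from a symbolic encoding of `program`.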
SAT-based Analysis and Quantification of Information Flow in Programs
Abstract

Cited by 1 (0 self)
Abstract. Quantitative information flow analysis (QIF) is a portfolio of security techniques quantifying the flow of confidential information to public ports. In this paper, we advance the state of the art in QIF for imperative programs. We present both an abstract formulation of the analysis in terms of verification condition generation, logical projection and model counting, and an efficient concrete implementation targeting ANSI C programs. The implementation combines various novel and existing SAT-based tools for bounded model checking, #SAT solving in the presence of projection, and SAT preprocessing. We evaluate the technique on synthetic and semi-realistic benchmarks.
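The "projection" step the abstract names means counting models of the program's verification condition only over the observable variables: an output valuation counts once if any secret realizes it. A brute-force sketch of projected counting on a hypothetical relation (a real implementation would hand a CNF encoding of the C program to a projected #SAT solver instead):

```python
import math
from itertools import product

def relation(secret, output):
    # Stand-in for a verification condition relating a 4-bit secret to
    # the observable pair (top bit, parity) of a toy program.
    return output == ((secret >> 3) & 1, secret % 2)

# Projected model counting, brute-forced: count the distinct output
# valuations for which SOME secret satisfies the relation.
projected_models = {o for o in product((0, 1), repeat=2)
                    if any(relation(s, o) for s in range(16))}

print(math.log2(len(projected_models)))  # 2.0 bits of leakage at most
```

Without projection, a plain #SAT count would tally (secret, output) pairs and overstate the number of distinguishable observations.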
Quantitative Information Flow and applications to Differential Privacy
, 2011
Abstract

Cited by 1 (0 self)
Secure information flow is the problem of ensuring that the information made publicly available by a computational system does not leak information that should be kept secret. Since it is practically impossible to avoid leakage entirely, in recent years there has been a growing interest in considering the quantitative aspects of information flow, in order to measure and compare the amount of leakage. Information theory is widely regarded as a natural framework to provide firm foundations to quantitative information flow. In these notes we review the two main information-theoretic approaches that have been investigated: the one based on Shannon entropy, and the one based on Rényi min-entropy. Furthermore, we discuss some applications in the area of privacy. In particular, we consider statistical databases and the recently proposed notion of differential privacy. Using the information-theoretic view, we discuss the bound that differential privacy induces on leakage, and the trade-off between utility and privacy.
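The two approaches the notes contrast can be computed side by side on one channel: Shannon leakage is the mutual information I(X;Y), while min-entropy leakage (uniform prior) is log2 of the sum of column maxima. A minimal sketch with hypothetical channel values, showing the two measures genuinely disagree:

```python
import math

prior = [0.5, 0.5]                 # uniform prior over two secrets
C = [[0.7, 0.3], [0.4, 0.6]]       # p(y|x), hypothetical values

# Output distribution p(y) = sum_x prior[x] * p(y|x).
py = [sum(prior[x] * C[x][y] for x in range(2)) for y in range(2)]

# Shannon leakage: mutual information I(X;Y) = H(Y) - H(Y|X).
H = lambda ps: -sum(p * math.log2(p) for p in ps if p > 0)
shannon = H(py) - sum(prior[x] * H(C[x]) for x in range(2))

# Min-entropy leakage: log2 of the sum of column maxima.
min_ent = math.log2(sum(max(C[x][y] for x in range(2)) for y in range(2)))

print(shannon, min_ent)  # ≈ 0.067 vs ≈ 0.379 bits: the measures disagree
```

The gap reflects the different adversaries the measures model: Shannon leakage averages over many observations, min-entropy leakage captures one-shot guessing vulnerability.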
Contractual Date of Delivery to the CEC: 1Apr2013 Actual Date of Delivery to the CEC: 20Mar2013 Organisation name of lead contractor for this deliverable: INR
, 2013
Abstract
The concept of differential privacy emerged as an approach to protect the privacy of the individuals participating in statistical databases. Roughly, a mechanism satisfies differential privacy if the presence or value of a single individual in a database does not significantly change the likelihood of obtaining a certain answer to any statistical query posed by the data analyst. Differentially-private mechanisms are often oblivious: first the query is processed on the database to produce the true answer, and then this answer is adequately randomized before being reported to the data analyst. Ideally the mechanism should minimize leakage, i.e., obfuscate as much as possible the link between the reported answer and the individuals’ data. At the same time, it should maximize utility, i.e., the reported answer should be as close as possible to the true one. These two goals are, however, conflicting, and a trade-off between privacy and utility is imposed. In this paper we use quantitative information flow to analyze the leakage and the utility of oblivious differentially-private mechanisms. We introduce a technique that exploits some graph symmetries exhibited by the adjacency relation on databases to derive bounds on the leakage of the mechanism, measured as min-entropy leakage. We use identity gain functions, which are closely related to min-entropy leakage, to evaluate utility, and therefore we are also able to derive bounds for it. Depending on the graph symmetries we consider, we can additionally build a mechanism that maximizes utility while preserving the required level of differential privacy.
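An oblivious mechanism of the kind described can be sketched with the two-sided geometric mechanism, a standard ε-DP randomizer for counting queries (sensitivity 1): compute the true answer, then add noise distributed as the difference of two i.i.d. geometric variables. The function name and database values are illustrative, not from the paper.

```python
import math
import random

def oblivious_geometric_mechanism(true_answer, eps, rng):
    # Oblivious mechanism: first compute the true answer of the counting
    # query, then randomize it with two-sided geometric noise of parameter
    # alpha = e^{-eps}, which gives eps-differential privacy for
    # sensitivity-1 queries.
    alpha = math.exp(-eps)
    def geom():
        # Inverse-CDF sample of P(N = n) = (1 - alpha) * alpha^n, n >= 0.
        return int(math.log(1.0 - rng.random()) / math.log(alpha))
    # The difference of two i.i.d. geometrics is two-sided geometric noise.
    return true_answer + geom() - geom()

rng = random.Random(0)
database = [1, 0, 1, 1, 0, 1]   # hypothetical per-individual bits
true_count = sum(database)      # the query's true answer: 4
print(oblivious_geometric_mechanism(true_count, eps=1.0, rng=rng))
```

Smaller `eps` makes `alpha` larger and the noise heavier: leakage drops and utility (closeness of the reported count to 4) drops with it, which is exactly the trade-off the paper quantifies.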
From Qualitative to Quantitative Information Erasure
Abstract
Abstract. We define a quantitative measure of information erasure as a dual of the well-understood notion of quantitative information release. Our journey begins from a qualitative, equivalence-relations-based definition of information erasure and release, which we show to be tightly linked to the quantitative measures of these notions. In particular, given the necessary probability distribution over the inputs of a deterministic system, we show that the quantitative measures of erasure and release are directly derivable from the equivalence-relations-based definitions. However, we observe that the quantitative definitions, unlike the qualitative ones, are less expressive and may suffer from practical problems such as erasure and release occlusion – a problem which, at its core, is attributable to the symmetry of the information-theoretic entropy definition.
On the information leakage of differentially-private mechanisms
, 2015
Abstract