Results 1–10 of 41
On the Foundations of Quantitative Information Flow
Cited by 47 (6 self)
Abstract. There is growing interest in quantitative theories of information flow in a variety of contexts, such as secure information flow, anonymity protocols, and side-channel analysis. Such theories offer an attractive way to relax the standard noninterference properties, letting us tolerate “small” leaks that are necessary in practice. The emerging consensus is that quantitative information flow should be founded on the concepts of Shannon entropy and mutual information. But a useful theory of quantitative information flow must provide appropriate security guarantees: if the theory says that an attack leaks x bits of secret information, then x should be useful in calculating bounds on the resulting threat. In this paper, we focus on the threat that an attack will allow the secret to be guessed correctly in one try. With respect to this threat model, we argue that the consensus definitions actually fail to give good security guarantees: the problem is that a random variable can have arbitrarily large Shannon entropy even if it is highly vulnerable to being guessed. We then explore an alternative foundation based on a concept of vulnerability (closely related to Bayes risk), which measures uncertainty using Rényi’s min-entropy rather than Shannon entropy.
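The abstract's central claim can be checked numerically. The sketch below (function names are ours, not from the paper) constructs a distribution with large Shannon entropy that is nonetheless guessed correctly in one try half the time, contrasting Shannon entropy with vulnerability and min-entropy:

```python
import math

def shannon_entropy(dist):
    """H(X) = -sum of p*log2(p) over nonzero probabilities."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

def vulnerability(dist):
    """V(X) = max_x p(x): probability of guessing X correctly in one try."""
    return max(dist)

def min_entropy(dist):
    """H_inf(X) = -log2 V(X)."""
    return -math.log2(vulnerability(dist))

# One secret with probability 1/2; the other 2^20 secrets share the rest.
n = 2 ** 20
dist = [0.5] + [0.5 / n] * n

print(shannon_entropy(dist))  # 11.0 bits: looks "hard to guess"
print(vulnerability(dist))    # 0.5: guessed in one try half the time
print(min_entropy(dist))      # 1.0 bit
```

Scaling n up pushes the Shannon entropy arbitrarily high while the vulnerability stays at 0.5, which is exactly the failure of the consensus definitions the paper describes.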
Automatic discovery and quantification of information leaks
 IN: IEEE SYMPOSIUM ON SECURITY AND PRIVACY
, 2009
Cited by 38 (3 self)
Information-flow analysis is a powerful technique for reasoning about the sensitive information exposed by a program during its execution. We present the first automatic method for information-flow analysis that discovers what information is leaked and computes its comprehensive quantitative interpretation. The leaked information is characterized by an equivalence relation on secret artifacts, and is represented by a logical assertion over the corresponding program variables. Our measurement procedure computes the number of discovered equivalence classes and their sizes. This provides a basis for computing a set of quantitative properties, which includes all established information-theoretic measures in quantitative information-flow. Our method exploits an inherent connection between formal models of qualitative information-flow and program verification techniques. We provide an implementation of our method that builds upon existing tools for program verification and information-theoretic analysis. Our experimental evaluation indicates the practical applicability of the presented method.
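The equivalence-class view described above is easy to illustrate for a small deterministic program. This sketch (the example program and helper names are hypothetical, not from the paper) groups secrets by observable output and derives two standard measures from the class counts and sizes:

```python
import math
from collections import defaultdict

def leak_classes(program, secrets):
    """Partition secrets into classes: s ~ s' iff program(s) == program(s')."""
    classes = defaultdict(list)
    for s in secrets:
        classes[program(s)].append(s)
    return list(classes.values())

# Hypothetical example program leaking only the parity of the secret.
def check(secret):
    return secret % 2 == 0

N = 16
classes = leak_classes(check, range(N))

# Under a uniform prior on a deterministic program, Shannon leakage is the
# entropy of the class-size distribution, and min-entropy leakage is
# log2 of the number of equivalence classes.
shannon_leak = -sum(len(c) / N * math.log2(len(c) / N) for c in classes)
min_leak = math.log2(len(classes))
print(shannon_leak, min_leak)  # 1.0 1.0
```

With two equal-sized classes both measures agree at one bit; unequal class sizes make them diverge, which is why the paper computes both the number and the sizes of the classes.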
Informationtheoretic bounds for differentially private mechanisms
 In 24th IEEE Computer Security Foundations Symposium, CSF 2011. IEEE Computer Society, Los Alamitos
Cited by 11 (1 self)
Abstract. There are two active and independent lines of research that aim at quantifying the amount of information that is disclosed by computing on confidential data. Each line of research has developed its own notion of confidentiality: on the one hand, differential privacy is the emerging consensus guarantee used for privacy-preserving data analysis. On the other hand, information-theoretic notions of leakage are used for characterizing the confidentiality properties of programs in language-based settings. The purpose of this article is to establish formal connections between both notions of confidentiality, and to compare them in terms of the security guarantees they deliver. We obtain the following results. First, we establish upper bounds for the leakage of every ε-differentially private mechanism in terms of ε and the size of the mechanism’s input domain. We achieve this by identifying and leveraging a connection to coding theory. Second, we construct a class of ε-differentially private channels whose leakage grows with the size of their input domains. Using these channels, we show that there cannot be domain-size-independent bounds for the leakage of all ε-differentially private mechanisms. Moreover, we perform an empirical evaluation that shows that the leakage of these channels almost matches our theoretical upper bounds, demonstrating the accuracy of these bounds. Finally, we show that the question of providing optimal upper bounds for the leakage of ε-differentially private mechanisms in terms of rational functions of ε is in fact decidable.
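The connection between the two notions can be seen on the simplest ε-differentially private channel, binary randomized response. The sketch below (not the channels constructed in the paper, just an illustrative example) computes its min-entropy leakage for a uniform prior and shows it growing with ε while staying below log2 of the input domain size (here, 1 bit):

```python
import math

def rr_channel(eps):
    """Binary randomized response: report the true bit w.p. e^eps/(1+e^eps).
    The ratio of any two entries in a column is at most e^eps, so the
    channel is eps-differentially private."""
    p = math.exp(eps) / (1 + math.exp(eps))
    return [[p, 1 - p],   # rows: secret bit; columns: reported bit
            [1 - p, p]]

def min_entropy_leakage(C):
    """For a uniform prior, min-entropy leakage = log2(sum_y max_x C[x][y])."""
    return math.log2(sum(max(col) for col in zip(*C)))

for eps in (0.1, 1.0, 5.0):
    print(eps, min_entropy_leakage(rr_channel(eps)))
```

The printed leakages increase with ε and approach, but never reach, the 1-bit ceiling imposed by the two-element input domain, consistent with the domain-size-dependent bounds the abstract describes.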
P.: Quantifying information leaks in software
 In: Proc. ACSAC ’10
, 2010
Cited by 10 (0 self)
Leakage of confidential information represents a serious security risk. Despite a number of novel theoretical advances, it has been unclear if and how quantitative approaches to measuring leakage of confidential information could be applied to substantial, real-world programs. This is mostly due to the high complexity of computing precise leakage quantities. In this paper, we introduce a technique which makes it possible to decide if a program conforms to a quantitative policy, and which scales to large state spaces with the help of bounded model checking. Our technique is applied to a number of officially reported information leak vulnerabilities in the Linux kernel. Additionally, we also analysed authentication routines in the Secure Remote Password suite and in an Internet Message Support Protocol implementation. Our technique shows when there is unacceptable leakage; the same technique is also used to verify, for the first time, that the applied software patches indeed plug the information leaks. This is the first demonstration of quantitative information flow addressing security concerns of real-world industrial programs.
Quantification of Integrity
Cited by 10 (0 self)
Two information-flow integrity measures are introduced: contamination and suppression. The former is dual to information-flow confidentiality, and the latter is analogous to the standard model of channel reliability from information theory. The relationship between quantitative integrity, confidentiality, and database privacy is examined.
Statistical Measurement of Information Leakage
Cited by 9 (3 self)
Abstract. Information theory provides a range of useful methods to analyse probability distributions, and these techniques have been successfully applied to measure information flow and the loss of anonymity in secure systems. However, previous work has tended to assume that the exact probabilities of every action are known, or that the system is nondeterministic. In this paper, we show that measures of information leakage based on mutual information and capacity can be calculated, automatically, from trial runs of a system alone. We find a confidence interval for this estimate based on the number of possible inputs, observations and samples. We have developed a tool to automatically perform this analysis, and we demonstrate our method by analysing a Mixminion anonymous remailer node.
The Effects of Artificial Sources of Water on Rangeland Biodiversity
 Environment Australia and CSIRO
, 1997
Cited by 9 (5 self)
“Turing hoped that his abstracted paper-tape model was so simple, so transparent and well defined, that it would not depend on any assumptions about physics that could conceivably be falsified, and therefore that it could become the basis of an abstract theory of computation that was independent of the underlying physics. ‘He thought,’ as Feynman once put it, ‘that he understood paper.’ But he was mistaken. Real, quantum-mechanical paper is wildly different from the abstract stuff that the Turing machine uses. The Turing machine is entirely classical...”
Quantitative Information Flow – Verification Hardness and Possibilities
Cited by 8 (0 self)
Abstract. Researchers have proposed formal definitions of quantitative information flow based on information-theoretic notions such as the Shannon entropy, the min-entropy, the guessing entropy, and channel capacity. This paper investigates the hardness and possibilities of precisely checking and inferring quantitative information flow according to such definitions. We prove that, even for just comparing two programs to determine which has the larger flow, none of the definitions is a k-safety property for any k, and therefore is not amenable to the self-composition technique that has been successfully applied to precisely checking noninterference. We also show a complexity-theoretic gap with noninterference by proving that, for loop-free boolean programs whose noninterference is coNP-complete, the comparison problem is #P-hard for all of the definitions. For positive results, we show that universally quantifying the distribution in the comparison problem, that is, comparing two programs according to the entropy-based definitions on which has the larger flow for all distributions, is a 2-safety problem in general and is coNP-complete when restricted to loop-free boolean programs. We prove this by showing that the problem is equivalent to a simple relation naturally expressing the fact that one program is more secure than the other. We prove that the relation also refines the channel-capacity-based definition, and that it can be precisely checked via self-composition as well as the “interleaved” self-composition technique.
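The three entropy notions the abstract compares are easy to state side by side. A minimal sketch (standard textbook definitions, not code from the paper) evaluating all three on one distribution:

```python
import math

def shannon_entropy(p):
    """H(X): expected bits needed to describe X."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def min_entropy(p):
    """H_inf(X) = -log2(max probability): worst-case one-guess hardness."""
    return -math.log2(max(p))

def guessing_entropy(p):
    """G(X): expected number of guesses under the optimal
    (descending-probability) guessing strategy."""
    return sum(i * q for i, q in enumerate(sorted(p, reverse=True), start=1))

p = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy(p))   # 1.75
print(min_entropy(p))       # 1.0
print(guessing_entropy(p))  # 1.875
```

Even on this small example the three measures disagree, which is why the paper's hardness and comparison results must be established for each definition separately.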
P.: Applied quantitative information flow and statistical databases
 In: Proc. of the Int. Workshop on Formal Aspects in Security and Trust, Volume 5983 of LNCS, Springer (2009) 96–110
Cited by 7 (0 self)
Abstract. We first describe an algebraic structure which serves as a solid basis for quantitative reasoning about information flows. We demonstrate how programs, in the form of partitions of states, fit into that theoretical framework. The paper presents a new method and implementation to automatically calculate such partitions, and compares it to existing approaches. As a novel application, we describe a way to transform database queries into a suitable program form which can then be statically analysed to measure their leakage and to spot database inference threats.
Verified Indifferentiable Hashing into Elliptic Curves
Cited by 6 (4 self)
Abstract. Many cryptographic systems based on elliptic curves are proven secure in the Random Oracle Model, assuming there exist probabilistic functions that map elements in some domain (e.g. bitstrings) onto uniformly and independently distributed points on a curve. When implementing such systems, and in order for the proof to carry over to the implementation, those mappings must be instantiated with concrete constructions whose behavior does not deviate significantly from random oracles. In contrast to other approaches to public-key cryptography, where candidates to instantiate random oracles have been known for some time, the first generic construction for hashing into ordinary elliptic curves indifferentiable from a random oracle was put forward only recently by Brier et al. We present a machine-checked proof of this construction. The proof is based on an extension of the CertiCrypt framework with logics and mechanized tools for reasoning about approximate forms of observational equivalence, and integrates mathematical libraries of group theory and elliptic curves.