Results 1–7 of 7
Statistical Measurement of Information Leakage
Cited by 16 (4 self)
Abstract. Information theory provides a range of useful methods to analyse probability distributions, and these techniques have been successfully applied to measure information flow and the loss of anonymity in secure systems. However, previous work has tended to assume that the exact probabilities of every action are known, or that the system is nondeterministic. In this paper, we show that measures of information leakage based on mutual information and capacity can be calculated automatically from trial runs of a system alone. We find a confidence interval for this estimate based on the number of possible inputs, observations and samples. We have developed a tool to perform this analysis automatically, and we demonstrate our method by analysing a Mixminion anonymous remailer node.
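The core estimate described above can be sketched directly: build the empirical joint distribution of (secret input, observation) pairs from trial runs and compute the mutual information. This is an illustrative sketch of the general technique, not the authors' tool, and it omits the paper's confidence-interval correction.

```python
import math
from collections import Counter

def empirical_mutual_information(samples):
    """Estimate I(X;Y) in bits from observed (input, observation) pairs."""
    n = len(samples)
    joint = Counter(samples)                 # empirical joint distribution
    px = Counter(x for x, _ in samples)      # marginal over inputs
    py = Counter(y for _, y in samples)      # marginal over observations
    mi = 0.0
    for (x, y), c in joint.items():
        p_xy = c / n
        mi += p_xy * math.log2(p_xy * n * n / (px[x] * py[y]))
    return mi

# A channel that leaks exactly the parity of a 2-bit secret:
samples = [(s, s % 2) for s in [0, 1, 2, 3] * 250]
print(round(empirical_mutual_information(samples), 3))  # 1.0 bit
```

With few samples this plug-in estimator is biased upward, which is precisely why the paper derives a confidence interval from the numbers of inputs, observations and samples.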
Quantifying information leakage of randomized protocols
, 2013
Cited by 5 (3 self)
Abstract. The quantification of information leakage provides a quantitative evaluation of the security of a system. We propose the use of Markovian processes to model and analyze the information leakage of deterministic and probabilistic systems. We show that this method generalizes the lattice-of-information approach and is a natural framework for modeling refined attackers capable of observing the internal behavior of the system. We also use our method to obtain an algorithm for the computation of channel capacity from our Markovian models. Finally, we show how to use the method to analyze timed and untimed attacks on the Onion Routing protocol.
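For deterministic systems, the lattice-of-information view that this method generalizes has a simple computational reading: under a uniform prior, the leakage is the Shannon entropy of the partition the program's outputs induce on the secrets. A minimal sketch of that reading (illustrative, not taken from the paper):

```python
import math
from collections import Counter

def deterministic_leakage(program, secrets):
    """Leakage (bits) of a deterministic program under a uniform prior:
    the entropy of the partition its outputs induce on the secret space."""
    blocks = Counter(program(s) for s in secrets)  # block sizes of the partition
    n = len(secrets)
    return -sum((c / n) * math.log2(c / n) for c in blocks.values())

# A password check on an 8-value secret leaks only ~0.54 bits per run:
print(round(deterministic_leakage(lambda s: s == 3, range(8)), 3))
```

A constant program induces the trivial one-block partition and leaks 0 bits; the identity program induces the discrete partition and leaks all log2(n) bits.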
The Optimum Leakage Principle for Analyzing Multithreaded Programs
Cited by 1 (0 self)
Abstract. Bellman’s optimality principle is a method for solving problems in which one must find the best decisions one after another. The principle can be extended to assess the information leakage in multithreaded programs, and is formalized into the optimum leakage principle proposed in this paper. By modeling the state transitions in multithreaded programs, the principle is combined with information theory to assess the leakage in multithreaded programs as the result of an optimal policy. This offers a new perspective on measuring information leakage and makes it possible to track the leakage at runtime. Examples are given to demonstrate the analysis process. Finally, an efficient implementation of this methodology is also briefly discussed.
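The role of Bellman's principle here can be illustrated with a toy recursion: the worst-case leakage from a state is the leakage of one chosen transition plus the worst case from its successor. The state graph and per-transition leakage values below are hypothetical, chosen only to show the shape of the computation, not the paper's model:

```python
def max_leakage(transitions, terminal, state):
    """Bellman-style recursion: worst-case accumulated leakage from `state`.
    `transitions[s]` maps each successor state to the bits leaked by moving
    to it; terminal states leak nothing further."""
    if state in terminal:
        return 0.0
    return max(bits + max_leakage(transitions, terminal, nxt)
               for nxt, bits in transitions[state].items())

# Hypothetical thread schedule: one interleaving leaks a full branch bit.
transitions = {"start": {"branch_a": 1.0, "branch_b": 0.0},
               "branch_a": {"end": 0.5},
               "branch_b": {"end": 0.25}}
print(max_leakage(transitions, {"end"}, "start"))  # 1.5
```

The optimal policy (here: always take the worst interleaving) is what turns a set of possible schedules into a single leakage figure.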
Maximizing entropy over Markov processes
, 2013
Cited by 1 (1 self)
Abstract. The channel capacity of a deterministic system with confidential data is an upper bound on the number of bits an attacker can learn from the system. We encode all possible attacks on a system using a probabilistic specification, an Interval Markov Chain. The channel capacity computation then reduces to finding a model of the specification with highest entropy. Entropy maximization for probabilistic process specifications has not been studied before, even though it is well known in Bayesian inference for discrete distributions. We give a characterization of the global entropy of a process as a reward function, a polynomial algorithm to verify the existence of a system maximizing entropy among those respecting a specification, a procedure for the maximization of reward functions over Interval Markov Chains, and its application to synthesize an implementation maximizing entropy. We show how to use Interval Markov Chains to model abstractions of deterministic systems with confidential data, and use the above results to compute their channel capacity. These results are a foundation for ongoing work on computing channel capacity for abstractions of programs derived from code.
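The characterization of global entropy as a reward function can be made concrete on an absorbing Markov chain: each state contributes its local branching entropy, weighted by the expected number of visits to it. The fixed-point iteration and the example chain below are illustrative, not the paper's algorithm:

```python
import math

def global_entropy(P, start, iters=200):
    """Entropy (bits) of an absorbing Markov chain, computed as an expected
    total reward: sum over states of (expected visits) * (local entropy).
    `P` maps each transient state to its outgoing distribution; successors
    absent from `P` (e.g. "end") are absorbing."""
    states = list(P)
    xi = {s: (1.0 if s == start else 0.0) for s in states}
    for _ in range(iters):  # fixed-point iteration for expected visits
        new = {s: (1.0 if s == start else 0.0) for s in states}
        for s in states:
            for t, p in P[s].items():
                if t in new:
                    new[t] += xi[s] * p
        xi = new

    def local_entropy(dist):
        return -sum(p * math.log2(p) for p in dist.values() if p > 0)

    return sum(xi[s] * local_entropy(P[s]) for s in states)

# Toss a fair coin once, then stop: one bit of entropy in total.
P = {"s0": {"s1": 0.5, "end": 0.5}, "s1": {"end": 1.0}}
print(global_entropy(P, "s0"))  # 1.0
```

Maximizing this quantity over all chains admitted by an Interval Markov Chain specification is what yields the channel capacity bound in the abstract.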
The Thermodynamics of Confidentiality
Cited by 1 (0 self)
Abstract—This work, of a foundational nature, establishes a connection between secure computation and the 2nd principle of thermodynamics. In particular, we show that any deterministic computation, where the final state of the system is observable, must dissipate at least W k_B T ln 2. Here W is the information-theoretic notion of remaining uncertainty as defined in Quantitative Information Flow, k_B the Boltzmann constant and T the system temperature. By contrast, for probabilistic computations thermodynamic work can be extracted from secure systems: in this case, again using information-theoretic results, we provide bounds on the amount of work that can be extracted. Further, we show that in deterministic systems the dissipated energy is an upper bound on Smith’s remaining vulnerability; by doing so we provide the first thermodynamic interpretation of guessability. Crucially, unlike much of the literature on the physics of computation, our focus is not a universal model but a software field of great practical relevance, namely security. We see this work as a genuine scientific advance with the potential to enhance the understanding of both confidentiality and dissipative systems in physics.
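The bound itself is a one-line Landauer-style calculation; a sketch, taking W in bits and assuming room temperature for the example:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

def min_dissipation(w_bits, temp_kelvin=300.0):
    """Lower bound (joules) on the energy a deterministic computation must
    dissipate when W bits of remaining uncertainty are erased: W * k_B * T * ln 2."""
    return w_bits * K_B * temp_kelvin * math.log(2)

# Erasing one bit of remaining uncertainty at 300 K costs at least ~2.87e-21 J.
print(min_dissipation(1.0))
```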
Quantification of Integrity (under consideration for publication in Math. Struct. in Comp. Science)
, 2011
Three integrity measures are introduced: contamination, channel suppression, and program suppression. Contamination is a measure of how much untrusted information reaches trusted outputs; it is the dual of leakage, which is a measure of information-flow confidentiality. Channel suppression is a measure of how much information about inputs to a noisy channel is missing from channel outputs. And program suppression is a measure of how much information about the correct output of a program is lost because of attacker influence and implementation errors. Program and channel suppression do not have confidentiality duals. As a case study, the relationship between quantitative integrity, confidentiality, and database privacy is examined.
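Read information-theoretically, channel suppression is the conditional entropy H(X | Y) of the channel input given its output. A small sketch under that reading (the binary symmetric channel example is illustrative, not from the paper):

```python
import math

def channel_suppression(prior, channel):
    """Bits of information about the input missing from the output of a
    noisy channel: the conditional entropy H(X | Y).
    `channel[x][y]` is P(Y = y | X = x)."""
    joint = {(x, y): prior[x] * p
             for x, row in channel.items() for y, p in row.items()}
    py = {}
    for (x, y), p in joint.items():
        py[y] = py.get(y, 0.0) + p
    return -sum(p * math.log2(p / py[y]) for (x, y), p in joint.items() if p > 0)

# A binary symmetric channel flipping 10% of bits suppresses ~0.47 bits.
prior = {0: 0.5, 1: 0.5}
channel = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.1, 1: 0.9}}
print(round(channel_suppression(prior, channel), 3))
```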
Studying Maximum Information Leakage Using Karush–Kuhn–Tucker Conditions
When studying the information leakage in programs or protocols, a natural question arises: “what is the worst-case scenario?”. This problem of identifying the maximal leakage can be seen as a channel capacity problem in the information-theoretic sense. In this paper, by combining two powerful tools, information theory and the Karush–Kuhn–Tucker conditions, we demonstrate a very general solution to the channel capacity problem. Examples are given to show how our solution can be applied to practical contexts of programs and anonymity protocols, and how it generalizes previous approaches to this problem.
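The standard iterative route to channel capacity, whose fixed points are exactly the distributions satisfying the KKT optimality conditions, is the classic Blahut–Arimoto algorithm; the sketch below is that textbook iteration, not the paper's own solution:

```python
import math

def blahut_arimoto(channel, iters=500):
    """Capacity (bits) of a discrete memoryless channel `channel[x][y]`
    via Blahut-Arimoto; fixed points satisfy the KKT conditions."""
    nx = len(channel)
    px = [1.0 / nx] * nx  # start from the uniform input distribution
    for _ in range(iters):
        # output distribution induced by the current input distribution
        q = [sum(px[x] * channel[x][y] for x in range(nx))
             for y in range(len(channel[0]))]
        # exponentiated KL divergence of each row from q (natural log)
        w = [math.exp(sum(p * math.log(p / q[y])
                          for y, p in enumerate(channel[x]) if p > 0))
             for x in range(nx)]
        z = sum(px[x] * w[x] for x in range(nx))
        px = [px[x] * w[x] / z for x in range(nx)]  # multiplicative update
    return math.log2(z)  # at convergence, log2(z) is the capacity in bits

# Binary symmetric channel, 10% error: capacity = 1 - h(0.1), about 0.531 bits.
bsc = [[0.9, 0.1], [0.1, 0.9]]
print(round(blahut_arimoto(bsc), 3))
```

For symmetric channels the uniform input is already optimal, so the iteration converges immediately; the KKT route in the paper instead characterizes such optima in closed form.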