Results 1 - 4 of 4
Hiding Information in Completeness Holes: New Perspectives in Code Obfuscation and Watermarking
Abstract

Cited by 6 (5 self)
In this paper we show how abstract interpretation, and more specifically completeness, provides an adequate model for reasoning about code obfuscation and watermarking. The idea is that making a program obscure, or equivalently hiding information in it, corresponds to forcing an interpreter (the attacker) to become incomplete in its attempt to extract information from the program. Abstract interpretation provides the model of the attacker (malicious host), and abstract interpretation transformers provide driving methods for understanding and designing new obfuscation and watermarking strategies: obfuscation corresponds to making the malicious host incomplete, and watermarking corresponds to hiding secrets where incomplete attackers cannot extract them unless some secret key is given.
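The core idea of the abstract can be made concrete with a toy parity analysis. The sketch below is illustrative only (it does not appear in the paper): a parity-domain analyzer is precise on `2 * x` but loses precision on the semantically equivalent rewrite `x + x`, i.e., the rewrite opens a "completeness hole" that hides the always-even property from this attacker.

```python
# Toy parity abstract domain: EVEN, ODD, TOP (parity unknown).
EVEN, ODD, TOP = "even", "odd", "top"

def a_mul(a, b):
    # Abstract multiplication: even * anything is even.
    if a == EVEN or b == EVEN:
        return EVEN
    if a == ODD and b == ODD:
        return ODD
    return TOP

def a_add(a, b):
    # Abstract addition: same known parities sum to even,
    # different known parities sum to odd, otherwise unknown.
    if TOP in (a, b):
        return TOP
    return EVEN if a == b else ODD

# Original program f(x) = 2 * x: the analyzer is complete here,
# it proves the result is EVEN for any x.
print(a_mul(EVEN, TOP))   # even

# Obfuscated but equivalent f(x) = x + x: the analyzer cannot
# see that both operands are the same variable, so it answers TOP
# even though x + x is always even -- an incompleteness the
# obfuscator exploits.
print(a_add(TOP, TOP))    # top
```

The transformation preserves concrete semantics while degrading the fixed abstract observer, which is exactly the obfuscation-as-incompleteness view the abstract describes.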
Time and Probability based Information Flow Analysis
Abstract
In multilevel systems it is important to avoid unwanted indirect information flow from higher levels to lower levels, namely the so-called covert channels. Initial studies of information flow analysis were performed by abstracting away from time and probability. It is already known that systems proved to be secure in a possibilistic framework may turn out to be insecure when time or probability are considered. Recently, work has been done to consider aspects of either time or probability, but not both. In this paper we propose a general framework, based on Probabilistic Timed Automata, in which both probabilistic and timing covert channels can be studied. We define a Non-Interference security property and a Non-Deducibility-on-Composition security property, which allow expressing information flow in a timed and probabilistic setting. We then compare these properties with analogous ones defined in contexts where either time, or probability, or neither of them is taken into account. This permits a classification of the properties depending on their discerning power. As an application, we study a system with covert channels that we are able to discover by applying our techniques.
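The claim that a possibilistically secure system can be probabilistically insecure admits a compact illustration. The following sketch is hypothetical (not the paper's automata-based framework): both observable outputs are possible for either secret, so a purely possibilistic analysis sees no leak, yet the output frequencies form a probabilistic covert channel that a statistical observer exploits.

```python
import random

def system(secret_bit):
    # Both "a" and "b" are possible outputs regardless of the secret,
    # so a possibilistic analysis deems the system secure.
    p = 0.9 if secret_bit else 0.1
    return "a" if random.random() < p else "b"

def attacker(secret_bit, n=10_000):
    # The frequency of "a" depends on the secret: by sampling, a
    # low-level observer recovers the high-level bit.
    freq = sum(system(secret_bit) == "a" for _ in range(n)) / n
    return 1 if freq > 0.5 else 0

random.seed(0)
print(attacker(1), attacker(0))  # the attacker recovers the secret bit
```

A timing covert channel works analogously, with response latency playing the role of the output symbol.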
Behavioral Equivalences and Approximations
Abstract
Several application domains require formal and flexible techniques for the comparison of different process models. Whenever classical equivalence checking does not provide a positive result, relaxed notions of approximation can be employed to evaluate the degree of similarity. In this extended abstract, we first discuss the state of the art in the setting of approximate behavioral equivalences. Then, as a step towards flexibility and usability, we present a relaxation of testing equivalence taking into account three orthogonal aspects of the process observations: execution time, event probability, and observed behavior.

1 Approximations of Behavioral Equivalences

Comparing process models through equivalence checking is a frequently used approach to the analysis of systems in many practical domains, ranging from model-based verification of software implementations to the analysis of noninterference-based dependability properties. However, in real-world applications perfect equivalence ...
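One simple way to quantify "degree of similarity" between two probabilistic processes is a distance on their distributions over observable actions. The sketch below uses total variation distance as an illustrative choice (it is not necessarily the metric used in the cited work): two processes that are not equivalent are still "equivalent up to 0.05".

```python
def tv_distance(p, q):
    # Total variation distance between two probability distributions
    # over observable actions, given as dicts action -> probability.
    actions = set(p) | set(q)
    return 0.5 * sum(abs(p.get(a, 0.0) - q.get(a, 0.0)) for a in actions)

# Two processes emitting the same actions with slightly different odds:
# not testing-equivalent, but close.
proc1 = {"coffee": 0.5,  "tea": 0.5}
proc2 = {"coffee": 0.45, "tea": 0.55}

eps = tv_distance(proc1, proc2)
print(eps)  # 0.05: the processes are "equivalent up to 0.05"
```

Relaxed equivalences of the kind the abstract describes additionally weigh execution times alongside such event probabilities.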
A Note on the Approximation of Weak Probabilistic Bisimulation
Abstract
The need for flexible and formal approaches to the comparison of different process models is motivated in several application domains and with respect to different system properties. They can be helpful to compare a web service with some desired qualitative/quantitative service description, to relate an implemented software architecture to a reference dependable architectural model, and to reveal the performability impact of one component over the whole system through the comparison of the two system views obtained by activating/deactivating the component (this is generally called noninterference analysis). As a further step towards the flexibility of equivalence-checking-based techniques, we advocate an approach that relies on an approximate notion of weak probabilistic bisimulation, through which we provide a measure of the approximation and diagnostic information supporting exact methods such as numerical analysis and state-space minimization. Comparing different process models is a frequently used approach to the analysis of system requirements in practical application domains. In order to bridge the gap between rigid equivalence-checking techniques and the more relaxed distinguishability-oriented requirements of real systems, in the last decade much attention has been paid to approximation methods [5,9,4,7,10,2]. This can be done in a quantitative framework where fine-grained models describe, e.g., probability distributions of events or their timed behaviors. For instance, some of the proposals cited above deal with probabilistic notions of behavioral equivalence for deciding whether two process models behave almost the same (up to small fluctuations) or, more formally, for measuring the distance between probabilistic transition systems. Based on this idea, one well-established approach uses pseudometrics, which give a measure of the similarity of systems that are not equivalent (see e.g. [7,9]).
An alternative approach has been addressed in [10] in the framework of security analysis and of purely generative probabilistic systems. There, the quantifiable amount of distinguishability between process models is defined via a notion of approximate confinement, corresponding to a statistical measure of the power of the observer.
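To make the idea of "a measure of the approximation" concrete, the hypothetical sketch below (a crude stand-in, not the paper's weak-bisimulation construction) compares the one-step transition distributions of two labelled Markov chains sharing a state space and reports the largest discrepancy, which could serve as the ε of an approximate bisimulation and as diagnostic information pointing at the offending state/label pair.

```python
def step_discrepancy(t1, t2):
    # t1, t2: dicts state -> (dict label -> probability).
    # Returns the max per-state, per-label difference between the
    # one-step transition probabilities of the two chains.
    eps = 0.0
    for s in t1:
        labels = set(t1[s]) | set(t2.get(s, {}))
        for a in labels:
            diff = abs(t1[s].get(a, 0.0) - t2.get(s, {}).get(a, 0.0))
            eps = max(eps, diff)
    return eps

# System views with a component active vs. deactivated: the chains
# differ by at most 0.02, i.e., they are "bisimilar up to eps = 0.02".
active   = {"s0": {"ok": 0.98, "fail": 0.02}}
inactive = {"s0": {"ok": 1.0}}
print(step_discrepancy(active, inactive))
```

A small ε here plays the same role as the statistical measure of observer power in the approximate-confinement approach of [10]: it bounds how well any observer can tell the two views apart in one step.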