Results 1–10 of 37
Dimensions and Principles of Declassification, 2005
Cited by 113 (14 self)

Abstract:
Computing systems often deliberately release (or declassify) sensitive information. A principal security concern for systems permitting information release is whether this release is safe: is it possible that the attacker compromises the information release mechanism and extracts more secret information than intended? While the security community has recognised the importance of the problem, the state of the art in information release is, unfortunately, a number of approaches with somewhat unconnected semantic goals. We provide a road map of the main directions of current research, by classifying the basic goals according to what information is released, who releases information, where in the system information is released, and when information can be released. With a general declassification framework as a long-term goal, we identify some prudent principles of declassification. These principles shed light on existing definitions and may also serve as useful "sanity checks" for emerging models.
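The "what" dimension above can be illustrated with a small sketch (all names hypothetical): a password check deliberately releases exactly one bit of the secret through an explicit declassification, while direct secret-to-public assignment stays forbidden.

```python
# Toy information-flow monitor (hypothetical names). A public sink
# rejects secret data outright; an explicit declassification releases
# only the result of an equality test, i.e. a single bit.

SECRET, PUBLIC = "secret", "public"

class Labeled:
    """A value tagged with a confidentiality label."""
    def __init__(self, value, label):
        self.value = value
        self.label = label

def assign_public(sink, labeled):
    """Model a public sink: only public data may flow into it."""
    if labeled.label == SECRET:
        raise PermissionError("illegal flow: secret -> public")
    sink.append(labeled.value)

def declassify_equals(secret, guess):
    """Deliberate release governed by a 'what' policy: exactly one
    bit (match / no match) of the secret escapes."""
    return Labeled(secret.value == guess, PUBLIC)

password = Labeled("hunter2", SECRET)
log = []
assign_public(log, declassify_equals(password, "letmein"))  # allowed
print(log)  # the released bit: [False]
```

The safety question the abstract raises is then whether the attacker can abuse `declassify_equals` (e.g. by repeated guessing) to extract more than the policy intends.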
Declassification: Dimensions and Principles. In Proceedings of the 18th IEEE Workshop on Computer Security Foundations (CSFW'05), 2005
Cited by 35 (9 self)

Abstract:
Computing systems often deliberately release (or declassify) sensitive information. A principal security concern for systems permitting information release is whether this release is safe: is it possible that the attacker compromises the information release mechanism and extracts more secret information than intended? While the security community has recognised the importance of the problem, the state of the art in information release is, unfortunately, a number of approaches with somewhat unconnected semantic goals. We provide a road map of the main directions of current research, by classifying the basic goals according to what information is released, who releases information, where in the system information is released, and when information can be released. With a general declassification framework as a long-term goal, we identify some prudent principles of declassification. These principles shed light on existing definitions and may also serve as useful "sanity checks" for emerging models.
Handling Encryption in an Analysis for Secure Information Flow. In Proc. European Symp. on Programming, volume 2618 of LNCS, 2003
Cited by 26 (0 self)

Abstract:
This paper presents a program analysis for secure information flow.
Computational Soundness of Observational Equivalence, 2008
Cited by 25 (7 self)

Abstract:
Many security properties are naturally expressed as indistinguishability between two versions of a protocol. In this paper, we show that computational proofs of indistinguishability can be considerably simplified, for a class of processes that covers most existing protocols. More precisely, we show a soundness theorem, following the line of research launched by Abadi and Rogaway in 2000: computational indistinguishability in presence of an active attacker is implied by the observational equivalence of the corresponding symbolic processes. We prove our result for symmetric encryption, but the same techniques can be applied to other security primitives such as signatures and public-key encryption. The proof requires the introduction of new concepts, which are general and can be reused in other settings.
Task-Structured Probabilistic I/O Automata, 2006
Cited by 18 (12 self)

Abstract:
Modeling frameworks such as Probabilistic I/O Automata (PIOA) and Markov Decision Processes permit both probabilistic and nondeterministic choices. In order to use such frameworks to express claims about probabilities of events, one needs mechanisms for resolving nondeterministic choices. For PIOAs, nondeterministic choices have traditionally been resolved by schedulers that have perfect information about the past execution. However, such schedulers are too powerful for certain settings, such as cryptographic protocol analysis, where information must sometimes be hidden. Here, we propose a new, less powerful nondeterminism-resolution mechanism for PIOAs, consisting of tasks and local schedulers. Tasks are equivalence classes of system actions that are scheduled by oblivious, global task sequences. Local schedulers resolve nondeterminism within system components, based on local information only. The resulting task-PIOA framework yields simple notions of external behavior and implementation, and supports simple compositionality results. We also define a new kind of simulation relation, and show it to be sound for proving implementation. We illustrate the potential of the task-PIOA framework by outlining its use in verifying an Oblivious Transfer protocol.
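The oblivious task-scheduling idea can be sketched as follows (a toy illustration with hypothetical names, not the task-PIOA formalism itself): the global schedule is a sequence of task names fixed before the run, so it cannot depend on the execution so far, and a local scheduler resolves which enabled action of the scheduled task fires.

```python
import random

# Toy oblivious task scheduling (hypothetical names): the global
# schedule is a sequence of task names fixed in advance; within a
# scheduled task, a local scheduler picks one enabled action.

class Component:
    def __init__(self, tasks):
        # tasks: task name -> list of enabled action names
        self.tasks = tasks
        self.trace = []

    def perform(self, task, rng):
        enabled = self.tasks.get(task, [])
        if not enabled:
            return  # the task has no enabled action and is skipped
        # local resolution: only this component's state is consulted
        self.trace.append(rng.choice(enabled))

def run(component, task_sequence, seed=0):
    """Execute a task sequence chosen obliviously (in advance)."""
    rng = random.Random(seed)
    for task in task_sequence:
        component.perform(task, rng)
    return component.trace

comp = Component({"send": ["send0", "send1"], "ack": ["ack"]})
print(run(comp, ["send", "ack", "send"]))
```

The point of the restriction is visible here: the task sequence `["send", "ack", "send"]` is chosen without seeing whether `send0` or `send1` fired, so the scheduler cannot leak hidden information through its choices.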
Cryptographically-Masked Flows, 2008
Cited by 17 (4 self)

Abstract:
Cryptographic operations are essential for many security-critical systems. Reasoning about information flow in such systems is challenging because typical (noninterference-based) information-flow definitions allow no flow from secret to public data. Unfortunately, this implies that programs with encryption are ruled out because encrypted output depends on secret inputs: the plaintext and the key. However, it is desirable to allow flows arising from encryption with secret keys provided that the underlying cryptographic algorithm is strong enough. In this article we conservatively extend the noninterference definition to allow safe encryption, decryption, and key generation. To illustrate the usefulness of this approach, we propose (and implement) a type system that guarantees noninterference for a small imperative language with primitive cryptographic operations. The type system prevents dangerous program behavior (e.g., giving away a secret key or confusing keys and non-keys), which we exemplify with secure implementations of cryptographic protocols. Because the model is based on a standard noninterference property, it allows us to develop some natural extensions. In particular, we consider public-key cryptography and integrity, which accommodate reasoning about primitives that are vulnerable to chosen-ciphertext attacks.
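The core idea, that encryption under a secret key may lower the label of its result while direct secret-to-public flows remain illegal, can be sketched in a few lines (toy code with hypothetical names; the "cipher" below is NOT real cryptography):

```python
import os
from hashlib import sha256

# Toy label system (hypothetical names). Direct secret -> public flows
# are rejected, but ciphertext produced under a secret key is given a
# public label. The "cipher" is a hash-derived pad: NOT real crypto.

SECRET, PUBLIC = "secret", "public"

class Labeled:
    def __init__(self, value, label):
        self.value, self.label = value, label

def check_flow(src_label, dst_label):
    """Noninterference check for a direct assignment."""
    if src_label == SECRET and dst_label == PUBLIC:
        raise PermissionError("illegal direct flow: secret -> public")

def encrypt(key, plaintext):
    """Masked flow: the result is labeled public even though both
    inputs are secret (justified only if the real cipher is strong)."""
    pad = sha256(key.value).digest()
    ct = bytes(a ^ b for a, b in zip(plaintext.value, pad))
    return Labeled(ct, PUBLIC)

key = Labeled(os.urandom(16), SECRET)
msg = Labeled(b"attack at dawn", SECRET)

ct = encrypt(key, msg)
check_flow(ct.label, PUBLIC)  # accepted: the flow is masked by encryption
# check_flow(msg.label, PUBLIC) would raise PermissionError
```

A type system in the article's spirit would additionally prevent misuse such as sending `key` itself to a public sink or using a non-key value as a key.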
Cryptographically Sound Security Proofs for Basic and Public-Key Kerberos. In Proc. 11th European Symp. on Research in Comp. Sec., 2006
Cited by 16 (4 self)

Abstract:
We present a computational analysis of basic Kerberos with and without its public-key extension PKINIT, in which we consider authentication and key secrecy properties. Our proofs rely on the Dolev-Yao-style model of Backes, Pfitzmann, and Waidner, which allows for mapping results obtained symbolically within this model to cryptographically sound proofs if certain assumptions are met. This work was the first verification at the computational level of such a complex fragment of an industrial protocol. By considering a recently fixed version of PKINIT, we extend symbolic correctness results we previously attained in the Dolev-Yao model to cryptographically sound results in the computational model.
Controller Synthesis for Probabilistic Systems. In Proceedings of IFIP TCS'2004, 2004
Cited by 14 (0 self)

Abstract:
Supported by the DFG project "VERIAM" and the DFG/NWO project "VOSS". Supported by the European Research Training Network "Games". Controller synthesis addresses the question of how to limit the internal behavior of a given implementation to meet its specification, regardless of the behavior enforced by the environment. In this paper, we consider a model with probabilism and nondeterminism where the nondeterministic choices in some states are assumed to be controllable, while the others are under the control of an unpredictable environment. We first consider probabilistic computation tree logic (PCTL) as a specification formalism, discuss the role of strategy types for the controller, and show the NP-hardness of the controller synthesis problem. The second part of the paper presents a controller synthesis algorithm for automata specifications which relies on a reduction to the synthesis problem for PCTL with fairness.
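The reachability core of such synthesis problems can be illustrated with a toy sketch (hypothetical MDP and numbers, not the paper's algorithm): value iteration computes, for each state, the maximal probability of reaching a goal together with the controller action that achieves it.

```python
# Toy MDP (hypothetical numbers): from state 0 the controller chooses
# action "a" or "b"; state 2 is the goal, state 3 an absorbing trap.
mdp = {
    0: {"a": [(0.5, 1), (0.5, 0)], "b": [(1.0, 0)]},
    1: {"a": [(0.9, 2), (0.1, 3)]},
}
GOAL = 2

def max_reach_prob(mdp, goal, iters=200):
    """Value iteration for the maximal probability of reaching `goal`,
    together with an optimal memoryless controller (policy)."""
    states = set(mdp) | {goal} | {t for acts in mdp.values()
                                  for succ in acts.values()
                                  for _, t in succ}
    v = {s: (1.0 if s == goal else 0.0) for s in states}
    policy = {}
    for _ in range(iters):
        for s, actions in mdp.items():
            best_a, best_p = max(
                ((a, sum(p * v[t] for p, t in succ))
                 for a, succ in actions.items()),
                key=lambda x: x[1])
            v[s], policy[s] = best_p, best_a
    return v, policy

values, policy = max_reach_prob(mdp, GOAL)
print(round(values[0], 6), policy)  # the controller always picks "a"
```

Here the synthesized controller resolves only the controllable choice in state 0; the probabilistic branching models the uncontrollable environment.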
A Probabilistic Applied Pi-Calculus, 2007
Cited by 12 (0 self)

Abstract:
We propose an extension of the Applied Pi-calculus by introducing nondeterministic and probabilistic choice operators. The semantics of the resulting model, in which probability and nondeterminism are combined, is given by Segala's Probabilistic Automata driven by schedulers which resolve the nondeterministic choice among the probability distributions over target states. Notions of static and observational equivalence are given for the enriched calculus. In order to model the possible interaction of a process with its surrounding environment, a labeled semantics is given together with a notion of weak bisimulation, which is shown to coincide with the observational equivalence. Finally, we prove that results in the probabilistic framework are preserved in a purely nondeterministic setting.
Encryption Cycles and Two Views of Cryptography. In NORDSEC 2002, Proceedings of the 7th Nordic Workshop on Secure IT Systems (Karlstad University Studies 2002:31), 2002
Cited by 12 (1 self)

Abstract:
The work by Abadi and Rogaway has started the process of bringing together the two approaches to cryptography, formal and computational. Their work has also shown that it is impossible to completely unify these two approaches in their typical forms: there are some principal differences in their security definitions. The difference is in the security of encryption cycles. An encryption cycle is a sequence of keys where each key is encrypted under the next one, and the last key is encrypted under the first one. In the formal treatment they are considered to be secure, but in the computational treatment, insecure. In this paper we make encryption cycles insecure in the formal model (the Dolev-Yao model) as well, by slightly strengthening the attacker. For the modified formal model and the classical computational model, the unifying results by Abadi and Rogaway hold unconditionally.
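The symbolic side of this gap can be sketched in a few lines (toy term syntax, hypothetical names): under the classical Dolev-Yao deduction relation, an attacker holding only the two ciphertexts of a key cycle derives neither key, which is exactly where the formal and computational treatments diverge.

```python
# Toy Dolev-Yao deduction (hypothetical term syntax). Terms are atoms
# (strings) or symbolic encryptions ('enc', message, key); deduction
# closes the attacker's knowledge under decryption with known keys.

def deduce(knowledge):
    known = set(knowledge)
    changed = True
    while changed:
        changed = False
        for t in list(known):
            if isinstance(t, tuple) and t[0] == "enc" and t[2] in known:
                if t[1] not in known:
                    known.add(t[1])   # decrypt with a known key
                    changed = True
    return known

# A key cycle of length two: k1 encrypted under k2, k2 under k1.
cycle = {("enc", "k1", "k2"), ("enc", "k2", "k1")}
derived = deduce(cycle)
print("k1" in derived, "k2" in derived)  # neither key is derivable
```

Computationally, however, such a cycle is not covered by standard encryption security notions; the paper's strengthened symbolic attacker is designed to make the two verdicts agree.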