Results 1–10 of 65
Boosting and differential privacy. 2010. Cited by 606 (10 self).

Abstract: Boosting is a general method for improving the accuracy of learning algorithms. We use boosting to construct improved privacy-preserving synopses of an input database. These are data structures that yield, for a given set Q of queries over an input database, reasonably accurate estimates of the responses to every query in Q, even when the number of queries is much larger than the number of rows in the database. Given a base synopsis generator that takes a distribution on Q and produces a “weak” synopsis that yields “good” answers for a majority of the weight in Q, our Boosting for Queries algorithm obtains a synopsis that is good for all of Q. We ensure privacy for the rows of the database, but the boosting is performed on the queries. We also provide the first synopsis generators for arbitrary sets of arbitrary low-sensitivity …
Yevgeniy Dodis, Leonid Reyzin, and Adam Smith. Fuzzy extractors: How to generate strong keys from biometrics and other noisy data. Technical Report 2003/235, Cryptology ePrint Archive, http://eprint.iacr.org, 2006; previous version appeared at EUROCRYPT 2004. Cited by 499 (37 self).

Abstract: We provide formal definitions and efficient secure techniques for
• turning noisy information into keys usable for any cryptographic application, and, in particular,
• reliably and securely authenticating biometric data.
Our techniques apply not just to biometric information, but to any keying material that, unlike traditional cryptographic keys, is (1) not reproducible precisely and (2) not distributed uniformly. We propose two primitives: a fuzzy extractor reliably extracts nearly uniform randomness R from its input; the extraction is error-tolerant in the sense that R will be the same even if the input changes, as long as it remains reasonably close to the original. Thus, R can be used as a key in a cryptographic application. A secure sketch produces public information about its input w that does not reveal w, and yet allows exact recovery of w given another value that is close to w. Thus, it can be used to reliably reproduce error-prone biometric inputs without incurring the security risk inherent in storing them. We define the primitives to be both formally secure and versatile, generalizing much prior work. In addition, we provide nearly optimal constructions of both primitives for various measures of “closeness” of input data, such as Hamming distance, edit distance, and set difference.
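The secure-sketch primitive described above can be illustrated with the paper's code-offset idea: publish s = w XOR c for a random codeword c, then decode to recover w from a nearby reading. The sketch below is purely a toy, instantiated for Hamming distance with a 3-fold repetition code (all function names are illustrative, not from the paper):

```python
import secrets

BLOCK = 3  # repetition factor: corrects 1 flipped bit per 3-bit block

def _encode(bits):
    # Repetition-code codeword: repeat each message bit BLOCK times.
    return [b for b in bits for _ in range(BLOCK)]

def _decode(bits):
    # Majority vote within each BLOCK-bit group.
    return [int(sum(bits[i:i + BLOCK]) > BLOCK // 2)
            for i in range(0, len(bits), BLOCK)]

def sketch(w):
    """Public sketch s = w XOR c for a fresh random codeword c."""
    assert len(w) % BLOCK == 0
    c = _encode([secrets.randbelow(2) for _ in range(len(w) // BLOCK)])
    return [wi ^ ci for wi, ci in zip(w, c)]

def recover(w_prime, s):
    """Recover the original w from a noisy reading w' and the sketch s."""
    noisy_c = [wi ^ si for wi, si in zip(w_prime, s)]  # = c XOR error
    c = _encode(_decode(noisy_c))                      # error-correct back to c
    return [ci ^ si for ci, si in zip(c, s)]           # c XOR s = w
```

With a real error-correcting code in place of the repetition code, the same two functions tolerate the error rates of actual biometric readings.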
Robust fuzzy extractors and authenticated key agreement from close secrets. In Advances in Cryptology — CRYPTO 2006, volume 4117 of LNCS, 2006. Cited by 68 (19 self).

Abstract: Consider two parties holding samples from correlated distributions W and W′, respectively, where these samples are within distance t of each other in some metric space. The parties wish to agree on a close-to-uniformly distributed secret key R by sending a single message over an insecure channel controlled by an all-powerful adversary who may read and modify anything sent over the channel. We consider both the keyless case, where the parties share no additional secret information, and the keyed case, where the parties share a long-term secret SK_BSM that they can use to generate a sequence of session keys {Rj} using multiple pairs {(Wj, W′j)}. The former has applications to, e.g., biometric authentication, while the latter arises in, e.g., the bounded-storage model with errors. We show solutions that improve upon previous work in several respects:
• The best prior solution for the keyless case with no errors (i.e., t = 0) requires the min-entropy of W to exceed 2n/3, where n is the bit-length of W. Our solution applies whenever the min-entropy of W exceeds the minimal threshold n/2, and yields a longer key.
• Previous solutions for the keyless case in the presence of errors (i.e., t > 0) required random oracles. We give the first constructions (for certain metrics) in the standard model.
• Previous solutions for the keyed case were stateful. We give the first stateless solution.
On notions of security for deterministic encryption, and efficient constructions without random oracles. Full version of this paper, 2008. Cited by 61 (0 self).

Abstract: The study of deterministic public-key encryption was initiated by Bellare et al. (CRYPTO ’07), who provided the “strongest possible” notion of security for this primitive (called PRIV) and constructions in the random oracle (RO) model. We focus on constructing efficient deterministic encryption schemes without random oracles. To do so, we propose a slightly weaker notion of security, saying that no partial information about encrypted messages should be leaked as long as each message is a priori hard to guess given the others (while PRIV did not have the latter restriction). Nevertheless, we argue that this version seems adequate for many practical applications. We show equivalence of this definition to single-message and indistinguishability-based ones, which are easier to work with. Then we give general constructions of both chosen-plaintext (CPA) and chosen-ciphertext-attack (CCA) secure deterministic encryption schemes, as well as efficient instantiations of them under standard number-theoretic assumptions. Our constructions build on the recently introduced framework of Peikert and Waters (STOC ’08) for constructing CCA-secure probabilistic encryption schemes, extending it to the deterministic-encryption setting as well.
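The abstract's requirement that messages be a priori hard to guess is forced by determinism itself: equal plaintexts always produce equal ciphertexts, so a guessable message can be confirmed offline. A minimal toy stand-in (the hypothetical `det_enc` below is not decryptable and is not the paper's construction; it only models the determinism):

```python
import hashlib

def det_enc(pk: bytes, m: bytes) -> bytes:
    # Toy deterministic "encryption": the same (pk, m) always yields the
    # same ciphertext. Stands in for any deterministic scheme.
    return hashlib.sha256(pk + m).digest()

pk = b"alice-public-key"
# Equal plaintexts produce equal ciphertexts, so an eavesdropper learns
# message equality for free.
assert det_enc(pk, b"yes") == det_enc(pk, b"yes")

# Worse, a guessable message can be recovered by offline trial encryption:
target = det_enc(pk, b"yes")
guesses = [b"yes", b"no", b"maybe"]
recovered = next(m for m in guesses if det_enc(pk, m) == target)
assert recovered == b"yes"
```

This is exactly the attack that a high-min-entropy message distribution rules out, which is why PRIV-style definitions quantify only over hard-to-guess messages.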
Rothblum: On Best-Possible Obfuscation. TCC, 2007. Cited by 43 (4 self).

Abstract: An obfuscator is a compiler that transforms any program (which we will view in this work as a boolean circuit) into an obfuscated program (also a circuit) that has the same input-output functionality as the original program, but is “unintelligible”. Obfuscation has applications for cryptography and for software protection. Barak et al. initiated a theoretical study of obfuscation, which focused on black-box obfuscation, where the obfuscated circuit should leak no information except for its (black-box) input-output functionality. A family of functionalities that cannot be obfuscated was demonstrated. Subsequent research has shown further negative results as well as positive results for obfuscating very specific families of circuits, all with respect to black-box obfuscation. This work is a study of a new notion of obfuscation, which we call best-possible obfuscation. Best-possible obfuscation makes the relaxed requirement that the obfuscated program leaks as little information as …
On cryptography with auxiliary input. 2009. Cited by 33 (4 self).

Abstract: We study the question of designing cryptographic schemes which are secure even if an arbitrary function f(sk) of the secret key is leaked, as long as the secret key sk is still (exponentially) hard to compute from this auxiliary input. This setting of auxiliary input is more general than the more traditional setting, which assumes that some information about the secret key sk may be leaked, but sk still has high min-entropy left. In particular, we deal with situations where f(sk) information-theoretically determines the entire secret key sk. As our main result, we construct CPA/CCA-secure symmetric encryption schemes that remain secure with exponentially hard-to-invert auxiliary input. We give several applications of such schemes.
• We construct an average-case obfuscator for the class of point functions, which remains secure with exponentially hard-to-invert auxiliary input, and is reusable.
• We construct a reusable and robust extractor that remains secure with exponentially hard-to-invert auxiliary input.
Our results rely on a new cryptographic assumption, Learning Subspace-with-Noise (LSN), which is related to the well-known Learning Parity-with-Noise (LPN) assumption.
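The gap between this auxiliary-input model and the traditional min-entropy model can be seen with a small illustration (assuming SHA-256 as the hard-to-invert leakage function; purely illustrative, not the paper's LSN-based construction):

```python
import hashlib
import secrets

# Auxiliary input: leak f(sk) = SHA-256(sk). For a random 256-bit key this
# almost surely determines sk information-theoretically (sk has no residual
# min-entropy given the leak), so the traditional leakage model gives no
# guarantee -- yet inverting f is believed computationally hard, which is
# the only property the auxiliary-input model requires.
sk = secrets.token_bytes(32)
leak = hashlib.sha256(sk).digest()

# The holder of sk can always re-derive the leak (it pins sk down uniquely
# with overwhelming probability), but a guessing attacker gains nothing:
assert hashlib.sha256(sk).digest() == leak
guess = secrets.token_bytes(32)
assert hashlib.sha256(guess).digest() != leak  # fails except w.p. ~2^-256
```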
The Practical Subtleties of Biometric Key Generation. In Proceedings of the 17th Annual USENIX Security Symposium, 2008. Cited by 23 (2 self).

Abstract: The inability of humans to generate and remember strong secrets makes it difficult for people to manage cryptographic keys. To address this problem, numerous proposals have been suggested to enable a human to repeatably generate a cryptographic key from her biometrics, where the strength of the key rests on the assumption that the measured biometrics have high entropy across the population. In this paper we show that, despite the fact that several researchers have examined the security of biometric key generators (BKGs), the common techniques used to argue the security of practical systems are lacking. To address this issue we re-examine two well-known, yet sometimes misunderstood, security requirements. We also present another that we believe has not received adequate attention in the literature, but is essential for practical biometric key generators. To demonstrate that each requirement has significant importance, we analyze three published schemes, and point out deficiencies in each. For example, in one case we show that failing to meet a requirement results in a construction where an attacker has a 22% chance of finding ostensibly 43-bit keys on her first guess. In another we show how an attacker who compromises a user’s cryptographic key can then infer that user’s biometric, thus revealing any other key generated using that biometric. We hope that by examining the pitfalls that occur continuously in the literature, we enable researchers and practitioners to more accurately analyze proposed constructions.
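The “22% on the first guess” observation is a statement about min-entropy rather than nominal key length: guessing security is governed by the most likely key, not by the number of bits in its encoding. A back-of-the-envelope check, using the figures quoted in the abstract:

```python
import math

# Ostensibly 43-bit keys have 2^43 possible values, but if one key occurs
# with probability 0.22 across the population, the relevant security measure
# is the min-entropy, -log2(p_max), not the 43-bit encoding length.
p_max = 0.22
min_entropy_bits = -math.log2(p_max)

assert min_entropy_bits < 2.2       # ~2.18 bits of security vs. one guess
assert 43 - min_entropy_bits > 40   # over 40 bits short of the nominal claim
```

This is why the paper argues that security claims for biometric key generators must be stated over the key distribution induced by the population, not over the key space.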
A unified approach to deterministic encryption: New constructions and a connection to computational entropy. TCC 2012, volume 7194 of LNCS, 2012. Cited by 22 (1 self).

Abstract: We propose a general construction of deterministic encryption schemes that unifies prior work and gives novel schemes. Specifically, its instantiations provide:
• A construction from any trapdoor function that has sufficiently many hardcore bits.
• A construction that provides “bounded” multi-message security from lossy trapdoor functions.
The security proofs for these schemes are enabled by three tools that are of broader interest:
• A weaker and more precise sufficient condition for semantic security on a high-entropy message distribution. Namely, we show that to establish semantic security on a distribution M of messages, it suffices to establish indistinguishability for all conditional distributions M|E, where E is an event of probability at least 1/4. (Prior work required indistinguishability on all distributions of a given entropy.)
• A result about computational entropy of conditional distributions. Namely, we show that conditioning on an event E of probability p reduces the quality of computational entropy by a factor of p and its quantity by log2(1/p).
• A generalization of the leftover hash lemma to correlated distributions.
We also extend our result about computational entropy to the average case, which is useful in reasoning about leakage-resilient cryptography: leaking λ bits of information reduces the quality of computational entropy by a factor of 2^λ and its quantity by λ.
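The claimed log2(1/p) loss in entropy quantity from conditioning can be checked concretely in the information-theoretic special case (a toy numeric sketch; the paper's statement is the computational analogue of this):

```python
import math

def min_entropy(dist):
    # H_inf(X) = -log2(max_x Pr[X = x])
    return -math.log2(max(dist.values()))

# Uniform distribution on 8-bit values: min-entropy is 8 bits.
X = {x: 1 / 256 for x in range(256)}
assert min_entropy(X) == 8.0

# Condition on the event E = {x < 16}, which has probability p = 1/16.
p = sum(pr for x, pr in X.items() if x < 16)
X_given_E = {x: pr / p for x, pr in X.items() if x < 16}

# The quantity of min-entropy drops by exactly log2(1/p) = 4 bits,
# matching the bound stated in the abstract.
assert min_entropy(X_given_E) == min_entropy(X) - math.log2(1 / p)
```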
Securely Obfuscating Re-encryption. Theory of Cryptography Conference (TCC), 2007. Cited by 19 (0 self).

Abstract: We present a positive obfuscation result for a traditional cryptographic functionality. This positive result stands in contrast to well-known impossibility results [3] for general obfuscation and recent impossibility and improbability [13] results for obfuscation of many cryptographic functionalities. Whereas other positive obfuscation results in the standard model apply to very simple point functions, our obfuscation result applies to the significantly more complex and widely used re-encryption functionality. This functionality takes a ciphertext for message m encrypted under Alice’s public key and transforms it into a ciphertext for the same message m under Bob’s public key. To overcome impossibility results and to make our results meaningful for cryptographic functionalities, our scheme satisfies a definition of obfuscation which incorporates more security-aware provisions.
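The re-encryption functionality itself (not this paper's obfuscation-secure scheme) can be sketched in the style of the classic Blaze–Bleumer–Strauss ElGamal approach, where a proxy holding a re-encryption key b/a converts ciphertexts under Alice's key to Bob's without seeing the plaintext. Toy parameters, far too small for any real security:

```python
# Toy group: safe prime p = 2q + 1; g = 4 generates the order-q subgroup
# of quadratic residues (q prime, so any residue != 1 generates it).
p, q, g = 467, 233, 4

def keygen(sk):
    return pow(g, sk, p)          # public key g^sk

def encrypt(pk, m, r):
    # BBS-style ciphertext: (m * g^r, pk^r) = (m * g^r, g^{a r})
    return (m * pow(g, r, p) % p, pow(pk, r, p))

def rekey(a, b):
    # Re-encryption key b / a mod q, handed to the proxy.
    return b * pow(a, -1, q) % q

def reencrypt(ct, rk):
    c1, c2 = ct
    return (c1, pow(c2, rk, p))   # g^{a r} -> g^{b r}; plaintext untouched

def decrypt(sk, ct):
    c1, c2 = ct
    gr = pow(c2, pow(sk, -1, q), p)   # (g^{sk r})^{1/sk} = g^r
    return c1 * pow(gr, -1, p) % p
```

Note the proxy only exponentiates; the obfuscation question the paper studies is whether such a transformation program can be published without leaking more than black-box access to it would.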
Structural Signatures for Tree Data Structures. 2008. Cited by 18 (5 self).

Abstract: Data sharing with multiple parties over a third-party distribution framework requires that both data integrity and confidentiality be assured. One of the most widely used data organization structures is the tree structure. When such structures encode sensitive information (such as in XML documents), it is crucial that integrity and confidentiality be assured not only for the content, but also for the structure. Digital signature schemes are commonly used to authenticate the integrity of the data. The most widely used such technique for tree structures is the Merkle hash technique, which however is known to be “not hiding”, thus leading to unauthorized leakage of information. Most techniques in the literature are based on the Merkle hash technique and thus suffer from the problem of unauthorized information leakage. Assurance of integrity and confidentiality (no leakage) of tree-structured data is an important problem in the context of secure data publishing and content distribution systems. In this paper, we propose a signature scheme for tree structures, which assures both confidentiality and integrity and is also efficient, especially in third-party distribution environments. Our integrity assurance technique, which we refer to as the “structural signature scheme”, is based on the structure of the tree as defined by tree traversals (preorder, postorder, inorder) and is defined using a randomized notion of such traversal numbers. In addition to formally defining the technique, we prove that it protects against violations of content and structural integrity and information leakage. We also show through complexity and performance analysis that the structural signature scheme is efficient; with respect to the Merkle hash technique, it incurs comparable cost for signing the trees and incurs lower cost for user-side integrity verification.
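The Merkle hash technique that the abstract contrasts with can be sketched in a few lines (a minimal toy; real schemes hash canonical serializations of nodes, and the tree shape here is hypothetical):

```python
import hashlib

def merkle_hash(node):
    """Merkle hash of a tree given as (content, [children])."""
    content, children = node
    h = hashlib.sha256()
    h.update(content.encode())
    for child in children:
        h.update(merkle_hash(child))  # child hashes are bound into the parent
    return h.digest()

# A small XML-like tree: a root with two subtrees.
tree = ("report", [("salary", [("secret-value", [])]), ("title", [])])
root = merkle_hash(tree)

# Integrity: tampering anywhere in the tree changes the signed root hash.
tampered = ("report", [("salary", [("other-value", [])]), ("title", [])])
assert merkle_hash(tampered) != root
```

The “not hiding” problem is visible in the structure of `merkle_hash`: to verify one subtree a user must be handed the hashes of its siblings, and those deterministic hashes are commitments an attacker can test guesses against and can use to infer the tree's shape, which is the leakage the structural signature scheme is designed to avoid.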