Results 1–10 of 49
Boosting and differential privacy
, 2010
Abstract

Cited by 293 (8 self)
Abstract—Boosting is a general method for improving the accuracy of learning algorithms. We use boosting to construct improved privacy-preserving synopses of an input database. These are data structures that yield, for a given set Q of queries over an input database, reasonably accurate estimates of the responses to every query in Q, even when the number of queries is much larger than the number of rows in the database. Given a base synopsis generator that takes a distribution on Q and produces a “weak” synopsis that yields “good” answers for a majority of the weight in Q, our Boosting for Queries algorithm obtains a synopsis that is good for all of Q. We ensure privacy for the rows of the database, but the boosting is performed on the queries. We also provide the first synopsis generators for arbitrary sets of arbitrary low-sensitivity …
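The reweighting loop the abstract describes can be illustrated with a toy Boosting-for-Queries round. Everything below is a hypothetical simplification: `make_weak` stands in for the base synopsis generator (it answers only the highest-weight queries exactly and returns 0.0 elsewhere), privacy is ignored entirely, and only the reweight-then-median structure follows the paper.

```python
import statistics

def boost_for_queries(true_vals, weak_synopsis, rounds=25, tol=0.05):
    """Toy boosting over queries: upweight queries the weak synopsis
    answered badly, then answer each query by its median over all rounds."""
    n = len(true_vals)
    weights = [1.0] * n
    history = []
    for _ in range(rounds):
        total = sum(weights)
        ans = weak_synopsis([w / total for w in weights])
        history.append(ans)
        for i in range(n):
            # multiplicative update: badly answered queries gain weight
            weights[i] *= 2.0 if abs(ans[i] - true_vals[i]) > tol else 0.5
    return [statistics.median(a[i] for a in history) for i in range(n)]

def make_weak(true_vals):
    # stand-in base generator: answers exactly the highest-weight queries
    # covering ~60% of the mass, and 0.0 for the rest
    def weak(dist):
        order = sorted(range(len(dist)), key=lambda i: -dist[i])
        covered, good = 0.0, set()
        for i in order:
            if covered >= 0.6:
                break
            good.add(i)
            covered += dist[i]
        return [true_vals[i] if i in good else 0.0 for i in range(len(dist))]
    return weak
```

Queries that the weak generator serves in a majority of rounds come out exactly right after the median step; the real algorithm additionally adds calibrated noise to the reweighting, since the per-query errors depend on the private rows and would otherwise leak.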
Fuzzy extractors: How to generate strong keys from biometrics and other noisy data. Technical Report 2003/235, Cryptology ePrint Archive, http://eprint.iacr.org, 2006. Previous version appeared at EUROCRYPT 2004
Yevgeniy Dodis, Leonid Reyzin, and Adam Smith
, 2004
Abstract

Cited by 292 (35 self)
We provide formal definitions and efficient secure techniques for • turning noisy information into keys usable for any cryptographic application, and, in particular, • reliably and securely authenticating biometric data. Our techniques apply not just to biometric information, but to any keying material that, unlike traditional cryptographic keys, is (1) not reproducible precisely and (2) not distributed uniformly. We propose two primitives: a fuzzy extractor reliably extracts nearly uniform randomness R from its input; the extraction is error-tolerant in the sense that R will be the same even if the input changes, as long as it remains reasonably close to the original. Thus, R can be used as a key in a cryptographic application. A secure sketch produces public information about its input w that does not reveal w, and yet allows exact recovery of w given another value that is close to w. Thus, it can be used to reliably reproduce error-prone biometric inputs without incurring the security risk inherent in storing them. We define the primitives to be both formally secure and versatile, generalizing much prior work. In addition, we provide nearly optimal constructions of both primitives for various measures of “closeness” of input data, such as Hamming distance, edit distance, and set difference.
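A minimal sketch of the classic code-offset construction for Hamming distance, the simplest instance of the secure-sketch/fuzzy-extractor pair described above. Assumptions to flag: a repetition code stands in for a real error-correcting code, SHA-256 over a public seed stands in for a strong randomness extractor, and the function names are ours.

```python
import hashlib
import secrets

R = 5  # repetition factor: corrects up to 2 flipped bits per 5-bit block

def rep_encode(bits):
    return [b for b in bits for _ in range(R)]

def rep_decode(bits):
    # majority vote within each block
    return [1 if sum(bits[i * R:(i + 1) * R]) > R // 2 else 0
            for i in range(len(bits) // R)]

def sketch(w):
    # code-offset: publish s = w XOR (random codeword); s hides w
    # up to the randomness of the chosen codeword
    c = rep_encode([secrets.randbelow(2) for _ in range(len(w) // R)])
    return [wi ^ ci for wi, ci in zip(w, c)]

def recover(w_noisy, s):
    # shift by the sketch, error-correct, shift back: recovers w exactly
    # whenever w_noisy is within the code's error radius of w
    c_noisy = [wi ^ si for wi, si in zip(w_noisy, s)]
    c = rep_encode(rep_decode(c_noisy))
    return [ci ^ si for ci, si in zip(c, s)]

def gen(w, seed):
    # fuzzy extractor Gen: derived key plus public helper data s
    s = sketch(w)
    return hashlib.sha256(seed + bytes(w)).digest(), s

def rep(w_noisy, s, seed):
    # fuzzy extractor Rep: same key from any close-enough reading
    return hashlib.sha256(seed + bytes(recover(w_noisy, s))).digest()
```

With a 20-bit reading, flipping one bit in each of two different blocks still reproduces the same key, while the helper data (s, seed) can be stored publicly.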
On notions of security for deterministic encryption, and efficient constructions without random oracles. Full version of this paper
, 2008
Abstract

Cited by 44 (0 self)
Abstract. The study of deterministic public-key encryption was initiated by Bellare et al. (CRYPTO ’07), who provided the “strongest possible” notion of security for this primitive (called PRIV) and constructions in the random oracle (RO) model. We focus on constructing efficient deterministic encryption schemes without random oracles. To do so, we propose a slightly weaker notion of security, saying that no partial information about encrypted messages should be leaked as long as each message is a priori hard to guess given the others (while PRIV did not have the latter restriction). Nevertheless, we argue that this version seems adequate for many practical applications. We show equivalence of this definition to single-message and indistinguishability-based ones, which are easier to work with. Then we give general constructions of both chosen-plaintext-attack (CPA) and chosen-ciphertext-attack (CCA) secure deterministic encryption schemes, as well as efficient instantiations of them under standard number-theoretic assumptions. Our constructions build on the recently introduced framework of Peikert and Waters (STOC ’08) for constructing CCA-secure probabilistic encryption schemes, extending it to the deterministic-encryption setting as well.
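Why determinism forces the hard-to-guess restriction can be seen with textbook RSA, which is deterministic when used without padding. The toy modulus below is purely illustrative: equal plaintexts always produce equal ciphertexts, so an adversary who can enumerate a small message space decrypts by trial re-encryption, with no need for the private key.

```python
# textbook RSA with a toy modulus (61 * 53); illustration only
p, q, e = 61, 53, 17
n = p * q
d = pow(e, -1, (p - 1) * (q - 1))     # private exponent

enc = lambda m: pow(m, e, n)          # deterministic: no randomness used
dec = lambda c: pow(c, d, n)

# equality of plaintexts is visible from ciphertexts alone
assert enc(42) == enc(42)

# a guessable message space makes ciphertexts invertible by trial encryption
target = enc(99)
recovered = next(m for m in range(n) if enc(m) == target)
assert recovered == 99 and dec(target) == 99
```

This is exactly the attack the PRIV-style definitions rule out by requiring each message to be a priori hard to guess.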
Robust fuzzy extractors and authenticated key agreement from close secrets
 In Advances in Cryptology — Crypto 2006, volume 4117 of LNCS
, 2006
Abstract

Cited by 37 (16 self)
Consider two parties holding samples from correlated distributions W and W′, respectively, where these samples are within distance t of each other in some metric space. The parties wish to agree on a close-to-uniformly distributed secret key R by sending a single message over an insecure channel controlled by an all-powerful adversary who may read and modify anything sent over the channel. We consider both the keyless case, where the parties share no additional secret information, and the keyed case, where the parties share a long-term secret SK that they can use to generate a sequence of session keys {Rj} using multiple pairs {(Wj, W′j)}. The former has applications to, e.g., biometric authentication, while the latter arises in, e.g., the bounded-storage model with errors. We show solutions that improve upon previous work in several respects:
• The best prior solution for the keyless case with no errors (i.e., t = 0) requires the min-entropy of W to exceed 2n/3, where n is the bit-length of W. Our solution applies whenever the min-entropy of W exceeds the minimal threshold n/2, and yields a longer key.
• Previous solutions for the keyless case in the presence of errors (i.e., t > 0) required random oracles. We give the first constructions (for certain metrics) in the standard model.
• Previous solutions for the keyed case were stateful. We give the first stateless solution.
On cryptography with auxiliary input
, 2009
Abstract

Cited by 19 (2 self)
We study the question of designing cryptographic schemes which are secure even if an arbitrary function f(sk) of the secret key is leaked, as long as the secret key sk is still (exponentially) hard to compute from this auxiliary input. This setting of auxiliary input is more general than the more traditional setting, which assumes that some information about the secret key sk may be leaked, but sk still has high min-entropy left. In particular, we deal with situations where f(sk) information-theoretically determines the entire secret key sk. As our main result, we construct CPA/CCA-secure symmetric encryption schemes that remain secure with exponentially hard-to-invert auxiliary input. We give several applications of such schemes:
• We construct an average-case obfuscator for the class of point functions, which remains secure with exponentially hard-to-invert auxiliary input, and is reusable.
• We construct a reusable and robust extractor that remains secure with exponentially hard-to-invert auxiliary input.
Our results rely on a new cryptographic assumption, Learning Subspace-with-Noise (LSN), which is related to the well-known Learning Parity-with-Noise (LPN) assumption.
Rothblum: On Best-Possible Obfuscation
 TCC
, 2007
Abstract

Cited by 18 (1 self)
Abstract. An obfuscator is a compiler that transforms any program (which we will view in this work as a boolean circuit) into an obfuscated program (also a circuit) that has the same input-output functionality as the original program, but is “unintelligible”. Obfuscation has applications for cryptography and for software protection. Barak et al. initiated a theoretical study of obfuscation, which focused on black-box obfuscation, where the obfuscated circuit should leak no information except for its (black-box) input-output functionality. A family of functionalities that cannot be obfuscated was demonstrated. Subsequent research has shown further negative results as well as positive results for obfuscating very specific families of circuits, all with respect to black-box obfuscation. This work is a study of a new notion of obfuscation, which we call best-possible obfuscation. Best-possible obfuscation makes the relaxed requirement that the obfuscated program leaks as little information as …
The Practical Subtleties of Biometric Key Generation
 In Proceedings of the 17th Annual USENIX Security Symposium
, 2008
Abstract

Cited by 15 (2 self)
The inability of humans to generate and remember strong secrets makes it difficult for people to manage cryptographic keys. To address this problem, numerous proposals have been suggested to enable a human to repeatably generate a cryptographic key from her biometrics, where the strength of the key rests on the assumption that the measured biometrics have high entropy across the population. In this paper we show that, despite the fact that several researchers have examined the security of biometric key generators (BKGs), the common techniques used to argue the security of practical systems are lacking. To address this issue we reexamine two well-known, yet sometimes misunderstood, security requirements. We also present another that we believe has not received adequate attention in the literature, but is essential for practical biometric key generators. To demonstrate that each requirement has significant importance, we analyze three published schemes, and point out deficiencies in each. For example, in one case we show that failing to meet a requirement results in a construction where an attacker has a 22% chance of finding ostensibly 43-bit keys on her first guess. In another we show how an attacker who compromises a user’s cryptographic key can then infer that user’s biometric, thus revealing any other key generated using that biometric. We hope that by examining the pitfalls that recur in the literature, we enable researchers and practitioners to more accurately analyze proposed constructions.
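The 22%-first-guess example is a statement about min-entropy rather than key length: what matters to a single-guess attacker is the most likely key, not the number of bits. A small hedged helper (names ours, data synthetic) makes the gap concrete.

```python
import math
from collections import Counter

def min_entropy(samples):
    """Empirical min-entropy H_min = -log2(max_k Pr[K = k]) of observed
    keys; a first-guess attacker succeeds with probability 2^-H_min."""
    counts = Counter(samples)
    p_max = max(counts.values()) / len(samples)
    return -math.log2(p_max), p_max

# 100 synthetic "43-bit" keys where one value appears 22 times: nominal
# length suggests 43 bits of security, min-entropy says just over 2 bits
keys = ["0" * 43] * 22 + [format(i, "043b") for i in range(1, 79)]
h, p = min_entropy(keys)
```

Security arguments for BKGs therefore need to measure the distribution of generated keys across the population, not just the key format.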
Securely Obfuscating Re-encryption
 In Theory of Cryptography Conference (TCC)
, 2007
Abstract

Cited by 12 (0 self)
We present a positive obfuscation result for a traditional cryptographic functionality. This positive result stands in contrast to well-known impossibility results [3] for general obfuscation and recent impossibility and improbability [13] results for obfuscation of many cryptographic functionalities. Whereas other positive obfuscation results in the standard model apply to very simple point functions, our obfuscation result applies to the significantly more complex and widely used re-encryption functionality. This functionality takes a ciphertext for message m encrypted under Alice’s public key and transforms it into a ciphertext for the same message m under Bob’s public key. To overcome impossibility results and to make our results meaningful for cryptographic functionalities, our scheme satisfies a definition of obfuscation which incorporates more security-aware provisions.
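The re-encryption functionality itself is easy to state. The toy below is a classic ElGamal-based proxy re-encryption in the style of Blaze–Bleumer–Strauss, not the (pairing-based) scheme of this paper, with deliberately tiny parameters; it only shows how a proxy holding a re-encryption key can move a ciphertext from Alice's key to Bob's without seeing the message.

```python
import secrets

# toy order-q subgroup of Z_p* (p = 2q + 1); real schemes use large groups
p, q, g = 467, 233, 4

def keygen():
    sk = secrets.randbelow(q - 1) + 1
    return sk, pow(g, sk, p)

def enc(pk, m):
    # ElGamal variant (m * g^r, pk^r): the second slot carries g^(sk*r)
    r = secrets.randbelow(q - 1) + 1
    return (m * pow(g, r, p)) % p, pow(pk, r, p)

def rekey(sk_a, sk_b):
    # re-encryption key b/a mod q; the proxy holding it learns neither key
    return (sk_b * pow(sk_a, -1, q)) % q

def reenc(rk, ct):
    c1, c2 = ct
    return c1, pow(c2, rk, p)   # (g^{ar})^{b/a} = g^{br}: now under Bob's key

def dec(sk, ct):
    c1, c2 = ct
    g_r = pow(c2, pow(sk, -1, q), p)   # strip sk: (g^{sk*r})^{1/sk} = g^r
    return (c1 * pow(g_r, -1, p)) % p
```

The obfuscation question the paper studies is whether a program computing `reenc` can be published without leaking more than black-box access to the functionality would.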
On pseudorandom generators with linear stretch in NC0
 In Proc. 10th RANDOM
, 2006
Abstract

Cited by 12 (5 self)
Abstract. We consider the question of constructing cryptographic pseudorandom generators (PRGs) in NC0, namely ones in which each bit of the output depends on just a constant number of input bits. Previous constructions of such PRGs were limited to stretching a seed of n bits to n + o(n) bits. This leaves open the existence of a PRG with a linear (let alone superlinear) stretch in NC0. In this work we study this question and obtain the following main results:
1. We show that the existence of a linear-stretch PRG in NC0 implies non-trivial hardness of approximation results without relying on PCP machinery. In particular, it implies that Max 3SAT is hard to approximate to within some multiplicative constant.
2. We construct a linear-stretch PRG in NC0 under a specific intractability assumption related to the hardness of decoding “sparsely generated” linear codes. Such an assumption was previously conjectured by Alekhnovich (FOCS 2003).
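Operationally, "in NC0" means every output bit is a fixed function of a constant-size subset of the seed bits. The sketch below is a toy with locality 5 (XOR of three seed bits plus an AND of two), in the spirit of Goldreich-style local generators; it illustrates the locality constraint only and is neither a secure nor a provably linear-stretch instantiation.

```python
import random

def make_taps(n, m, rng):
    # one fixed 5-subset of seed positions per output bit
    return [tuple(rng.sample(range(n), 5)) for _ in range(m)]

def local_prg(seed, taps):
    # locality-5 predicate per output bit:
    # x_a XOR x_b XOR x_c XOR (x_d AND x_e)
    return [seed[a] ^ seed[b] ^ seed[c] ^ (seed[d] & seed[e])
            for (a, b, c, d, e) in taps]
```

Here 32 seed bits stretch to 64 output bits while each output bit reads exactly 5 seed positions; the hard part, which the paper addresses, is choosing the predicate and tap structure so that the output is actually pseudorandom at linear stretch.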
On the Difficulties of Disclosure Prevention in Statistical Databases or The Case for Differential Privacy
, 2008
Abstract

Cited by 12 (2 self)
In 1977 Tore Dalenius articulated a desideratum for statistical databases: nothing about an individual should be learnable from the database that cannot be learned without access to the database. We give a general impossibility result showing that a natural formalization of Dalenius’ goal cannot be achieved if the database is useful. The key obstacle is the side information that may be available to an adversary. Our results hold under very general conditions regarding the database, the notion of privacy violation, and the notion of utility. Contrary to intuition, a variant of the result threatens the privacy even of someone not in the database. This state of affairs motivated the notion of differential privacy [15, 16], a strong ad omnia privacy which, intuitively, captures the increased risk to one’s privacy incurred by participating in a database.
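Differential privacy, unlike Dalenius' desideratum, is achievable constructively: for a counting query, whose answer changes by at most 1 when one person is added or removed, adding Laplace noise of scale 1/ε suffices. A minimal sketch of the standard Laplace mechanism (the function name is ours; nothing here is specific to the paper above):

```python
import random

def laplace_count(true_count, epsilon, rng=random):
    """epsilon-differentially private counting query: sensitivity 1,
    so Lap(1/epsilon) noise hides any one individual's presence.
    A Laplace(1/eps) draw is the difference of two Exp(eps) draws."""
    return true_count + rng.expovariate(epsilon) - rng.expovariate(epsilon)
```

Smaller ε means more noise and stronger privacy, and the privacy cost composes additively across queries, which is why answering very large query sets accurately (as in the boosting paper above) requires extra machinery.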