Results 1–10 of 344
Fuzzy extractors: How to generate strong keys from biometrics and other noisy data
Yevgeniy Dodis, Leonid Reyzin, and Adam Smith
Technical Report 2003/235, Cryptology ePrint Archive, http://eprint.iacr.org, 2006; previous version appeared at EUROCRYPT 2004
Cited by 532 (38 self)
Abstract: We provide formal definitions and efficient secure techniques for (1) turning noisy information into keys usable for any cryptographic application and, in particular, (2) reliably and securely authenticating biometric data. Our techniques apply not just to biometric information, but to any keying material that, unlike traditional cryptographic keys, is (1) not reproducible precisely and (2) not distributed uniformly. We propose two primitives: a fuzzy extractor reliably extracts nearly uniform randomness R from its input; the extraction is error-tolerant in the sense that R will be the same even if the input changes, as long as it remains reasonably close to the original. Thus, R can be used as a key in a cryptographic application. A secure sketch produces public information about its input w that does not reveal w, and yet allows exact recovery of w given another value that is close to w. Thus, it can be used to reliably reproduce error-prone biometric inputs without incurring the security risk inherent in storing them. We define the primitives to be both formally secure and versatile, generalizing much prior work. In addition, we provide nearly optimal constructions of both primitives for various measures of “closeness” of input data, such as Hamming distance, edit distance, and set difference.
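The code-offset idea behind the paper's Hamming-distance construction can be illustrated with a toy secure sketch: publish s = w XOR C(r) for a random codeword C(r); anyone holding a w' close to w can strip off s, decode away the error, and recover w exactly. A minimal Python sketch, with a 5x repetition code standing in for the paper's near-optimal codes (all parameters are illustrative):

```python
import secrets

# Toy code-offset secure sketch for Hamming distance. A 5x repetition
# code (corrects up to 2 flips per 5-bit block) stands in for the
# near-optimal error-correcting codes of the paper.
R, K = 5, 4          # repetition factor, message bits
N = R * K            # length of the noisy input w

def encode(m):
    return [b for b in m for _ in range(R)]

def decode(bits):
    # majority vote inside each block
    return [1 if sum(bits[i * R:(i + 1) * R]) > R // 2 else 0 for i in range(K)]

def sketch(w):
    # publish s = w XOR C(r) for a random message r
    r = [secrets.randbelow(2) for _ in range(K)]
    return [wi ^ ci for wi, ci in zip(w, encode(r))]

def recover(w_prime, s):
    # w' XOR s = C(r) + error; decode, then peel s off again to get w
    r = decode([wi ^ si for wi, si in zip(w_prime, s)])
    return [ci ^ si for ci, si in zip(encode(r), s)]

w = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 1, 0, 0, 1]
s = sketch(w)
w_prime = list(w); w_prime[3] ^= 1; w_prime[17] ^= 1   # two bit flips
assert recover(w_prime, s) == w
```

A fuzzy extractor then applies a randomness extractor (e.g., a universal hash with a public seed) to the recovered w to obtain the nearly uniform key R.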
A Fuzzy Commitment Scheme
ACM CCS '99, 1999
Cited by 327 (1 self)
Abstract: We combine well-known techniques from the areas of error-correcting codes and cryptography to achieve a new type of cryptographic primitive that we refer to as a fuzzy commitment scheme. Like a conventional cryptographic commitment scheme, our fuzzy commitment scheme is both concealing and binding: it is infeasible for an attacker to learn the committed value, and also for the committer to decommit a value in more than one way. In a conventional scheme, a commitment must be opened using a unique witness, which acts, essentially, as a decryption key. By contrast, our scheme is fuzzy in the sense that it accepts a witness that is close to the original encrypting witness in a suitable metric, but not necessarily identical. This characteristic of our fuzzy commitment scheme makes it useful for applications such as biometric authentication systems, in which data is subject to random noise. Because the scheme is tolerant of error, it is capable of protecting biometric data just as conventional cryptographic techniques, like hash functions, are used to protect alphanumeric passwords. This addresses a major outstanding problem in the theory of biometric authentication. We prove the security characteristics of our fuzzy commitment scheme relative to the properties of an underlying cryptographic hash function.
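The construction can be sketched in a few lines: commit to a random codeword c by publishing hash(c) together with the offset w XOR c; a witness w' close to w shifts the offset back to a noisy codeword that decodes to c. A toy Python version, with a repetition code standing in for a real error-correcting code and SHA-256 as the hash (parameters illustrative only):

```python
import hashlib
import secrets

R, K = 5, 4          # repetition factor and message length
N = R * K            # witness length in bits

def encode(m):
    return [b for b in m for _ in range(R)]

def nearest_codeword(bits):
    # majority vote inside each block, then re-encode
    m = [1 if sum(bits[i * R:(i + 1) * R]) > R // 2 else 0 for i in range(K)]
    return encode(m)

def commit(w):
    c = encode([secrets.randbelow(2) for _ in range(K)])  # random codeword
    h = hashlib.sha256(bytes(c)).hexdigest()              # binding part
    offset = [wi ^ ci for wi, ci in zip(w, c)]            # concealing part
    return h, offset

def open_with(w_prime, h, offset):
    noisy = [wi ^ oi for wi, oi in zip(w_prime, offset)]  # = c + (w XOR w')
    return hashlib.sha256(bytes(nearest_codeword(noisy))).hexdigest() == h

w = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 1, 0, 0, 1]
h, offset = commit(w)
close = list(w); close[0] ^= 1; close[7] ^= 1          # 2 flips, different blocks
far = list(w); far[0] ^= 1; far[1] ^= 1; far[2] ^= 1   # 3 flips in one block
assert open_with(close, h, offset)
assert not open_with(far, h, offset)
```

The error-correcting code absorbs noise within its decoding radius, while the hash binds the committer to a single codeword.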
A fuzzy vault scheme
In International Symposium on Information Theory (ISIT), 2002
Cited by 292 (1 self)
Abstract: We describe a simple and novel cryptographic construction that we refer to as a fuzzy vault. A player Alice may place a secret value κ in a fuzzy vault and “lock” it using a set A of elements from some public universe U. If Bob tries to “unlock” the vault using a set B of similar length, he obtains κ only if B is close to A, i.e., only if A and B overlap substantially. In contrast to previous constructions of this flavor, ours possesses the useful feature of order invariance, meaning that the ordering of A and B is immaterial to the functioning of the vault. As we show, our scheme enjoys provable security against a computationally unbounded attacker.
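A toy version of the locking/unlocking flow: κ becomes the coefficients of a polynomial over a small prime field, genuine points lie on the polynomial, chaff points lie off it, and Bob recovers κ when enough of his points match. Brute-force subset interpolation replaces the Reed-Solomon decoding of real instantiations; the field size, sets, and chaff below are illustrative only.

```python
import random
from itertools import combinations

P = 97  # toy prime field

def pmul(a, b):
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] = (out[i + j] + ai * bj) % P
    return out

def peval(coeffs, x):
    return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P

def interpolate(pts):
    # Lagrange interpolation over GF(P); coefficients lowest degree first
    coeffs = [0] * len(pts)
    for i, (xi, yi) in enumerate(pts):
        basis, denom = [1], 1
        for j, (xj, _) in enumerate(pts):
            if i != j:
                basis = pmul(basis, [(-xj) % P, 1])
                denom = denom * (xi - xj) % P
        scale = yi * pow(denom, -1, P) % P
        coeffs = [(c + scale * b) % P for c, b in zip(coeffs, basis)]
    return coeffs

def lock(secret, A, chaff_xs):
    genuine = [(a, peval(secret, a)) for a in A]
    chaff = [(x, (peval(secret, x) + 1) % P) for x in chaff_xs]  # off-curve
    vault = genuine + chaff
    random.Random(0).shuffle(vault)
    return vault

def unlock(vault, B, k):
    matched = [pt for pt in vault if pt[0] in B]
    for subset in combinations(matched, k):
        coeffs = interpolate(list(subset))
        # accept only if the candidate polynomial explains extra matches
        if sum(peval(coeffs, x) == y for x, y in matched) >= k + 1:
            return coeffs
    return None

secret = [42, 7, 13]                       # κ as polynomial coefficients
A = {2, 5, 11, 19, 23, 31, 37, 43}         # Alice's locking set
vault = lock(secret, A, chaff_xs=range(70, 80))
B = {2, 5, 11, 19, 23, 31, 50, 60}         # Bob's set: 6-element overlap
assert unlock(vault, B, k=3) == secret
assert unlock(vault, {70, 71, 72}, k=3) is None  # chaff alone fails
```

Order invariance falls out directly: only set membership of the x-coordinates matters, never the order in which A or B is presented.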
Revealing information while preserving privacy
In PODS, 2003
Cited by 268 (10 self)
Abstract: We examine the tradeoff between privacy and usability of statistical databases. We model a statistical database by an n-bit string d_1, …, d_n, with a query being a subset q ⊆ [n] to be answered by Σ_{i∈q} d_i. Our main result is a polynomial reconstruction algorithm of data from noisy (perturbed) subset sums. Applying this reconstruction algorithm to statistical databases, we show that in order to achieve privacy one has to add perturbation of magnitude Ω(√n). That is, smaller perturbation always results in a strong violation of privacy. We show that this result is tight by exhibiting access algorithms for statistical databases that preserve privacy while adding perturbation of magnitude Õ(√n). For time-T-bounded adversaries we demonstrate a privacy-preserving access algorithm whose perturbation magnitude is ≈ √T.
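The query model can be simulated directly: for a tiny database, enumerating every candidate consistent with E-perturbed subset-sum answers shows that the true d always survives the filter, and, per the paper's result, survivors tend to agree with d almost everywhere when E is well below √n. A brute-force toy (n, the query count, and E are illustrative; the paper's actual reconstruction algorithm is far more efficient than enumeration):

```python
import random
from itertools import product

rng = random.Random(0)
n = 10
d = [rng.randrange(2) for _ in range(n)]
E = 1  # perturbation bound, well below sqrt(n) ≈ 3.16

# random subset queries, each answered with noise of magnitude at most E
queries = [rng.sample(range(n), rng.randrange(1, n + 1)) for _ in range(30)]
answers = [sum(d[i] for i in q) + rng.choice([-E, 0, E]) for q in queries]

# every candidate database consistent (within E) with all noisy answers
consistent = [
    cand for cand in product([0, 1], repeat=n)
    if all(abs(sum(cand[i] for i in q) - a) <= E for q, a in zip(queries, answers))
]

# the true database always survives: each answer was perturbed by at most E
assert tuple(d) in consistent

# the paper's theorem says survivors differ from d in few positions;
# here we just measure the worst disagreement among them
worst = max(sum(x != y for x, y in zip(c, d)) for c in consistent)
```

With perturbation this small, the consistency constraints pin the database down almost entirely, which is exactly the privacy violation the Ω(√n) lower bound formalizes.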
An algebraic approach to IP traceback
ACM Transactions on Information and System Security, 2002
Cited by 226 (0 self)
Abstract: We present a new solution to the problem of determining the path a packet traversed over the Internet (called the traceback problem) during a denial-of-service attack. This paper reframes the traceback problem as a polynomial reconstruction problem and uses algebraic techniques from coding theory and learning theory to provide robust methods of transmission and reconstruction.
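The reframing can be illustrated in miniature: treat the router IDs along the path as coefficients of a polynomial over a prime field, let marked packets carry evaluation points, and let the victim interpolate. This toy omits the paper's noise tolerance and distributed per-router marking; the IDs and field size are hypothetical.

```python
P = 251  # toy prime field

def peval(coeffs, x):
    return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P

def pmul(a, b):
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] = (out[i + j] + ai * bj) % P
    return out

def interpolate(pts):
    # Lagrange interpolation over GF(P); coefficients lowest degree first
    coeffs = [0] * len(pts)
    for i, (xi, yi) in enumerate(pts):
        basis, denom = [1], 1
        for j, (xj, _) in enumerate(pts):
            if i != j:
                basis = pmul(basis, [(-xj) % P, 1])
                denom = denom * (xi - xj) % P
        scale = yi * pow(denom, -1, P) % P
        coeffs = [(c + scale * b) % P for c, b in zip(coeffs, basis)]
    return coeffs

path = [17, 42, 99]                                  # hypothetical router IDs
packets = [(x, peval(path, x)) for x in (1, 2, 3)]   # marked packets
assert interpolate(packets) == path                  # victim recovers the path
```

The coding-theory machinery in the paper handles the realistic case where some collected points are corrupted or injected by the attacker.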
Algebraic Soft-Decision Decoding of Reed-Solomon Codes
IEEE Trans. Inform. Theory, 2001
Cited by 157 (14 self)
Abstract: A polynomial-time soft-decision decoding algorithm for Reed-Solomon codes is developed.
Password Hardening Based on Keystroke Dynamics
International Journal of Information Security, 1999
Cited by 143 (8 self)
Abstract: We present a novel approach to improving the security of passwords. In our approach, the legitimate user’s typing patterns (e.g., durations of keystrokes and latencies between keystrokes) are combined with the user’s password to generate a hardened password that is convincingly more secure than conventional passwords alone. In addition, our scheme automatically adapts to gradual changes in a user’s typing patterns while maintaining the same hardened password across multiple logins, for use in file encryption or other applications requiring a long-term secret key. Using empirical data and a prototype implementation of our scheme, we give evidence that our approach is viable in practice, in terms of ease of use, improved security, and performance.
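The paper's actual scheme (instruction tables with polynomial secret sharing, which also adapts to gradual drift) is more elaborate; the core thresholding idea can be sketched as follows, with all thresholds, timings, and names hypothetical:

```python
import hashlib

# Sketch: each timing feature is compared to a per-user reference
# threshold to yield one bit, and the bits are mixed into the key
# derivation along with the password. Unlike the paper's scheme,
# this toy does not tolerate gradual drift in typing patterns.
THRESHOLDS_MS = [80, 120, 95, 110]   # assumed per-user reference values

def feature_bits(timings_ms):
    return ''.join('1' if t >= th else '0'
                   for t, th in zip(timings_ms, THRESHOLDS_MS))

def hardened_password(password, timings_ms):
    material = password + '|' + feature_bits(timings_ms)
    return hashlib.sha256(material.encode()).hexdigest()

# same password, similar typing rhythm -> same hardened password
k1 = hardened_password("hunter2", [85, 130, 90, 115])
k2 = hardened_password("hunter2", [90, 125, 88, 118])
assert k1 == k2
# a markedly different rhythm flips feature bits -> different key
k3 = hardened_password("hunter2", [60, 100, 120, 90])
assert k3 != k1
```

An attacker who steals the password alone must still guess the feature bits, which is what makes the hardened password stronger than the password by itself.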
Pseudorandom generators without the XOR Lemma
Madhu Sudan, Luca Trevisan, and Salil Vadhan
1998
Cited by 137 (21 self)
Abstract: Impagliazzo and Wigderson [IW97] have recently shown that if there exists a decision problem solvable in time 2^O(n) and having circuit complexity 2^Ω(n) (for all but finitely many n) then P = BPP. This result is a culmination of a series of works showing connections between the existence of hard predicates and the existence of good pseudorandom generators. The construction of Impagliazzo and Wigderson goes through three phases of “hardness amplification” (a multivariate polynomial encoding, a first derandomized XOR Lemma, and a second derandomized XOR Lemma) that are composed with the Nisan-Wigderson [NW94] generator. In this paper we present two different approaches to proving the main result of Impagliazzo and Wigderson. In developing each approach, we introduce new techniques and prove new results that could be useful in future improvements and/or applications of hardness-randomness tradeoffs. Our first result is that when (a modified version of) the Nisan-Wigderson generator construction is applied with a “mildly” hard predicate, the result is a generator that produces a distribution indistinguishable from having large min-entropy. An extractor can then be used to produce a distribution computationally indistinguishable from uniform. This is the first construction of a pseudorandom generator that works with a mildly hard predicate without doing hardness amplification. We then show that in the Impagliazzo-Wigderson construction only the first hardness-amplification phase (encoding with a multivariate polynomial) is necessary, since it already gives the required average-case hardness. We prove this result by (i) establishing a connection between the hardness-amplification problem and a list-decoding...
Unbalanced expanders and randomness extractors from Parvaresh-Vardy codes
In Proceedings of the 22nd Annual IEEE Conference on Computational Complexity, 2007
Cited by 126 (7 self)
Abstract: We give an improved explicit construction of highly unbalanced bipartite expander graphs with expansion arbitrarily close to the degree (which is polylogarithmic in the number of vertices). Both the degree and the number of right-hand vertices are polynomially close to optimal, whereas the previous constructions of Ta-Shma, Umans, and Zuckerman (STOC '01) required at least one of these to be quasipolynomial in the optimal. Our expanders have a short and self-contained description and analysis, based on the ideas underlying the recent list-decodable error-correcting codes of Parvaresh and Vardy (FOCS '05). Our expanders can be interpreted as near-optimal “randomness condensers,” that reduce the task of extracting randomness from sources of arbitrary min-entropy rate to extracting randomness from sources of min-entropy rate arbitrarily close to 1, which is a much easier task. Using this connection, we obtain a new construction of randomness extractors that is optimal up to constant factors, while being much simpler than the previous construction of Lu et al. (STOC '03) and improving upon it when the error parameter is small (e.g. 1/poly(n)).
An Efficient Public Key Traitor Tracing Scheme (Extended Abstract)
In CRYPTO '99, Springer LNCS 1666, 1999
Cited by 101 (4 self)
Abstract: We construct a public key encryption scheme in which there is one public encryption key, but many private decryption keys. If some digital content (e.g., a music clip) is encrypted using the public key and distributed through a broadcast channel, then each legitimate user can decrypt using its own private key. Furthermore, if a coalition of users collude to create a new decryption key then there is an efficient algorithm to trace the new key to its creators. Hence, our system provides a simple and efficient solution to the “traitor tracing problem”. Our tracing algorithm is deterministic, and catches all active traitors while never accusing innocent users, although it is only partially “black box”. A minor modification to the scheme enables it to resist an adaptive chosen ciphertext attack. Our techniques apply error-correcting codes to the discrete log representation problem.