Results 1 - 10 of 50
Lossy Trapdoor Functions and Their Applications
 Electronic Colloquium on Computational Complexity, Report No. 80 (2007)
Cited by 79 (17 self)
Abstract
We propose a new general primitive called lossy trapdoor functions (lossy TDFs), and realize it under a variety of different number-theoretic assumptions, including hardness of the decisional Diffie-Hellman (DDH) problem and the worst-case hardness of standard lattice problems. Using lossy TDFs, we develop a new approach for constructing many important cryptographic primitives, including standard trapdoor functions, CCA-secure cryptosystems, collision-resistant hash functions, and more. All of our constructions are simple, efficient, and black-box. Taken all together, these results resolve some long-standing open problems in cryptography. They give the first known (injective) trapdoor functions based on problems not directly related to integer factorization, and provide the first known CCA-secure cryptosystem based solely on worst-case lattice assumptions.
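As a minimal toy illustration of the lossiness idea above (not the paper's DDH- or lattice-based construction; the modulus and constants are made up for the example): an injective branch is a bijection on its domain, while a lossy branch compresses the domain into a far smaller image, destroying most information about the input.

```python
# Toy illustration of "lossiness". Illustrative parameters only; the real
# primitive realizes both branches from DDH or lattice assumptions.

P = 251  # small prime modulus (illustrative)

def injective_branch(x):
    # f(x) = a*x + b mod P with gcd(a, P) = 1 is a bijection on Z_P,
    # so a trapdoor holder can invert it.
    return (7 * x + 3) % P

def lossy_branch(x):
    # g(x) = floor(x / 16): many inputs share each output, so the image
    # is much smaller than the domain and x cannot be recovered.
    return x // 16

domain = range(P)
print(len({injective_branch(x) for x in domain}))  # 251: no collisions
print(len({lossy_branch(x) for x in domain}))      # 16: heavy compression
```

In the actual primitive, the two branches must additionally be computationally indistinguishable given only the public function description, which is where the DDH or LWE assumption does the work.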
Bonsai Trees, or How to Delegate a Lattice Basis
, 2010
Cited by 65 (5 self)
Abstract
We introduce a new lattice-based cryptographic structure called a bonsai tree, and use it to resolve some important open problems in the area. Applications of bonsai trees include: • An efficient, stateless ‘hash-and-sign’ signature scheme in the standard model (i.e., no random oracles), and • The first hierarchical identity-based encryption (HIBE) scheme (also in the standard model) that does not rely on bilinear pairings. Interestingly, the abstract properties of bonsai trees seem to have no known realization in conventional number-theoretic cryptography.
Efficient lattice (H)IBE in the standard model
 In EUROCRYPT 2010, LNCS
, 2010
Cited by 52 (10 self)
Abstract
We construct an efficient identity-based encryption system based on the standard learning with errors (LWE) problem. Our security proof holds in the standard model. The key step in the construction is a family of lattices for which there are two distinct trapdoors for finding short vectors. One trapdoor enables the real system to generate short vectors in all lattices in the family. The other trapdoor enables the simulator to generate short vectors for all lattices in the family except for one. We extend this basic technique to an adaptively-secure IBE and a hierarchical IBE.
Generating Shorter Bases for Hard Random Lattices
, 2009
Cited by 38 (6 self)
Abstract
We revisit the problem of generating a “hard” random lattice together with a basis of relatively short vectors. This problem has gained in importance lately due to new cryptographic schemes that use such a procedure for generating public/secret key pairs. In these applications, a shorter basis directly corresponds to milder underlying complexity assumptions and smaller key sizes. The contributions of this work are twofold. First, using the Hermite normal form as an organizing principle, we simplify and generalize an approach due to Ajtai (ICALP 1999). Second, we improve the construction and its analysis in several ways, most notably by tightening the length of the output basis essentially to the optimum value.
Lattice basis delegation in fixed dimension and shorter-ciphertext hierarchical IBE
 In Advances in Cryptology — CRYPTO 2010, Springer LNCS 6223
, 2010
Cited by 30 (7 self)
Abstract
We present a technique for delegating a short lattice basis that has the advantage of keeping the lattice dimension unchanged upon delegation. Building on this result, we construct two new hierarchical identity-based encryption (HIBE) schemes, with and without random oracles. The resulting systems are very different from earlier lattice-based HIBEs and in some cases result in shorter ciphertexts and private keys. We prove security from classic lattice hardness assumptions.
Candidate Multilinear Maps from Ideal Lattices and Applications
, 2012
Cited by 22 (3 self)
Abstract
We describe plausible lattice-based constructions with properties that approximate the sought-after multilinear maps in hard-discrete-logarithm groups, and show that some applications of such multilinear maps can be realized using our approximations. The security of our constructions relies on seemingly hard problems in ideal lattices, which can be viewed as extensions of the assumed hardness of the NTRU function.
Limits on the hardness of lattice problems in ℓp norms
 In IEEE Conference on Computational Complexity
, 2007
Cited by 18 (11 self)
Abstract
In recent years, several papers have established limits on the computational difficulty of lattice problems, focusing primarily on the ℓ2 (Euclidean) norm. We demonstrate close analogues of these results in ℓp norms, for every 2 < p ≤ ∞. In particular, for lattices of dimension n: • Approximating the closest vector problem, the shortest vector problem, and other related problems to within O(√n) factors (or O(√n log n) factors, for p = ∞) is in coNP. • Approximating the closest vector and bounded distance decoding problems with preprocessing to within O(√n) factors can be accomplished in deterministic polynomial time. • Approximating several problems (such as the shortest independent vectors problem) to within Õ(n) factors in the worst case reduces to solving the average-case problems defined in prior works (Ajtai, STOC 1996; Micciancio and Regev, SIAM J. on Computing 2007; Regev, STOC 2005). Our results improve prior approximation factors for ℓp norms by up to √n factors. Taken all together, they complement recent reductions from the ℓ2 norm to ℓp norms (Regev and Rosen, STOC 2006), and provide some evidence that lattice problems in ℓp norms (for p > 2) may not be substantially harder than they are in the ℓ2 norm. One of our main technical contributions is a very general analysis of Gaussian distributions over lattices, which may be of independent interest. Our proofs employ analytical techniques of Banaszczyk that, to our knowledge, have yet to be exploited in computer science.
Realizing hash-and-sign signatures under standard assumptions
 In Advances in Cryptology – EUROCRYPT ’09, volume 5479 of LNCS
, 2009
Cited by 18 (6 self)
Abstract
Currently, there are relatively few instances of “hash-and-sign” signatures in the standard model. Moreover, most current instances rely on strong and less studied assumptions such as the Strong RSA and q-Strong Diffie-Hellman assumptions. In this paper, we present a new approach for realizing hash-and-sign signatures in the standard model. In our approach, a signer associates each signature with an index i that represents how many signatures that signer has issued up to that point. Then, to make use of this association, we create simple and efficient techniques that restrict an adversary which makes q signature requests to forge on an index no greater than 2^⌈lg(q)⌉ < 2q. Finally, we develop methods for dealing with this restricted adversary. Our approach requires that the signer maintain a small amount of state: a counter of the number of signatures issued. We achieve two new realizations for hash-and-sign signatures, respectively based on the RSA assumption and the Computational Diffie-Hellman assumption in bilinear groups.
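The bound quoted above, 2^⌈lg(q)⌉ < 2q, is easy to check numerically: 2^⌈lg(q)⌉ is the smallest power of two at least q, and doubling q always clears it. The helper name below is illustrative, not from the paper.

```python
import math

def forgery_index_bound(q):
    # 2^ceil(lg(q)): the smallest power of two that is >= q.
    return 2 ** math.ceil(math.log2(q))

# The forgery index bound always sits between q and 2q (exclusive).
for q in range(1, 1 << 14):
    assert q <= forgery_index_bound(q) < 2 * q

print(forgery_index_bound(1000))  # 1024
```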
Short and stateless signatures from the RSA assumption
 In Proceedings of Advances in Cryptology, CRYPTO
Cited by 14 (0 self)
Abstract
We present the first signature scheme which is “short”, stateless, and secure under the RSA assumption in the standard model. Prior short, standard-model signatures in the RSA setting required either a strong complexity assumption such as Strong RSA or (recently) that the signer maintain state. A signature in our scheme is comprised of one element in Z_N^* and one integer. The public key is also short, requiring only the modulus N, one element of Z_N^*, one integer, one PRF seed, and some short chameleon hash parameters. To design our signature, we employ the known generic construction of fully-secure signatures from weakly-secure signatures and a chameleon hash. We then introduce a new proof technique for reasoning about weakly-secure signatures. This technique enables the simulator to predict a prefix of the message on which the adversary will forge and to use knowledge of this prefix to embed the challenge. This technique has wider applications beyond RSA. We also use it to provide an entirely new analysis of the security of the Waters signatures: the only short, stateless signatures known to be secure under the Computational Diffie-Hellman assumption in the standard model.
Public-key encryption schemes with auxiliary inputs
 In TCC, 2010
Cited by 13 (4 self)
Abstract
We construct public-key cryptosystems that remain secure even when the adversary is given any computationally uninvertible function of the secret key as auxiliary input (even one that may reveal the secret key information-theoretically). Our schemes are based on the decisional Diffie-Hellman (DDH) and the Learning with Errors (LWE) problems. As an independent technical contribution, we extend the Goldreich-Levin theorem to provide a hardcore (pseudorandom) value over large fields.
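For context on the extension mentioned above: the classic Goldreich-Levin hardcore predicate works over GF(2), where the bit hc(x, r) = ⟨x, r⟩ mod 2 is unpredictable given f(x) and a random r whenever f is one-way. The sketch below computes only this standard GF(2) predicate (the paper's contribution is the large-field analogue, which is not shown here); the inputs are illustrative.

```python
def gl_hardcore_bit(x_bits, r_bits):
    # <x, r> mod 2: XOR of the bits of x selected by the 1-positions of r.
    assert len(x_bits) == len(r_bits)
    return sum(a & b for a, b in zip(x_bits, r_bits)) % 2

# Illustrative 4-bit inputs.
print(gl_hardcore_bit([1, 0, 1, 1], [1, 1, 0, 1]))  # 0
print(gl_hardcore_bit([1, 0, 1, 1], [0, 1, 1, 0]))  # 1
```

Over a large field F, the analogous value ⟨x, r⟩ in F carries many bits at once, which is what makes the large-field extension useful for the auxiliary-input setting.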