Lossy Trapdoor Functions and Their Applications (2007)

by Chris Peikert, Brent Waters
Results 1 - 10 of 126

Fully homomorphic encryption using ideal lattices

by Craig Gentry - In Proc. STOC, 2009
"... We propose a fully homomorphic encryption scheme – i.e., a scheme that allows one to evaluate circuits over encrypted data without being able to decrypt. Our solution comes in three steps. First, we provide a general result – that, to construct an encryption scheme that permits evaluation of arbitra ..."
Abstract - Cited by 663 (17 self)
We propose a fully homomorphic encryption scheme – i.e., a scheme that allows one to evaluate circuits over encrypted data without being able to decrypt. Our solution comes in three steps. First, we provide a general result – that, to construct an encryption scheme that permits evaluation of arbitrary circuits, it suffices to construct an encryption scheme that can evaluate (slightly augmented versions of) its own decryption circuit; we call a scheme that can evaluate its (augmented) decryption circuit bootstrappable. Next, we describe a public key encryption scheme using ideal lattices that is almost bootstrappable. Lattice-based cryptosystems typically have decryption algorithms with low circuit complexity, often dominated by an inner product computation that is in NC¹. Also, ideal lattices provide both additive and multiplicative homomorphisms (modulo a public-key ideal in a polynomial ring that is represented as a lattice), as needed to evaluate general circuits. Unfortunately, our initial scheme is not quite bootstrappable – i.e., the depth that the scheme can correctly evaluate can be logarithmic in the lattice dimension, just like the depth of the decryption circuit, but the latter is greater than the former. In the final step, we show how to modify the scheme to reduce the depth of the decryption circuit, and thereby obtain a bootstrappable encryption scheme, without reducing the depth that the scheme can evaluate. Abstractly, we accomplish this by enabling the encrypter to start the decryption process, leaving less work for the decrypter, much like the server leaves less work for the decrypter in a server-aided cryptosystem.

Citation Context

...oofs of semantic security are Benaloh [8], Naccache-Stern [42], Okamoto-Uchiyama [46], Paillier [47], and Damgård-Jurik [19]. Some additively homomorphic encryption schemes use lattices or linear codes [22, 50, 27, 36, 37, 4]. ElGamal [20] is multiplicatively homomorphic. Semantically secure schemes that allow both addition and multiplication include Boneh-Goh-Nissim [11] (quadratic formulas) and “Polly Cracker” by Fellow...
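To make the bootstrapping discussion above concrete, the following toy sketch shows a somewhat-homomorphic scheme. It is not Gentry's ideal-lattice construction, but a much simpler symmetric scheme over the integers (in the spirit of later "FHE over the integers" work), chosen only because it exhibits the same phenomenon: ciphertexts support addition and multiplication until the accumulated noise exceeds the decryption threshold, which is precisely the limit that bootstrapping is designed to overcome. All parameters are illustrative.

import random

# Toy somewhat-homomorphic scheme over the integers (illustrative only, not
# Gentry's ideal-lattice scheme). A bit m is hidden as c = m + 2*r + p*q,
# where the odd integer p is the secret key. Decryption computes
# (c mod p) mod 2, which is correct while the noise m + 2*r stays below p.

def keygen(p_bits=64):
    # Secret key: a random odd p_bits-bit integer.
    return random.getrandbits(p_bits) | 1 | (1 << (p_bits - 1))

def encrypt(p, m, noise_bits=16, q_bits=128):
    r = random.getrandbits(noise_bits)   # small noise
    q = random.getrandbits(q_bits)       # large random multiplier
    return m + 2 * r + p * q

def decrypt(p, c):
    return (c % p) % 2

p = keygen()
c0, c1 = encrypt(p, 0), encrypt(p, 1)

# Adding ciphertexts XORs the bits, multiplying ANDs them; the noise grows
# with each operation, so only circuits of limited depth decrypt correctly.
print(decrypt(p, c0 + c1))   # 1  (0 XOR 1)
print(decrypt(p, c0 * c1))   # 0  (0 AND 1)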

On Lattices, Learning with Errors, Random Linear Codes, and Cryptography

by Oded Regev - In STOC, 2005
"... Our main result is a reduction from worst-case lattice problems such as SVP and SIVP to a certain learning problem. This learning problem is a natural extension of the ‘learning from parity with error’ problem to higher moduli. It can also be viewed as the problem of decoding from a random linear co ..."
Abstract - Cited by 364 (6 self)
Our main result is a reduction from worst-case lattice problems such as SVP and SIVP to a certain learning problem. This learning problem is a natural extension of the ‘learning from parity with error’ problem to higher moduli. It can also be viewed as the problem of decoding from a random linear code. This, we believe, gives a strong indication that these problems are hard. Our reduction, however, is quantum. Hence, an efficient solution to the learning problem implies a quantum algorithm for SVP and SIVP. A main open question is whether this reduction can be made classical. We also present a (classical) public-key cryptosystem whose security is based on the hardness of the learning problem. By the main result, its security is also based on the worst-case quantum hardness of SVP and SIVP. Previous lattice-based public-key cryptosystems such as the one by Ajtai and Dwork were based only on unique-SVP, a special case of SVP. The new cryptosystem is much more efficient than previous cryptosystems: the public key is of size Õ(n²) and encrypting a message increases its size by a factor of Õ(n) (in previous cryptosystems these values are Õ(n⁴) and Õ(n²), respectively). In fact, under the assumption that all parties share a random bit string of length Õ(n²), the size of the public key can be reduced to Õ(n).

Citation Context

...ost the entire secret key is leaked. Another line of work focussed on the design of other cryptographic protocols whose security is based on the hardness of the LWE problem. First, Peikert and Waters [35] constructed, among other things, CCA-secure cryptosystems (see also [33] for a simpler construction). These are cryptosystems that are secure even if the adversary is allowed access to a decryption ...
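The sketch below is a minimal, illustrative rendering of the kind of LWE-based encryption the abstract describes: the public key is a set of noisy random linear equations (A, b = As + e mod q), and a single bit is encrypted by summing a random subset of those equations and offsetting the result by q/2. The parameters are toy values chosen for readability, not the Õ(n²)-sized keys or hardness-backed parameters discussed in the paper.

import numpy as np

# Toy Regev-style LWE encryption with illustrative parameters (no concrete
# security claim). Public key: (A, b = A*s + e mod q); a bit is encrypted by
# summing a random subset of the rows and adding bit*(q//2) to the b-part.

rng = np.random.default_rng(0)
n, m, q = 16, 200, 3329      # dimension, number of samples, modulus
sigma = 1.0                  # noise width (toy value)

s = rng.integers(0, q, size=n)                     # secret
A = rng.integers(0, q, size=(m, n))
e = np.rint(rng.normal(0, sigma, size=m)).astype(int)
b = (A @ s + e) % q                                # noisy inner products

def encrypt(bit):
    r = rng.integers(0, 2, size=m)                 # random subset selector
    u = (r @ A) % q
    v = (r @ b + bit * (q // 2)) % q
    return u, v

def decrypt(u, v):
    d = (v - u @ s) % q                            # = r.e + bit*(q//2) mod q
    return int(min(d, q - d) > q // 4)             # round to 0 or q/2

u, v = encrypt(1)
print(decrypt(u, v))   # 1, as long as the accumulated noise stays below q/4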

A fully homomorphic encryption scheme

by Craig Gentry, 2009
"... ..."
Abstract - Cited by 208 (9 self)
Abstract not found

Trapdoors for Hard Lattices and New Cryptographic Constructions

by Craig Gentry, Chris Peikert, Vinod Vaikuntanathan, 2007
"... We show how to construct a variety of “trapdoor ” cryptographic tools assuming the worstcase hardness of standard lattice problems (such as approximating the shortest nonzero vector to within small factors). The applications include trapdoor functions with preimage sampling, simple and efficient “ha ..."
Abstract - Cited by 191 (26 self)
We show how to construct a variety of “trapdoor” cryptographic tools assuming the worst-case hardness of standard lattice problems (such as approximating the shortest nonzero vector to within small factors). The applications include trapdoor functions with preimage sampling, simple and efficient “hash-and-sign” digital signature schemes, universally composable oblivious transfer, and identity-based encryption. A core technical component of our constructions is an efficient algorithm that, given a basis of an arbitrary lattice, samples lattice points from a Gaussian-like probability distribution whose standard deviation is essentially the length of the longest vector in the basis. In particular, the crucial security property is that the output distribution of the algorithm is oblivious to the particular geometry of the given basis.
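The Gaussian sampling algorithm mentioned in the abstract (the Klein/GPV sampler) is built from repeated one-dimensional discrete Gaussian sampling over shifted copies of the integers, one coordinate per Gram-Schmidt vector of the basis. The sketch below shows only that one-dimensional building block, via simple rejection sampling; the full lattice sampler and its analysis are in the paper.

import math
import random

def sample_z(center, s, tail=12):
    """Sample from the discrete Gaussian D_{Z, s, center} by rejection:
    propose integers uniformly in a window of +/- tail*s around the center
    and accept x with probability exp(-pi * (x - center)^2 / s^2)."""
    lo = math.floor(center - tail * s)
    hi = math.ceil(center + tail * s)
    while True:
        x = random.randint(lo, hi)
        if random.random() < math.exp(-math.pi * (x - center) ** 2 / s ** 2):
            return x

# Samples concentrate within a few multiples of s around the (real) center.
print([sample_z(center=0.3, s=4.0) for _ in range(8)])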

Public-key cryptosystems from the worst-case shortest vector problem

by Chris Peikert, 2008
"... We construct public-key cryptosystems that are secure assuming the worst-case hardness of approximating the length of a shortest nonzero vector in an n-dimensional lattice to within a small poly(n) factor. Prior cryptosystems with worst-case connections were based either on the shortest vector probl ..."
Abstract - Cited by 152 (22 self)
We construct public-key cryptosystems that are secure assuming the worst-case hardness of approximating the length of a shortest nonzero vector in an n-dimensional lattice to within a small poly(n) factor. Prior cryptosystems with worst-case connections were based either on the shortest vector problem for a special class of lattices (Ajtai and Dwork, STOC 1997; Regev, J. ACM 2004), or on the conjectured hardness of lattice problems for quantum algorithms (Regev, STOC 2005). Our main technical innovation is a reduction from certain variants of the shortest vector problem to corresponding versions of the “learning with errors” (LWE) problem; previously, only a quantum reduction of this kind was known. In addition, we construct new cryptosystems based on the search version of LWE, including a very natural chosen ciphertext-secure system that has a much simpler description and tighter underlying worst-case approximation factor than prior constructions.

Citation Context

...larger q). The LWE problem is amazingly versatile. In addition to its first application in a public-key cryptosystem [Reg05], it has provided the foundation for chosen ciphertext-secure cryptosystems [PW08], identity-based encryption [GPV08], and universally composable oblivious transfer [PVW08], as well as for strong hardness of learning results relating to halfspaces [KS06]. We emphasize that all of th...

On ideal lattices and learning with errors over rings

by Vadim Lyubashevsky, Chris Peikert, Oded Regev - In Proc. of EUROCRYPT, volume 6110 of LNCS, 2010
"... The “learning with errors ” (LWE) problem is to distinguish random linear equations, which have been perturbed by a small amount of noise, from truly uniform ones. The problem has been shown to be as hard as worst-case lattice problems, and in recent years it has served as the foundation for a pleth ..."
Abstract - Cited by 125 (18 self)
The “learning with errors” (LWE) problem is to distinguish random linear equations, which have been perturbed by a small amount of noise, from truly uniform ones. The problem has been shown to be as hard as worst-case lattice problems, and in recent years it has served as the foundation for a plethora of cryptographic applications. Unfortunately, these applications are rather inefficient due to an inherent quadratic overhead in the use of LWE. A main open question was whether LWE and its applications could be made truly efficient by exploiting extra algebraic structure, as was done for lattice-based hash functions (and related primitives). We resolve this question in the affirmative by introducing an algebraic variant of LWE called ring-LWE, and proving that it too enjoys very strong hardness guarantees. Specifically, we show that the ring-LWE distribution is pseudorandom, assuming that worst-case problems on ideal lattices are hard for polynomial-time quantum algorithms. Applications include the first truly practical lattice-based public-key cryptosystem with an efficient security reduction; moreover, many of the other applications of LWE can be made much more efficient through the use of ring-LWE.
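For intuition, the sketch below generates a single ring-LWE sample in the commonly used ring R_q = Z_q[x]/(x^n + 1) with n a power of two (the paper handles more general cyclotomic rings; the parameters here are purely illustrative): a sample is a pair (a, b = a·s + e), where multiplication is negacyclic polynomial multiplication and e has small coefficients. The decision problem is to tell such pairs apart from uniformly random ones.

import numpy as np

# A single toy ring-LWE sample in R_q = Z_q[x]/(x^n + 1), n a power of two.
# Parameters are illustrative only.

rng = np.random.default_rng(1)
n, q = 8, 97

def poly_mul(a, b):
    # Negacyclic convolution: multiply modulo x^n + 1 (x^n wraps around with
    # a sign flip) and reduce coefficients modulo q.
    full = np.convolve(a, b)                  # degree up to 2n - 2
    low, high = full[:n], full[n:]
    high = np.concatenate([high, np.zeros(n - len(high), dtype=full.dtype)])
    return (low - high) % q

s = rng.integers(0, q, size=n)                # secret ring element
a = rng.integers(0, q, size=n)                # uniform ring element
e = rng.integers(-2, 3, size=n)               # small error coefficients
b = (poly_mul(a, s) + e) % q

# Decision ring-LWE: distinguish pairs (a, b) like this from uniform pairs.
print("a =", a)
print("b =", b)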

A Framework for Efficient and Composable Oblivious Transfer

by Chris Peikert, Vinod Vaikuntanathan, Brent Waters, 2008
"... ..."
Abstract - Cited by 119 (23 self)
Abstract not found

Simultaneous hardcore bits and cryptography against memory attacks

by Adi Akavia, Shafi Goldwasser, Vinod Vaikuntanathan - In TCC, 2009
"... This paper considers two questions in cryptography. Cryptography Secure Against Memory Attacks. A particularly devastating side-channel attack against cryptosystems, termed the “memory attack”, was proposed recently. In this attack, a significant fraction of the bits of a secret key of a cryptograp ..."
Abstract - Cited by 116 (11 self)
This paper considers two questions in cryptography. Cryptography Secure Against Memory Attacks. A particularly devastating side-channel attack against cryptosystems, termed the “memory attack”, was proposed recently. In this attack, a significant fraction of the bits of a secret key of a cryptographic algorithm can be measured by an adversary if the secret key is ever stored in a part of memory which can be accessed even after power has been turned off for a short amount of time. Such an attack has been shown to completely compromise the security of various cryptosystems in use, including the RSA cryptosystem and AES. We show that the public-key encryption scheme of Regev (STOC 2005), and the identity-based encryption scheme of Gentry, Peikert and Vaikuntanathan (STOC 2008) are remarkably robust against memory attacks where the adversary can measure a large fraction of the bits of the secret-key, or more generally, can compute an arbitrary function of the secret-key of bounded output length. This is done without increasing the size of the secret-key, and without introducing any

Citation Context

...ource) satisfying the definition of [9] imply trapdoor functions with many simultaneous hardcore bits. Together with the construction of deterministic encryption schemes from lossy trapdoor functions [36] (based on DDH and LWE), this gives us trapdoor functions based on DDH and LWE with many simultaneous hardcore bits. However, it seems that using this approach applied to the LWE instantiation, it is ...

Better key sizes (and attacks) for LWE-based encryption

by Richard Lindner, Chris Peikert - In CT-RSA, 2011
"... We analyze the concrete security and key sizes of theoretically sound lattice-based encryption schemes based on the “learning with errors ” (LWE) problem. Our main contributions are: (1) a new lattice attack on LWE that combines basis reduction with an enumeration algorithm admitting a time/success ..."
Abstract - Cited by 71 (7 self)
We analyze the concrete security and key sizes of theoretically sound lattice-based encryption schemes based on the “learning with errors” (LWE) problem. Our main contributions are: (1) a new lattice attack on LWE that combines basis reduction with an enumeration algorithm admitting a time/success tradeoff, which performs better than the simple distinguishing attack considered in prior analyses; (2) concrete parameters and security estimates for an LWE-based cryptosystem that is more compact and efficient than the well-known schemes from the literature. Our new key sizes are up to 10 times smaller than prior examples, while providing even stronger concrete security levels.

Lattice-based Cryptography

by Daniele Micciancio, Oded Regev, 2008
"... In this chapter we describe some of the recent progress in lattice-based cryptography. Lattice-based cryptographic constructions hold a great promise for post-quantum cryptography, as they enjoy very strong security proofs based on worst-case hardness, relatively efficient implementations, as well a ..."
Abstract - Cited by 66 (5 self)
In this chapter we describe some of the recent progress in lattice-based cryptography. Lattice-based cryptographic constructions hold a great promise for post-quantum cryptography, as they enjoy very strong security proofs based on worst-case hardness, relatively efficient implementations, as well as great simplicity. In addition, lattice-based cryptography is believed to be secure against quantum computers. Our focus here