Lossy Trapdoor Functions and Their Applications
Electronic Colloquium on Computational Complexity, Report No. 80, 2007
"... We propose a new general primitive called lossy trapdoor functions (lossy TDFs), and realize it under a variety of different number theoretic assumptions, including hardness of the decisional DiffieHellman (DDH) problem and the worstcase hardness of standard lattice problems. Using lossy TDFs, we ..."
Abstract

Cited by 125 (21 self)
We propose a new general primitive called lossy trapdoor functions (lossy TDFs), and realize it under a variety of different number-theoretic assumptions, including hardness of the decisional Diffie-Hellman (DDH) problem and the worst-case hardness of standard lattice problems. Using lossy TDFs, we develop a new approach for constructing many important cryptographic primitives, including standard trapdoor functions, CCA-secure cryptosystems, collision-resistant hash functions, and more. All of our constructions are simple, efficient, and black-box. Taken all together, these results resolve some long-standing open problems in cryptography. They give the first known (injective) trapdoor functions based on problems not directly related to integer factorization, and provide the first known CCA-secure cryptosystem based solely on worst-case lattice assumptions.
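As a toy illustration of the two-mode structure behind lossy TDFs (not the paper's DDH or lattice construction), the sketch below uses 2×2 matrices over a small prime field: the injective mode samples an invertible matrix whose inverse serves as the trapdoor, while the lossy mode samples a rank-1 matrix whose image is too small to determine the input. All parameters are illustrative, and the toy is of course not secure, since the two key distributions are easily distinguishable at this size.

```python
import random

P = 101  # tiny prime field; a real instantiation would use a cryptographic group

def keygen(lossy):
    """Sample a 2x2 matrix mod P: invertible (injective mode) or rank-1 (lossy mode)."""
    if lossy:
        u = [random.randrange(1, P) for _ in range(2)]
        v = [random.randrange(1, P) for _ in range(2)]
        # rank-1 matrix u * v^T: its image is a line containing only P points
        return [[u[i] * v[j] % P for j in range(2)] for i in range(2)], None
    while True:
        A = [[random.randrange(P) for _ in range(2)] for _ in range(2)]
        det = (A[0][0] * A[1][1] - A[0][1] * A[1][0]) % P
        if det:  # invertible: publish A, keep A^{-1} as the trapdoor
            d = pow(det, P - 2, P)  # modular inverse of det via Fermat
            inv = [[A[1][1] * d % P, -A[0][1] * d % P],
                   [-A[1][0] * d % P, A[0][0] * d % P]]
            return A, inv

def evaluate(A, x):
    """f_A(x) = A x mod P (also inverts, when A is the trapdoor matrix)."""
    return tuple(sum(A[i][j] * x[j] for j in range(2)) % P for i in range(2))
```

In injective mode `evaluate(trapdoor, evaluate(A, x))` recovers x; in lossy mode the P² possible inputs map onto at most P outputs, so the function statistically loses information. Actual security would require the two key distributions to be computationally indistinguishable.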
Public-Key Cryptosystems Resilient to Key Leakage
"... Most of the work in the analysis of cryptographic schemes is concentrated in abstract adversarial models that do not capture sidechannel attacks. Such attacks exploit various forms of unintended information leakage, which is inherent to almost all physical implementations. Inspired by recent sidec ..."
Abstract

Cited by 89 (6 self)
Most of the work in the analysis of cryptographic schemes is concentrated in abstract adversarial models that do not capture side-channel attacks. Such attacks exploit various forms of unintended information leakage, which is inherent to almost all physical implementations. Inspired by recent side-channel attacks, especially the “cold boot attacks” of Halderman et al. (USENIX Security ’08), Akavia, Goldwasser and Vaikuntanathan (TCC ’09) formalized a realistic framework for modeling the security of encryption schemes against a wide class of side-channel attacks in which adversarially chosen functions of the secret key are leaked. In the setting of public-key encryption, Akavia et al. showed that Regev’s lattice-based scheme (STOC ’05) is resilient to any leakage of …
Cryptography in NC0
2006
"... We study the parallel timecomplexity of basic cryptographic primitives such as oneway functions (OWFs) and pseudorandom generators (PRGs). Specifically, we study the possibility of implementing instances of these primitives by NC 0 functions, namely by functions in which each output bit depends on ..."
Abstract

Cited by 47 (11 self)
We study the parallel time-complexity of basic cryptographic primitives such as one-way functions (OWFs) and pseudorandom generators (PRGs). Specifically, we study the possibility of implementing instances of these primitives by NC0 functions, namely by functions in which each output bit depends on a constant number of input bits. Despite previous efforts in this direction, there has been no convincing theoretical evidence supporting this possibility, which was posed as an open question in several previous works. We essentially settle this question by providing strong positive evidence for the possibility of cryptography in NC0. Our main result is that every “moderately easy” OWF (resp., PRG), say computable in NC1, can be compiled into a corresponding OWF (resp., “low-stretch” PRG) in which each output bit depends on at most 4 input bits. The existence of OWFs and PRGs in NC1 is a relatively mild assumption, implied by most number-theoretic or algebraic intractability assumptions commonly used in cryptography. A similar compiler can also be obtained for other cryptographic primitives such as one-way permutations, encryption, signatures, commitment, and collision-resistant hashing. Our techniques can also be applied to obtain (unconditional) constructions of “non-cryptographic” PRGs. In particular, we obtain ε-biased generators and a PRG for space-bounded computation in which each output bit depends on only 3 input bits. Our results make use of the machinery of randomizing polynomials (Ishai and Kushilevitz, 41st FOCS, 2000), which was originally motivated by questions in the domain of information-theoretic secure multiparty computation.
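To make the locality notion concrete, here is a minimal sketch (not the paper's compiler, which goes through randomizing polynomials) of a function in which each output bit reads exactly 3 input bits. The particular local rule is an arbitrary illustrative choice with no security claim.

```python
def local_map(x):
    """y[i] = x[i] XOR (x[i+1] AND x[i+2]) (indices mod n): every output bit
    depends on exactly 3 input bits, i.e., the map is in NC0 with locality 3.
    The rule itself is illustrative only; no one-wayness is claimed."""
    n = len(x)
    return [x[i] ^ (x[(i + 1) % n] & x[(i + 2) % n]) for i in range(n)]

def reads(i, n):
    """The constant-size set of input positions that output bit i depends on."""
    return {i, (i + 1) % n, (i + 2) % n}
```

The compiled OWFs in the paper achieve locality 4; the locality-3 bound applies to the non-cryptographic generators mentioned at the end of the abstract.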
Chosen-Ciphertext Security via Correlated Products
"... We initiate the study of onewayness under correlated products. We are interested in identifying necessary and sufficient conditions for a function f and a distribution on inputs (x1,..., xk), so that the function (f(x1),..., f(xk)) is oneway. The main motivation of this study is the construction o ..."
Abstract

Cited by 42 (4 self)
We initiate the study of one-wayness under correlated products. We are interested in identifying necessary and sufficient conditions for a function f and a distribution on inputs (x1, ..., xk), so that the function (f(x1), ..., f(xk)) is one-way. The main motivation of this study is the construction of public-key encryption schemes that are secure against chosen-ciphertext attacks (CCA). We show that any collection of injective trapdoor functions that is secure under very natural correlated products can be used to construct a CCA-secure public-key encryption scheme. The construction is simple, black-box, and admits a direct proof of security. We provide evidence that security under correlated products is achievable by demonstrating that any collection of lossy trapdoor functions, a powerful primitive introduced by Peikert and Waters (STOC ’08), yields a collection of injective trapdoor functions that is secure under the above-mentioned natural correlated products. Although we eventually base security under correlated products on lossy trapdoor functions, we argue that the former notion is potentially weaker as a general assumption. Specifically, there is no fully-black-box construction of lossy trapdoor functions from trapdoor functions that are secure under correlated products.
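The “natural correlated product” here is the uniform k-repetition distribution: k independently keyed functions are all evaluated on the same input. A minimal sketch, using modular exponentiation with distinct bases as a stand-in family (toy parameters, no actual one-wayness at this size):

```python
P = 1019  # small prime; illustrative only

def make_family(bases):
    """A toy family of candidate functions f_g(x) = g^x mod P, one per base g."""
    return [lambda x, g=g: pow(g, x, P) for g in bases]

def correlated_product(funcs, x):
    """Evaluate every function on the *same* input x. One-wayness of this
    k-wise product is the assumption from which the CCA-secure scheme is built."""
    return tuple(f(x) for f in funcs)
```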
On the compressibility of NP instances and cryptographic applications
In Electronic Colloquium on Computational Complexity (ECCC), 2006
"... We initiate the study of compression that preserves the solution to an instance of a problem rather than preserving the instance itself. Our focus is on the compressibility of NP decision problems. We consider NP problems that have long instances but relatively short witnesses. The question is, can ..."
Abstract

Cited by 41 (1 self)
We initiate the study of compression that preserves the solution to an instance of a problem rather than preserving the instance itself. Our focus is on the compressibility of NP decision problems. We consider NP problems that have long instances but relatively short witnesses. The question is: can one efficiently compress an instance and store a shorter representation that maintains the information of whether the original input is in the language or not? We want the length of the compressed instance to be polynomial in the length of the witness rather than the length of the original input. Such compression enables one to succinctly store instances until a future setting allows solving them, either via a technological or algorithmic breakthrough or simply until enough time has elapsed. We give a new classification of NP with respect to compression. This classification forms a stratification of NP that we call the VC hierarchy. The hierarchy is based on a new type of reduction called W-reduction, and there are compression-complete problems for each class. Our motivation for studying this issue stems from the vast cryptographic implications compressibility has. For example, we say that SAT is compressible if there exists a polynomial p(·, ·) so that, given a formula consisting of m clauses over n variables, it is possible to come up with an equivalent (w.r.t. satisfiability) formula of size at most p(n, log m). Then, given a compression algorithm for SAT, we provide a construction of collision-resistant hash functions from any one-way function. This task was shown to be impossible via black-box reductions [57], and indeed the construction presented is inherently non-black-box. Another application of SAT compressibility is a cryptanalytic result concerning the limitation of everlasting security in the bounded storage model when mixed with (time) complexity based cryptography. In addition, we study an approach to constructing an Oblivious Transfer protocol from any one-way function. This approach is based on compression for SAT that also has a property that we call witness retrievability. However, we manage to prove severe limitations on the ability to achieve witness-retrievable compression of SAT.
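For clause width bounded by a constant, the p(n, log m) bound is easy to meet: a 3-CNF over n variables has at most about (2n)³ distinct clauses, so simply deduplicating yields an equisatisfiable formula of size poly(n) regardless of m. This standard observation is only a sanity check; the interesting regime studied in the paper is unbounded clause width.

```python
def compress_3cnf(clauses):
    """Keep one copy of each clause (viewed as a set of literals). A width-<=3
    clause over n variables is one of at most ~(2n)^3 possibilities, so the
    output has poly(n) size however large the clause count m was -- and
    deduplication trivially preserves satisfiability."""
    return {frozenset(c) for c in clauses}
```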
Signature schemes with bounded leakage resilience
In ASIACRYPT, 2009
"... A leakageresilient cryptosystem remains secure even if arbitrary, but bounded, information about the secret key (or possibly other internal state information) is leaked to an adversary. Denote the length of the secret key by n. We show a signature scheme tolerating (optimal) leakage of up to n − nǫ ..."
Abstract

Cited by 40 (1 self)
A leakage-resilient cryptosystem remains secure even if arbitrary, but bounded, information about the secret key (or possibly other internal state information) is leaked to an adversary. Denote the length of the secret key by n. We show a signature scheme tolerating (optimal) leakage of up to n − n^ε bits of information about the secret key, and a more efficient one-time signature scheme that tolerates leakage of (1/4 − ε)·n bits of information about the signer’s entire state. The latter construction extends to give a leakage-resilient t-time signature scheme. All these constructions are in the standard model under general assumptions.
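Plugging concrete numbers into the stated bounds (plain restatements of the abstract, where the two ε parameters are independent of each other):

```python
def full_scheme_leakage(n, eps):
    """Bits of secret-key leakage tolerated by the main scheme: n - n^eps."""
    return n - n ** eps

def one_time_leakage(n, eps):
    """Bits of whole-state leakage tolerated by the one-time scheme: (1/4 - eps)*n."""
    return (0.25 - eps) * n
```

For instance, with a 256-bit key and ε = 1/2, the main scheme tolerates 256 − 16 = 240 bits of key leakage.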
Fully Leakage-Resilient Signatures
2010
"... A signature scheme is fully leakage resilient (Katz and Vaikuntanathan, ASIACRYPT ’09) if it is existentially unforgeable under an adaptive chosenmessage attack even in a setting where an adversary may obtain bounded (yet arbitrary) leakage information on all intermediate values that are used throu ..."
Abstract

Cited by 23 (3 self)
A signature scheme is fully leakage resilient (Katz and Vaikuntanathan, ASIACRYPT ’09) if it is existentially unforgeable under an adaptive chosen-message attack even in a setting where an adversary may obtain bounded (yet arbitrary) leakage information on all intermediate values that are used throughout the lifetime of the system. This is a strong and meaningful notion of security that captures a wide range of side-channel attacks. One of the main challenges in constructing fully leakage-resilient signature schemes is dealing with leakage that may depend on the random bits used by the signing algorithm, and constructions of such schemes are known only in the random-oracle model. Moreover, even in the random-oracle model, known schemes are only resilient to leakage of less than half the length of their signing key. In this paper we construct the first fully leakage-resilient signature schemes without random oracles. We present a scheme that is resilient to any leakage of length (1 − o(1))L bits, where L is the length of the signing key. Our approach relies on generic cryptographic primitives, and at the same time admits rather efficient instantiations based on specific number-theoretic …
Sufficient Conditions for Collision-Resistant Hashing
In Proceedings of the 2nd Theory of Cryptography Conference, 2005
"... Abstract. We present several new constructions of collisionresistant hashfunctions (CRHFs) from general assumptions. We start with a simple construction of CRHF from any homomorphic encryption. Then, we strengthen this result by presenting constructions of CRHF from two other primitives that are i ..."
Abstract

Cited by 19 (2 self)
We present several new constructions of collision-resistant hash functions (CRHFs) from general assumptions. We start with a simple construction of CRHFs from any homomorphic encryption. Then, we strengthen this result by presenting constructions of CRHFs from two other primitives that are implied by homomorphic encryption: one-round private information retrieval (PIR) protocols and homomorphic one-way commitments.
Keywords: collision-resistant hash functions, homomorphic encryption, private information retrieval.
1 Introduction
Collision-resistant hash functions (CRHFs) are an important cryptographic primitive. Their applications range from classic ones such as the “hash-and-sign” paradigm for signatures, via efficient (zero-knowledge) arguments [14, 17, 2], to more recent applications such as ones relying on the non-black-box techniques of [1]. In light of the importance of the CRHF primitive, it is natural to study its relations with other primitives and try to construct it from the most general …
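A classical concrete instance of the “CRHF from a homomorphic one-way commitment” route is a Pedersen-style hash H(a, b) = g^a · h^b mod p: the multiplicative homomorphism is exactly what the generic construction exploits, and a collision reveals log_g h, so finding collisions is as hard as that discrete-log instance. The parameters below are far too small to be secure; they only make the algebra visible.

```python
p = 8191        # the Mersenne prime 2^13 - 1; toy-sized, insecure
g, h = 3, 5     # illustrative bases whose mutual discrete log is treated as unknown

def pedersen_hash(a, b):
    """H(a, b) = g^a * h^b mod p -- compresses two exponents to one group element."""
    return pow(g, a, p) * pow(h, b, p) % p
```

The homomorphism H(a, b) · H(c, d) ≡ H(a + c, b + d) (mod p) holds by the exponent laws, which is what the test below checks.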
Limits of extractability assumptions with distributional auxiliary input
2013
"... Extractability, or “knowledge,” assumptions (such as the “knowledgeofexponent” assumption) have recently gained popularity in the cryptographic community—leading to the study of primitives such as extractable oneway functions, extractable hash functions, succinct noninteractive arguments of kno ..."
Abstract

Cited by 14 (4 self)
Extractability, or “knowledge,” assumptions (such as the “knowledge-of-exponent” assumption) have recently gained popularity in the cryptographic community, leading to the study of primitives such as extractable one-way functions, extractable hash functions, succinct non-interactive arguments of knowledge (SNARKs), and extractable obfuscation, and spurring the development of a wide spectrum of new applications relying on these primitives. For most of these applications, it is required that the extractability assumption holds even in the presence of attackers receiving some auxiliary information that is sampled from some fixed efficiently computable distribution Z. We show that, assuming the existence of collision-resistant hash functions, there exists a pair of efficient distributions Z, Z′ such that either
• extractable one-way functions w.r.t. Z do not exist, or
• extractable obfuscation for Turing machines w.r.t. Z′ does not exist.
A corollary of this result shows that, assuming the existence of fully homomorphic encryption with decryption in NC1, there exist efficient distributions Z, Z′ such that either
• extractable obfuscation for NC1 w.r.t. Z does not exist, or
• SNARKs for NP w.r.t. Z′ do not exist.
To achieve our results, we develop a “succinct punctured program” technique, mirroring the powerful “punctured program” technique of Sahai and Waters (ePrint ’13), and present several other applications of this new technique.
Amplifying Collision Resistance: A Complexity-Theoretic Treatment
Advances in Cryptology — Crypto 2007, Volume 4622 of Lecture Notes in Computer Science
"... Abstract. We initiate a complexitytheoretic treatment of hardness amplification for collisionresistant hash functions, namely the transformation of weakly collisionresistant hash functions into strongly collisionresistant ones in the standard model of computation. We measure the level of collisi ..."
Abstract

Cited by 12 (1 self)
We initiate a complexity-theoretic treatment of hardness amplification for collision-resistant hash functions, namely the transformation of weakly collision-resistant hash functions into strongly collision-resistant ones in the standard model of computation. We measure the level of collision resistance by the maximum probability, over the choice of the key, with which an efficient adversary can find a collision. The goal is to obtain constructions with short output, short keys, small loss in adversarial complexity tolerated, and a good trade-off between compression ratio and computational complexity. We provide an analysis of several simple constructions, and show that many of the parameters achieved by our constructions are almost optimal in some sense.
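One simple construction of the kind analyzed is concatenation of independently keyed copies: a collision for the combined hash must be a simultaneous collision for every component, which is the intuition behind driving a weak family's collision probability down. The sketch below uses keyed SHA-256 merely as a stand-in compressor; the paper's formal model, parameters, and optimality bounds are considerably more delicate.

```python
import hashlib

def concat_hash(keys, msg):
    """Concatenate t independently keyed hashes of the same message. Finding a
    collision here requires colliding all t components at once -- the basic
    mechanism behind amplifying weak collision resistance."""
    return b"".join(hashlib.sha256(k + msg).digest() for k in keys)
```

Note the cost of this combiner: output length and evaluation time both grow by a factor of t, which is exactly the short-output/short-key trade-off the abstract highlights.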