Results 1–10 of 18
Lossy Trapdoor Functions and Their Applications
 ELECTRONIC COLLOQUIUM ON COMPUTATIONAL COMPLEXITY, REPORT NO. 80 (2007)
, 2007
Abstract

Cited by 92 (19 self)
We propose a new general primitive called lossy trapdoor functions (lossy TDFs), and realize it under a variety of different number-theoretic assumptions, including hardness of the decisional Diffie-Hellman (DDH) problem and the worst-case hardness of standard lattice problems. Using lossy TDFs, we develop a new approach for constructing many important cryptographic primitives, including standard trapdoor functions, CCA-secure cryptosystems, collision-resistant hash functions, and more. All of our constructions are simple, efficient, and black-box. Taken all together, these results resolve some long-standing open problems in cryptography. They give the first known (injective) trapdoor functions based on problems not directly related to integer factorization, and provide the first known CCA-secure cryptosystem based solely on worst-case lattice assumptions.
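The defining property of a lossy TDF is that the function's public description can be sampled in two computationally indistinguishable modes: an injective mode (invertible with a trapdoor) and a lossy mode whose image is much smaller than its domain. The following toy sketch (not the paper's DDH construction; parameters and matrices are illustrative only) shows the information loss for f_M(x) = M·x mod q, where an invertible M gives an injective function and a rank-1 M collapses the domain:

```python
# Toy lossy-vs-injective illustration: f_M(x) = M @ x mod q over Z_q^2.
# In injective mode M is invertible (x recoverable via M^{-1}, the
# "trapdoor"); in lossy mode M has rank 1, so most of x is destroyed.

q = 97  # small prime modulus, for illustration only

def apply(M, x):
    """Compute M @ x mod q for a 2x2 matrix M and 2-vector x."""
    return tuple(sum(m * xi for m, xi in zip(row, x)) % q for row in M)

injective_M = [[2, 1], [1, 1]]   # det = 1 mod 97 -> invertible
lossy_M     = [[1, 2], [2, 4]]   # row 2 = 2 * row 1 -> rank 1

inputs = [(a, b) for a in range(q) for b in range(q)]
inj_images = {apply(injective_M, x) for x in inputs}
lossy_images = {apply(lossy_M, x) for x in inputs}

print(len(inputs), len(inj_images), len(lossy_images))
# Injective mode preserves all q^2 inputs; lossy mode has only q images,
# so roughly half the input entropy is lost.
```

In the real construction the two modes are indistinguishable under DDH (or lattice assumptions), which is what the security proofs exploit.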
On ideal lattices and learning with errors over rings
 In Proc. of EUROCRYPT, volume 6110 of LNCS
, 2010
Abstract

Cited by 46 (8 self)
The “learning with errors” (LWE) problem is to distinguish random linear equations, which have been perturbed by a small amount of noise, from truly uniform ones. The problem has been shown to be as hard as worst-case lattice problems, and in recent years it has served as the foundation for a plethora of cryptographic applications. Unfortunately, these applications are rather inefficient due to an inherent quadratic overhead in the use of LWE. A main open question was whether LWE and its applications could be made truly efficient by exploiting extra algebraic structure, as was done for lattice-based hash functions (and related primitives). We resolve this question in the affirmative by introducing an algebraic variant of LWE called ring-LWE, and proving that it too enjoys very strong hardness guarantees. Specifically, we show that the ring-LWE distribution is pseudorandom, assuming that worst-case problems on ideal lattices are hard for polynomial-time quantum algorithms. Applications include the first truly practical lattice-based public-key cryptosystem with an efficient security reduction; moreover, many of the other applications of LWE can be made much more efficient through the use of ring-LWE.
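The LWE distinguishing game is concrete enough to sketch. The snippet below (toy parameters, far too small for any security; all names are illustrative) generates samples (a, ⟨a,s⟩ + e mod q) next to uniform pairs, and shows that a distinguisher *who knows the secret s* separates them easily; hardness is the claim that without s, at cryptographic sizes, no efficient distinguisher exists:

```python
# LWE samples vs. uniform samples, with a secret-aware distinguisher.
import random

q, n, m = 257, 8, 64          # toy modulus, dimension, sample count
random.seed(0)
s = [random.randrange(q) for _ in range(n)]   # the secret vector

def lwe_sample():
    a = [random.randrange(q) for _ in range(n)]
    e = random.choice([-2, -1, 0, 1, 2])      # small noise term
    b = (sum(ai * si for ai, si in zip(a, s)) + e) % q
    return a, b

def uniform_sample():
    return [random.randrange(q) for _ in range(n)], random.randrange(q)

def looks_like_lwe(a, b):
    """Distinguisher that knows s: is b - <a,s> small modulo q?"""
    r = (b - sum(ai * si for ai, si in zip(a, s))) % q
    return min(r, q - r) <= 2

lwe_hits = sum(looks_like_lwe(*lwe_sample()) for _ in range(m))
uni_hits = sum(looks_like_lwe(*uniform_sample()) for _ in range(m))
print(lwe_hits, uni_hits)   # all LWE samples pass; few uniform ones do
```

The "quadratic overhead" the abstract mentions is visible here: each sample costs an n-dimensional inner product, so n samples cost Θ(n²) work, which ring-LWE reduces by giving n pseudorandom values per ring multiplication.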
Privacy preserving data mining
, 2007
Abstract

Cited by 31 (0 self)
Privacy preserving data mining (PPDM) refers to the area of data mining that seeks to safeguard sensitive information from unsolicited or unsanctioned disclosure. Most traditional data mining techniques analyze and model the dataset statistically, in aggregation, while privacy preservation is primarily concerned with protecting against
Turing Machines, Transition Systems, and Interaction
 Information and Computation
, 2004
Abstract

Cited by 28 (4 self)
We present Persistent Turing Machines (PTMs), a new way of interpreting Turing-machine computation, one that is both interactive and persistent. A PTM repeatedly receives an input token from the environment, computes for a while, and then outputs the result. Moreover, it can "remember" its previous state (work-tape contents) upon commencing a new computation. We show that the class of PTMs is isomorphic to a very general class of effective transition systems, thereby allowing one to view PTMs as transition systems "in disguise." The persistent stream language (PSL) of a PTM is a coinductively defined set of interaction streams: infinite sequences of pairs of the form (w_i, w_o), recording, for each interaction with the environment, the input token received by the PTM and the corresponding output token. We define an infinite hierarchy of successively finer equivalences for PTMs over finite interaction-stream prefixes and show that the limit of this hierarchy does not coincide with PSL-equivalence. The presence of this "gap" can be attributed to the fact that the transition systems corresponding to PTM computations naturally exhibit unbounded nondeterminism. We also consider amnesic PTMs, where each new computation begins with a blank work tape, and a corresponding notion of equivalence based on amnesic stream languages (ASLs). We show that the class of ASLs is strictly contained in the class of PSLs. Amnesic stream languages are representative of the classical view of Turing-machine computation. One may consequently conclude that, in a stream-based setting, the extension of the Turing-machine model with persistence is a nontrivial one, and provides a formal foundation for reasoning about programming concepts such as objects with static fields. We additional...
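The interaction pattern the abstract describes (receive a token, compute, output, keep the work tape) can be sketched very simply. This is an illustrative toy, not the paper's formal model; the `PTM` class and `counter_step` machine are invented for the sketch:

```python
# Minimal persistent-vs-amnesic machine sketch in the stream view:
# each interaction maps (input token, work tape) -> (output token, tape).

class PTM:
    def __init__(self, step):
        self.step = step        # step: (token, worktape) -> (output, worktape)
        self.worktape = ()      # persists across interactions

    def interact(self, token):
        out, self.worktape = self.step(token, self.worktape)
        return out

# Example machine: outputs how many tokens it has seen so far.
def counter_step(token, tape):
    count = (tape[0] if tape else 0) + 1
    return count, (count,)

persistent = PTM(counter_step)
outs_persistent = [persistent.interact(t) for t in "abc"]

amnesic = PTM(counter_step)
outs_amnesic = []
for t in "abc":
    amnesic.worktape = ()       # amnesic variant: blank tape every time
    outs_amnesic.append(amnesic.interact(t))

print(outs_persistent, outs_amnesic)   # [1, 2, 3] vs [1, 1, 1]
```

The persistent machine's output stream depends on its whole interaction history, while the amnesic one cannot even count its inputs, which is the intuition behind ASLs being strictly contained in PSLs.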
Towards robustness in query auditing
 In VLDB
, 2006
Abstract

Cited by 23 (4 self)
We consider the online query auditing problem for statistical databases. Given a stream of aggregate queries posed over sensitive data, when should queries be denied in order to protect the privacy of individuals? We construct efficient auditors for max queries and bags of max and min queries in both the partial and full disclosure settings. Our algorithm for the partial disclosure setting involves a novel application of probabilistic inference techniques that may be of independent interest. We also study, for the first time, a particular dimension of the utility of an auditing scheme and obtain initial results for the utility of sum auditing when guarding against full disclosure. The result is positive for large databases, indicating that answers to queries will not be riddled with denials.
Interactive Hashing and reductions between Oblivious Transfer variants
Abstract

Cited by 6 (2 self)
Interactive Hashing has featured as an essential ingredient in protocols realizing a large variety of cryptographic tasks. We present a study of this important cryptographic tool in the information-theoretic context. We start by presenting a security definition which is independent of any particular setting or application. We then show that a standard implementation of Interactive Hashing satisfies all the conditions of our definition. Our proof of security improves upon previous ones in several ways. Despite its generality, it is considerably simpler. Moreover, it establishes a tighter upper bound on the cheating probability of a dishonest sender. Specifically, we prove that if the fraction of good strings for a dishonest sender is f, then the probability that both outputs will be good is no larger than 15.6805·f. This upper bound is valid for any f and is tight up to a small constant, since a sender acting honestly would get two good outputs with probability very close to f. We illustrate the potential of Interactive Hashing as a cryptographic primitive by demonstrating efficient reductions of String Oblivious Transfer with string length k to Bit Oblivious Transfer and several weaker variants. Our reductions incorporate tests based on Interactive Hashing that allow the sender to verify the receiver's adherence to the protocol without compromising the latter's privacy. This allows a much more efficient use of the available entropy without any appreciable impact on security. As a result, for Bit OT and most of its variants n = (1 + ε)k executions suffice, improving efficiency by a factor of two or more compared to the most efficient reductions that do not use Interactive Hashing. As it is theoretically impossible to achieve an expansion factor n/k smaller than 1, our reductions are in fact asymptotically optimal. They are also more general since they place no restrictions on the types of 2-universal hash families used for Privacy Amplification.
Lastly, we present a direct reduction of String OT to Rabin OT which uses similar methods to achieve an expansion factor of 2 + ε, which is again asymptotically optimal.
On the Provable Security of an Efficient RSA-Based Pseudorandom Generator
, 2006
Abstract

Cited by 5 (0 self)
Pseudorandom Generators (PRGs) based on the RSA inversion (one-wayness) problem have been extensively studied in the literature over the last 25 years. These generators have the attractive feature of provable pseudorandomness security assuming the hardness of the RSA inversion problem. However, despite extensive study, the most efficient provably secure RSA-based generators output asymptotically only at most O(log n) bits per multiply modulo an RSA modulus of bit-length n, and hence are too slow to be used in many practical applications. To bring theory closer to practice, we present a simple modification to the proof of security by Fischlin and Schnorr of an RSA-based PRG, which shows that one can obtain an RSA-based PRG which outputs Ω(n) bits per multiply and has provable pseudorandomness security assuming the hardness of a well-studied variant of the RSA inversion problem, where a constant fraction of the plaintext bits are given. Our result gives a positive answer to an open question posed by Gennaro (J. of Cryptology, 2005) regarding finding a PRG beating the rate O(log n) bits per multiply at the cost of a reasonable assumption on RSA inversion.
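The mechanics of this family of generators are simple to sketch: iterate the RSA map on a secret state and emit some low-order bits of each state. The toy below (tiny, insecure parameters; the modulus, exponent, and output rate are illustrative, not the Fischlin-Schnorr parameters) shows only the structure; the paper's contribution is *how many* bits per multiply can be emitted while keeping a security proof:

```python
# Toy iterated RSA-based PRG: state x <- x^e mod N, emit r low-order
# bits per iteration. One iteration costs a few modular multiplies
# (for small e), so the output rate is "bits per multiply".

p, q, e = 1009, 1013, 5          # toy primes; gcd(e, (p-1)(q-1)) = 1
N = p * q
r_bits = 4                        # bits emitted per iteration (toy choice)

def rsa_prg(seed, iterations):
    x, out = seed % N, []
    for _ in range(iterations):
        x = pow(x, e, N)                      # advance the secret state
        out.append(x & ((1 << r_bits) - 1))   # emit r low-order bits
    return out

chunks = rsa_prg(seed=123456, iterations=5)
print(chunks)    # five r_bits-bit output chunks
```

At secure sizes the security proof dictates r: the classical analyses only support r = O(log n) bits per multiply, and the abstract's result shows how a stronger (but well-studied) RSA assumption supports Ω(n).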
Authenticating Aggregate Range Queries over Multidimensional Dataset
Abstract

Cited by 1 (0 self)
We are interested in the integrity of the query results from an outsourced database service provider. Alice passes a set D of d-dimensional points, together with some authentication tag T, to an untrusted service provider Bob. Later, Alice issues some query over D to Bob, and Bob should produce a query result and a proof based on D and T. Alice wants to verify the integrity of the query result with the help of the proof, using only the private key. In this paper, we consider aggregate queries conditional on multidimensional range selection. In its basic form, a query asks for the total number of data points within a d-dimensional range. We are concerned about the number of communication bits required and the size of the tag T. We give a method that requires O(d^2) communication bits to authenticate an aggregate query conditional on d-dimensional range selection. Besides counting, summing and finding of the minimum can also be supported. Furthermore, our scheme can be extended slightly to authenticate d-dimensional usual (non-aggregate) range selection queries with O(d^2) bits of communication overhead, improving known results that require O(log^{d-1} N) communication overhead, where N is the number of data points in the dataset.
Efficient Cryptosystems From 2^k-th Power Residue Symbols
, 2013
Abstract

Cited by 1 (0 self)
Goldwasser and Micali (1984) highlighted the importance of randomizing the plaintext for public-key encryption and introduced the notion of semantic security. They also realized a cryptosystem meeting this security notion under the standard complexity assumption of deciding quadratic residuosity modulo a composite number. The Goldwasser-Micali cryptosystem is simple and elegant but is quite wasteful in bandwidth when encrypting large messages. A number of works followed to address this issue and proposed various modifications. This paper revisits the original Goldwasser-Micali cryptosystem using 2^k-th power residue symbols. The so-obtained cryptosystems appear as a very natural generalization for k ≥ 2 (the case k = 1 corresponds exactly to the Goldwasser-Micali cryptosystem). Advantageously, they are efficient in both bandwidth and speed; in particular, they allow for fast decryption. Further, the cryptosystems described in this paper inherit the useful features of the original cryptosystem (like its homomorphic property) and are shown to be secure under a similar complexity assumption. As a prominent application, this paper describes an efficient lossy trapdoor function based thereon.
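The k = 1 base case (the original Goldwasser-Micali scheme) is compact enough to sketch: a bit b is encrypted as c = y^b · x² mod N for a random x, where y is a quadratic non-residue modulo N with Jacobi symbol +1, and the holder of a prime factor p decides residuosity via Euler's criterion. The parameters below are toy-sized and insecure, for mechanics only:

```python
# Toy Goldwasser-Micali (the k = 1 case): encrypt bit b as y^b * x^2 mod N.
# Decryption uses the secret prime p: c is a quadratic residue mod p
# iff b = 0 (since y is a non-residue mod p).
import random

p, q = 499, 547                   # toy primes (insecure)
N = p * q

def is_qr_mod_p(c):
    """Euler's criterion with the secret factor p."""
    return pow(c, (p - 1) // 2, p) == 1

# Public y: a non-residue mod p AND mod q, hence Jacobi symbol +1 mod N.
y = next(v for v in range(2, N)
         if pow(v, (p - 1) // 2, p) == p - 1
         and pow(v, (q - 1) // 2, q) == q - 1)

def encrypt(b):
    x = random.randrange(1, N)
    while x % p == 0 or x % q == 0:     # x must be a unit mod N
        x = random.randrange(1, N)
    return (pow(y, b, N) * pow(x, 2, N)) % N

def decrypt(c):
    return 0 if is_qr_mod_p(c) else 1

bits = [0, 1, 1, 0, 1]
roundtrip = [decrypt(encrypt(b)) for b in bits]
print(roundtrip == bits)   # True
```

The bandwidth waste the abstract criticizes is visible here: each plaintext *bit* costs a full element of Z_N; multiplying two ciphertexts decrypts to the XOR of the plaintexts (the homomorphic property), and the 2^k-th power residue generalization packs k bits into each ciphertext.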
A NOTE ON A YAO’S THEOREM ABOUT PSEUDORANDOM GENERATORS
Abstract
Abstract. Yao's theorem gives an equivalence between the indistinguishability of a pseudorandom generator and the unpredictability of the next bit, from an asymptotic point of view. We present in this paper, with detailed proofs, some modified versions of Yao's theorem which can be of interest for the study of practical systems. We study the case of one pseudorandom generator, then the case of a family of pseudorandom generators having the same fixed length, and lastly an asymptotic version of the previous result. We compute in each case the cost of the reduction between the two algorithms.