Results 1–10 of 56
Privacy Preserving Keyword Searches on Remote Encrypted Data
, 2004
Abstract

Cited by 147 (0 self)
We consider the following problem: a user wants to store his files in an encrypted form on a remote file server S.
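A common way to make such remote storage searchable is to give the server deterministic per-keyword tokens. The sketch below is a toy illustration under my own assumptions (HMAC-based tokens, hypothetical names like `keyword_token` and `server_search`), not the paper's actual scheme:

```python
import hmac, hashlib

def keyword_token(key: bytes, word: str) -> bytes:
    """Deterministic search token for one keyword (toy sketch; a real
    scheme also hides the repetition pattern of stored keywords)."""
    return hmac.new(key, word.encode(), hashlib.sha256).digest()

def build_index(key: bytes, doc_id: str, words) -> set:
    # The server stores only opaque (doc_id, token) pairs.
    return {(doc_id, keyword_token(key, w)) for w in words}

def server_search(index: set, trapdoor: bytes) -> list:
    # The server matches tokens without learning the keyword itself.
    return sorted(doc for doc, tok in index if tok == trapdoor)

key = b"user-secret-key"
index = build_index(key, "doc1", ["remote", "storage"]) | \
        build_index(key, "doc2", ["remote", "backup"])
print(server_search(index, keyword_token(key, "remote")))  # -> ['doc1', 'doc2']
```

The user keeps `key` locally; the server only ever sees HMAC outputs, so a query reveals which documents match but not the queried word.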
The complexity of online memory checking
 In Proceedings of the 46th Annual IEEE Symposium on Foundations of Computer Science
, 2005
Abstract

Cited by 54 (3 self)
We consider the problem of storing a large file on a remote and unreliable server. To verify that the file has not been corrupted, a user could store a small private (randomized) “fingerprint” on his own computer. This is the setting for the well-studied authentication problem in cryptography, and the required fingerprint size is well understood. We study the problem of sublinear authentication: suppose the user would like to encode and store the file in a way that allows him to verify that it has not been corrupted, but without reading the entire file. If the user only wants to read q bits of the file, how large does the size s of the private fingerprint need to be? We define this problem formally, and show a tight lower bound on the relationship between s and q when the adversary is not computationally bounded, namely: s × q = Ω(n), where n is the file size. This is an easier case of the online memory checking problem, introduced by Blum et al. in 1991, and hence the same (tight) lower bound applies also to that problem. It was previously shown that when the adversary is computationally bounded, under the assumption that one-way functions exist, it is possible to construct much better online memory checkers. The same is also true for sublinear authentication schemes. We show that the existence of one-way functions is also a necessary condition: even slightly breaking the s × q = Ω(n) lower bound in a computational setting implies the existence of one-way functions.
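The simplest information-theoretic strategy in this setting is to privately remember a few random positions of the file and re-read only those. The toy below (my own sketch; it echoes the flavour of the s × q trade-off, not its exact statement) checks a file by probing q sampled bytes:

```python
import random

def make_fingerprint(data: bytes, q: int, rng: random.Random):
    """Privately remember q random positions and their bytes.  Fingerprint
    size s grows with q, informally echoing the s x q = Omega(n) trade-off
    against unbounded adversaries (this is an illustration, not the bound)."""
    positions = rng.sample(range(len(data)), q)
    return [(i, data[i]) for i in positions]

def verify(stored, fingerprint) -> bool:
    # Reads only q bytes of the (possibly corrupted) remote copy.
    return all(stored[i] == b for i, b in fingerprint)

rng = random.Random(0)
data = bytes(rng.randrange(256) for _ in range(1024))
fp = make_fingerprint(data, q=32, rng=rng)

tampered = bytearray(b ^ 1 for b in data)   # adversary corrupts every byte
```

Sparse corruption is caught only probabilistically here; the paper's point is that with unbounded adversaries no encoding can beat the s × q product, while one-way functions (e.g. Merkle-style hashing) break that barrier.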
A survey on private information retrieval
 Bulletin of the EATCS
, 2004
Abstract

Cited by 46 (1 self)
Alice wants to query a database but she does not want the database to learn what she is querying. She can ask for the entire database. Can she get her query answered with less communication? One model of this problem is Private Information Retrieval, henceforth PIR. We survey results obtained about the PIR model, including partial answers to the following questions. (1) What if there are k non-communicating copies of the database but they are computationally unbounded? (2) What if there is only one copy of the database and it is computationally bounded?
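The multi-copy setting of question (1) can be illustrated by the classic two-server XOR trick: each server sees a uniformly random subset of indices, so neither learns which bit Alice wants (this toy sends whole subsets, so its communication is linear; real schemes do far better):

```python
import secrets

def pir_two_server(db, i):
    """Toy 2-server information-theoretic PIR for one bit db[i].
    Each server's query is a uniformly random subset of indices,
    so neither server alone learns i (they must not communicate)."""
    n = len(db)
    s1 = {j for j in range(n) if secrets.randbits(1)}  # random subset
    s2 = s1 ^ {i}                                      # same subset, i flipped
    a1 = sum(db[j] for j in s1) % 2                    # server 1's XOR answer
    a2 = sum(db[j] for j in s2) % 2                    # server 2's XOR answer
    return (a1 + a2) % 2                               # everything but db[i] cancels

db = [1, 0, 1, 1, 0, 0, 1, 0]
assert all(pir_two_server(db, i) == db[i] for i in range(len(db)))
```

Since the two subsets differ only at position i, the XOR of the answers cancels every bit except db[i].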
Single Database Private Information Retrieval with Logarithmic Communication
, 2004
Abstract

Cited by 45 (0 self)
In this paper, we study the problem of single database private information retrieval, and present schemes with only logarithmic server-side communication complexity. Previously the best result could only achieve polylogarithmic communication, and was based on certain less well-studied assumptions in number theory [CMS99]. By contrast, our construction is based on Paillier's cryptosystem [P99], which along with its variants has drawn extensive study in recent cryptographic research [PP99, G00, CGGN01, DJ01, CGG02, CNS02, ST02, GMMV03, KT03], and has many important applications (e.g., the Cramer-Shoup CCA2 encryption scheme in the standard model [CS02]).
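The key ingredient is Paillier's additive homomorphism: multiplying ciphertexts adds plaintexts. Below is a toy Paillier with demo-sized primes (parameters chosen by me purely for illustration; this is not the paper's scheme and is wildly insecure at this size):

```python
import math, random

# Toy Paillier cryptosystem (tiny primes, illustration only -- not secure).
p, q = 293, 433
n = p * q; n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)
L = lambda u: (u - 1) // n                    # Paillier's L function
mu = pow(L(pow(g, lam, n2)), -1, n)           # decryption constant

def enc(m, rng=random.Random(1)):
    r = rng.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = rng.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def dec(c):
    return (L(pow(c, lam, n2)) * mu) % n

# Additive homomorphism: product of ciphertexts encrypts the sum.
c = (enc(37) * enc(5)) % n2
assert dec(c) == 42
```

Homomorphism is what lets a client send an encrypted selection vector and have the server fold the database into a single ciphertext of the chosen record; the paper's contribution is driving the resulting communication down to logarithmic.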
On the compressibility of NP instances and cryptographic applications
 In Electronic Colloquium on Computational Complexity (ECCC)
, 2006
Abstract

Cited by 41 (1 self)
We initiate the study of compression that preserves the solution to an instance of a problem rather than preserving the instance itself. Our focus is on the compressibility of NP decision problems. We consider NP problems that have long instances but relatively short witnesses. The question is: can one efficiently compress an instance and store a shorter representation that maintains the information of whether the original input is in the language or not? We want the length of the compressed instance to be polynomial in the length of the witness rather than the length of the original input. Such compression makes it possible to succinctly store instances until a future setting allows solving them, either via a technological or algorithmic breakthrough or simply until enough time has elapsed.

We give a new classification of NP with respect to compression. This classification forms a stratification of NP that we call the VC hierarchy. The hierarchy is based on a new type of reduction called W-reduction, and there are compression-complete problems for each class.

Our motivation for studying this issue stems from the vast cryptographic implications compressibility has. For example, we say that SAT is compressible if there exists a polynomial p(·, ·) so that, given a formula consisting of m clauses over n variables, it is possible to come up with an equivalent (w.r.t. satisfiability) formula of size at most p(n, log m). Given a compression algorithm for SAT, we provide a construction of collision-resistant hash functions from any one-way function. This task was shown to be impossible via black-box reductions [57], and indeed the construction presented is inherently non-black-box. Another application of SAT compressibility is a cryptanalytic result concerning the limitation of everlasting security in the bounded storage model when mixed with (time-)complexity-based cryptography. In addition, we study an approach to constructing an oblivious transfer protocol from any one-way function. This approach is based on compression for SAT that also has a property that we call witness retrievability. However, we manage to prove severe limitations on the ability to achieve witness-retrievable compression of SAT.
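For fixed-width CNF (k-SAT) there is a trivial baseline: deduplicating clauses bounds the instance by (2n)^k clauses regardless of the original clause count m. The toy below illustrates only the "size depends on n, not m" idea; it is far weaker than the paper's p(n, log m) notion for general SAT:

```python
from itertools import product

def compress_ksat(clauses):
    """Trivial 'compression' for fixed-width CNF: keep each clause (as a
    set of literals) once.  At most (2n)^k distinct width-k clauses exist,
    so the result's size is independent of the original clause count m."""
    return sorted({tuple(sorted(set(c))) for c in clauses})

def satisfiable(clauses, n):
    # Brute force: literal v > 0 means "variable v is true", -v means false.
    for assign in product([False, True], repeat=n):
        if all(any(assign[abs(l) - 1] == (l > 0) for l in c) for c in clauses):
            return True
    return False

# A huge instance built from many repeated clauses over 3 variables.
inst = [(1, -2), (-1, 3), (2, -3)] * 10_000
small = compress_ksat(inst)
assert len(small) == 3
assert satisfiable(small, 3) == satisfiable(inst, 3)
```

The interesting (and open, per the paper) question is whether such witness-length-bounded compression exists for unrestricted SAT, where clause width is unbounded and deduplication gives nothing.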
Secure outsourcing of DNA searching via finite automata
, 2010
Abstract

Cited by 26 (5 self)
This work treats the problem of error-resilient DNA searching via oblivious evaluation of finite automata, where a client has a DNA sequence and a service provider has a pattern that corresponds to a genetic test. Error-resilient searching is achieved by representing the pattern as a finite automaton and evaluating it on the DNA sequence (which is treated as the input), where the privacy of both the pattern and the DNA sequence must be preserved. Interactive solutions to this problem already exist, but can be a burden on the participating parties. Thus, in this work we propose techniques for secure outsourcing of oblivious evaluation of finite automata to computational servers, such that the servers do not learn any information. Our techniques are applicable to any type of finite automaton, but the optimizations are tailored to the setting of DNA searching.
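In the clear (with no privacy), the automaton being evaluated looks like the sketch below: states track how much of the pattern has matched and how many mismatches were spent. This is my own plain, non-private illustration of the underlying computation; the paper's contribution is evaluating such an automaton obliviously on outsourced servers:

```python
def approx_match(text: str, pattern: str, max_err: int = 1) -> bool:
    """Substitution-tolerant pattern search via automaton simulation.
    States are (characters matched, mismatches so far); a match may
    begin at any text position."""
    m = len(pattern)
    states = {(0, 0)}
    for ch in text:
        nxt = {(0, 0)}                     # allow a fresh match to start here
        for pos, err in states:
            if pos == m:
                nxt.add((pos, err))        # already accepted; stay accepted
                continue
            e = err + (ch != pattern[pos])
            if e <= max_err:
                nxt.add((pos + 1, e))
        states = nxt
    return any(pos == m for pos, _ in states)

assert approx_match("ACGTTAGC", "GTTG")      # "GTTA" matches with 1 mismatch
assert not approx_match("ACGTTAGC", "GGGG")  # no window is close enough
```

Handling insertions/deletions as well would just add more transitions per state; the oblivious protocols are agnostic to which transitions the automaton encodes.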
Semi-honest to malicious oblivious transfer – the black-box way
 In TCC
, 2008
Abstract

Cited by 26 (1 self)
Until recently, all known constructions of oblivious transfer protocols based on general hardness assumptions had the following form. First, the hardness assumption is used in a black-box manner (i.e., the construction uses only the input/output behavior of the primitive guaranteed by the assumption) to construct a semi-honest oblivious transfer, a protocol whose security is guaranteed to hold only against adversaries that follow the prescribed protocol. Then, the latter protocol is “compiled” into a (malicious) oblivious transfer using non-black-box techniques (a Karp reduction is carried out in order to prove an NP statement in zero-knowledge). In their recent breakthrough result, Ishai, Kushilevitz, Lindell and Petrank (STOC ’06) deviated from the above paradigm, presenting a black-box reduction from oblivious transfer to enhanced trapdoor permutations and to homomorphic encryption. Here we generalize their result, presenting a black-box reduction from oblivious transfer to semi-honest oblivious transfer. Consequently, oblivious transfer can be black-box reduced to each of the hardness assumptions known to imply a semi-honest oblivious transfer in a black-box manner. This list currently includes, besides the hardness assumptions used by Ishai et al., the existence of families of dense trapdoor permutations and of non-trivial single-server private information retrieval.
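To make "semi-honest oblivious transfer" concrete, here is a toy 1-out-of-2 OT in the Bellare-Micali style (my choice of construction, not the paper's compiler; demo-sized parameters, secure at best against parties who follow the protocol):

```python
import hashlib, secrets

# Toy semi-honest 1-out-of-2 OT in Z_p^* (illustration only).
p = 2**127 - 1                       # a Mersenne prime, far too small for real use
g = 3
C = pow(g, 0xDEADBEEF, p)            # public element; in a real setup nobody
                                     # may know its discrete log

def H(x: int) -> int:                # hash a group element to a 64-bit mask
    return int.from_bytes(hashlib.sha256(str(x).encode()).digest()[:8], "big")

def receiver_keys(b: int):
    """Receiver with choice bit b knows the secret key for pk[b] only:
    the two public keys are forced to multiply to C."""
    k = secrets.randbelow(p - 1)
    pk = [0, 0]
    pk[b] = pow(g, k, p)
    pk[1 - b] = (C * pow(pk[b], -1, p)) % p
    return pk, k

def sender_transfer(pk, m0: int, m1: int):
    """Sender masks each message under the corresponding public key."""
    ct = []
    for key, m in zip(pk, (m0, m1)):
        r = secrets.randbelow(p - 1)
        ct.append((pow(g, r, p), m ^ H(pow(key, r, p))))
    return ct

def receiver_recover(ct, b: int, k: int) -> int:
    gr, masked = ct[b]
    return masked ^ H(pow(gr, k, p))

pk, k = receiver_keys(b=1)
ct = sender_transfer(pk, 111, 222)
assert receiver_recover(ct, 1, k) == 222
```

A malicious receiver could pick both keys freely and learn both messages, which is exactly the gap the semi-honest-to-malicious compilation (the paper's subject) has to close.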
Batch Codes and Their Applications
, 2004
Abstract

Cited by 26 (6 self)
A batch code encodes a string x into an m-tuple of strings, called buckets, such that each batch of k bits from x can be decoded by reading at most one bit (more generally, t bits) from each bucket. Batch codes can be viewed as relaxing several combinatorial objects, including expanders and locally decodable codes.
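The smallest non-trivial example (a sketch of the standard subcube construction for k = 2, assuming an even-length x and distinct indices i < j) splits x into halves and stores their XOR as a third bucket:

```python
def encode_batch(x):
    """Encode x into 3 buckets: left half L, right half R, and L xor R.
    Any two bits of x can then be read with at most one probe per bucket."""
    h = len(x) // 2
    L, R = x[:h], x[h:]
    X = [a ^ b for a, b in zip(L, R)]
    return [L, R, X]

def read_two(buckets, i, j):
    """Recover bits x[i] and x[j] (i < j), probing each bucket at most once."""
    L, R, X = buckets
    h = len(L)
    if i < h and j < h:
        return L[i], R[j] ^ X[j]           # x[j] = L[j], rebuilt from R and X
    if i >= h and j >= h:
        return L[i - h] ^ X[i - h], R[j - h]  # x[i] = R[i-h], rebuilt from L and X
    return L[i], R[j - h]                  # one bit per half: direct reads

x = [1, 0, 1, 1, 0, 1]
b = encode_batch(x)
assert read_two(b, 0, 2) == (x[0], x[2])   # both in the left half
assert read_two(b, 3, 5) == (x[3], x[5])   # both in the right half
assert read_two(b, 1, 4) == (x[1], x[4])   # one in each half
```

The total storage is 1.5n bits instead of the naive 2n of full replication, which is the load-balancing trade-off batch codes formalize.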
Lossy encryption: Constructions from general assumptions and efficient selective opening chosen ciphertext security
, 2009
Abstract

Cited by 24 (1 self)
In this paper, we present new and general constructions of lossy encryption schemes. By applying results from Eurocrypt ’09, we obtain new general constructions of cryptosystems secure against selective opening adversaries (SOA). Although it was recognized almost twenty years ago that SOA security was important, it was not until the recent breakthrough works of Hofheinz and of Bellare, Hofheinz and Yilek that any progress was made on this fundamental problem. The selective opening problem is as follows: suppose an adversary receives n commitments (or encryptions) of (possibly) correlated messages, and the adversary can then choose n/2 of the messages and receive decommitments (or decryptions and the randomness used to encrypt them). Do the unopened commitments (encryptions) remain secure? A protocol achieving this type of security is called secure against a selective opening adversary (SOA). This question arises naturally in the context of Byzantine agreement and secure multiparty computation, where an active adversary is able to eavesdrop on all the wires and then choose a subset of players to corrupt. Unfortunately, the traditional definitions of security (IND-CPA, IND-CCA) do not guarantee security in this setting. In this paper: • We formally define re-randomizable encryption and show that every re-randomizable encryption
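Re-randomizable encryption, the notion the abstract introduces, is easy to see in ElGamal: anyone holding only the public key can turn a ciphertext into a fresh-looking one for the same plaintext. A toy sketch with demo-sized parameters of my choosing (not the paper's construction):

```python
import secrets

# Toy re-randomizable ElGamal in Z_p^* (demo-sized, illustration only).
p = 2**127 - 1
g = 3
sk = secrets.randbelow(p - 1)
pk = pow(g, sk, p)

def enc(m: int) -> tuple:
    r = secrets.randbelow(p - 1)
    return pow(g, r, p), (m * pow(pk, r, p)) % p

def rerandomize(ct: tuple) -> tuple:
    """Fresh ciphertext of the same plaintext, computed WITHOUT the
    secret key -- the public-key-only operation lossy encryption exploits."""
    a, b = ct
    s = secrets.randbelow(p - 1)
    return (a * pow(g, s, p)) % p, (b * pow(pk, s, p)) % p

def dec(ct: tuple) -> int:
    a, b = ct
    return (b * pow(a, -sk, p)) % p   # negative exponent = modular inverse

ct = enc(42)
ct2 = rerandomize(ct)
assert dec(ct) == 42 and dec(ct2) == 42
```

Because the re-randomized ciphertext is distributed like a fresh encryption, a simulator can "open" it to plausible randomness later, which is the bridge to selective opening security.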
On Generating the Initial Key in the Bounded-Storage Model
 In Advances in Cryptology — EUROCRYPT 2004
, 2004
Abstract

Cited by 23 (4 self)
In the bounded-storage model (BSM) for information-theoretically secure encryption and key agreement one uses a random string R whose length t is greater than the assumed bound s on the adversary Eve’s storage capacity. The legitimate parties Alice and Bob share a short initial secret key K which they use to select and combine certain bits of R to obtain a derived key X which is much longer than K. Eve can be proved to obtain essentially no information about X even if she has infinite computing power and even if she learns K after having performed the storage operation and lost access to R. This paper addresses the problem of generating the initial key K and makes two contributions. First, we prove that without such a key, secret-key agreement in the BSM is impossible unless Alice and Bob themselves have very high storage capacity, thus proving the optimality of a scheme proposed by Cachin and Maurer. Second, we investigate the hybrid model where K is generated by a computationally secure key
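The "K selects and combines bits of R" step can be sketched as follows. This is a toy of my own (SHA-256 in place of a carefully analysed extractor, arbitrary parameters) showing only the interface: both parties, sharing K, derive the same long key X from the broadcast string R:

```python
import hashlib, random

def derive_key(R: bytes, K: int, out_bytes: int = 16) -> bytes:
    """Toy BSM-style derivation: the short initial key K selects which bits
    of the huge random string R feed the derived key X.  (Real BSM schemes
    use provably secure extractors with precise parameters, not SHA-256.)"""
    rng = random.Random(K)                        # K drives the bit selection
    idx = rng.sample(range(len(R) * 8), out_bytes * 32)
    picked = bytes((R[i // 8] >> (i % 8)) & 1 for i in idx)
    return hashlib.sha256(picked).digest()[:out_bytes]

R = random.Random(7).randbytes(1 << 16)           # stand-in for the broadcast R
x_alice = derive_key(R, K=1234)
x_bob   = derive_key(R, K=1234)                   # same K -> same selection
assert x_alice == x_bob
```

An adversary who stored only s < t bits of R is missing most of the selected positions, which is the intuition behind the model; the paper asks where the shared K itself can come from.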