Results 1–10 of 15
Efficient Arguments without Short PCPs
Abstract

Cited by 20 (2 self)
Current constructions of efficient argument systems combine a short (polynomial-size) PCP with a cryptographic hashing technique. We suggest an alternative approach that simplifies the underlying PCP machinery by using a stronger cryptographic technique. More concretely, we present a direct method for compiling an exponentially long PCP that is succinctly described by a linear oracle function π: Fⁿ → F into an argument system in which the verifier sends the prover O(n) encrypted field elements and receives O(1) encryptions in return. This compiler can be based on an arbitrary homomorphic encryption scheme. Applying our general compiler to the exponential-size Hadamard-code-based PCP of Arora et al. (JACM 1998) yields a simple argument system for NP in which the communication from the prover to the verifier consists of only a constant number of short encryptions. The main tool we use is a new cryptographic primitive that allows one to efficiently commit to a linear function and later open the output of the function on an arbitrary vector. Our efficient implementation of this primitive is independently motivated by cryptographic applications.
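The compiler's core step can be sketched with any additively homomorphic encryption scheme. The toy Paillier instance below uses tiny primes purely for illustration; the function names, parameters, and vectors are hypothetical, not the paper's construction. It shows how a prover holding a private linear function can return a single encryption of its output, given only encryptions of the verifier's vector:

```python
import math
import random

def paillier_keygen(p=101, q=113):
    # Toy parameters; real deployments use primes of ~1024 bits or more.
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    g = n + 1
    # mu = (L(g^lam mod n^2))^-1 mod n, where L(u) = (u - 1) / n
    mu = pow((pow(g, lam, n * n) - 1) // n, -1, n)
    return (n, g), (lam, mu)

def enc(pk, m):
    n, g = pk
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def dec(pk, sk, c):
    n, _ = pk
    lam, mu = sk
    return ((pow(c, lam, n * n) - 1) // n) * mu % n

# The verifier encrypts its query vector x; the prover, seeing only
# ciphertexts, evaluates its private linear function a homomorphically.
pk, sk = paillier_keygen()
x = [3, 1, 4, 1, 5]   # verifier's vector (encrypted below)
a = [2, 7, 1, 8, 2]   # prover's private linear function
cts = [enc(pk, xi) for xi in x]
c_out = 1
for ai, ci in zip(a, cts):
    c_out = c_out * pow(ci, ai, pk[0] ** 2) % (pk[0] ** 2)
# Decrypting the single returned ciphertext yields the inner product <a, x>.
assert dec(pk, sk, c_out) == sum(ai * xi for ai, xi in zip(a, x))
```

The prover sends back one ciphertext regardless of n, which is the communication pattern the abstract describes.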
Computational bounds on hierarchical data processing with applications to information security
In Proc. Int. Colloquium on Automata, Languages and Programming (ICALP), volume 3580 of LNCS, 2005
Abstract

Cited by 18 (11 self)
Motivated by the study of algorithmic problems in the domain of information security, we study the complexity of a new class of computations over a collection of values associated with a set of n elements. We introduce hierarchical data processing (HDP) problems, which involve the computation of a collection of output values from an input set of n elements, where the entire computation is fully described by a directed acyclic graph (DAG). That is, individual computations are performed and intermediate values are processed according to the hierarchy induced by the DAG. We present an Ω(log n) lower bound on various computational cost measures for HDP problems. Essential in our study is an analogy that we draw between the complexity of any HDP problem of size n and searching by comparison in an ordered set of n elements, which shows an interesting connection between the two problems. In view of the logarithmic lower bounds, we also develop a new randomized DAG scheme for HDP problems that provides close-to-optimal performance, with constant factors in the (logarithmic) leading asymptotic term that are close to optimal. Our lower bounds are general and apply to all HDP problems; together with our new DAG construction, they provide a theoretical framework that is interesting in its own right and useful for algorithm analysis. We apply our results to two information-security problems, data authentication through cryptographic hashing and multicast key distribution using key graphs, and obtain a unified analysis and treatment of these problems. We show that both problems involve HDP and prove logarithmic lower bounds on their computational and communication costs. In particular, using our new DAG scheme, we present a new efficient authenticated dictionary with improved authentication overhead over previously known schemes.
Moreover, through the relation between HDP and searching by comparison, we present a new skip-list version where the expected number of comparisons in a search is 1.25 log₂ n + O(1).
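The authenticated dictionary mentioned above is an HDP instance over a hash DAG. A minimal Merkle-tree sketch (plain SHA-256, power-of-two leaf count assumed; all names are illustrative, not the paper's scheme) shows the O(log n) authentication overhead in action:

```python
import hashlib

def h(*parts):
    return hashlib.sha256(b"|".join(parts)).hexdigest().encode()

def build(leaves):
    # Bottom-up hash tree; levels[0] holds the leaf hashes.
    level = [h(x) for x in leaves]
    levels = [level]
    while len(level) > 1:
        level = [h(level[i], level[i + 1]) for i in range(0, len(level), 2)]
        levels.append(level)
    return levels

def prove(levels, i):
    # Collect the sibling hash at each level: O(log n) values total.
    path = []
    for level in levels[:-1]:
        sib = i ^ 1
        path.append((level[sib], sib < i))
        i //= 2
    return path

def verify(root, leaf, path):
    cur = h(leaf)
    for sib, sib_is_left in path:
        cur = h(sib, cur) if sib_is_left else h(cur, sib)
    return cur == root

leaves = [b"a", b"b", b"c", b"d", b"e", b"f", b"g", b"h"]
levels = build(leaves)
root = levels[-1][0]
assert verify(root, b"c", prove(levels, 2))       # genuine element verifies
assert not verify(root, b"z", prove(levels, 2))   # altered value is detected
```

The proof size and verification time are both logarithmic in n, matching the lower-bound regime the abstract establishes for HDP problems.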
Constant-Size Commitments to Polynomials and Their Applications
In Proceedings of ASIACRYPT 2010
Abstract

Cited by 17 (6 self)
We introduce and formally define polynomial commitment schemes, and provide two efficient constructions. A polynomial commitment scheme allows a committer to commit to a polynomial with a short string that can be used by a verifier to confirm claimed evaluations of the committed polynomial. Although the homomorphic commitment schemes in the literature can be used to achieve this goal, the sizes of their commitments are linear in the degree of the committed polynomial. In contrast, polynomial commitments in our schemes are of constant size (single elements). The overhead of opening a commitment is also constant; even opening multiple evaluations requires only a constant amount of communication overhead. Therefore, our schemes are useful tools for reducing the communication cost in cryptographic protocols. On that front, we apply our polynomial commitment schemes to four problems in cryptography: verifiable secret sharing, zero-knowledge sets, credentials, and content extraction signatures.
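For contrast with the constant-size schemes claimed above, the linear-size baseline the abstract mentions can be sketched as a coefficient-wise discrete-log commitment (toy parameters, binding only, no hiding; a real scheme would use Pedersen commitments, and the paper's constructions rely on pairings):

```python
# Toy commitment: one group element g^{a_i} per coefficient, so the
# commitment grows linearly with the degree of f.
p = 2_147_483_647          # the Mersenne prime 2^31 - 1; toy group Z_p^*
g = 7
f = [5, 0, 3, 2]           # f(X) = 5 + 3X^2 + 2X^3
commitment = [pow(g, a, p) for a in f]   # O(deg f) elements

def check_eval(commitment, x, y):
    # Homomorphically recompute g^{f(x)} = prod_i c_i^{x^i}; accept iff
    # it matches g^y for the claimed evaluation y.
    acc, xi = 1, 1
    for c in commitment:
        acc = acc * pow(c, xi, p) % p
        xi *= x
    return acc == pow(g, y, p)

y = 5 + 3 * 4**2 + 2 * 4**3        # f(4) = 181
assert check_eval(commitment, 4, y)
assert not check_eval(commitment, 4, y + 1)   # a wrong claim is rejected
```

The point of the paper is to shrink that list of per-coefficient elements to a single element while keeping openings constant size.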
Protecting data privacy through hard-to-reverse negative databases
In Proceedings of the 9th Information Security Conference (ISC'06), Springer LNCS, 2006
Abstract

Cited by 16 (6 self)
The paper extends the idea of negative representations of information for enhancing privacy. Simply put, a set DB of data elements can be represented in terms of its complement set: all the elements not in DB are depicted, and DB itself is not explicitly stored. We review the negative database (NDB) representation scheme for storing a negative image compactly and propose a design for depicting a multiple-record DB using a collection of NDBs, in contrast to the single-NDB approach of previous work. Finally, we present a method for creating negative databases that are hard to reverse in practice, i.e., from which it is hard to obtain DB, by adapting a technique for generating 3-SAT formulas.
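The core representational idea can be sketched over a tiny universe. This is a naive one-pattern-per-string NDB; real constructions compress the complement with '*' wildcards and add the hard-to-reverse structure, both of which this toy omits:

```python
from itertools import product

# Universe: all 4-bit strings. The NDB stores a cover of U \ DB; DB itself
# is never written down.
DB = {"0101", "1100"}
U = {"".join(bits) for bits in product("01", repeat=4)}

def covers(pattern, s):
    # '*' matches either bit; fixed positions must agree.
    return all(p in ("*", c) for p, c in zip(pattern, s))

# Naive NDB: one fully specified entry per complement string. Real NDBs
# use far fewer wildcard patterns covering the same complement.
NDB = sorted(U - DB)

def in_db(s):
    # s belongs to DB exactly when no NDB entry covers it.
    return not any(covers(e, s) for e in NDB)

assert in_db("0101") and in_db("1100")
assert not in_db("1111")
```

Membership queries still work, yet only the negative image is stored; the paper's contribution is making that image compact and hard to invert.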
Indexing Information for Data Forensics
, 2005
Abstract

Cited by 15 (5 self)
We introduce novel techniques for organizing the indexing structures of stored data so that alterations from an original version can be detected and the changed values specifically identified. We give forensic constructions for several fundamental data structures, including arrays, linked lists, binary search trees, skip lists, and hash tables. Some of our constructions are based on a new reduced-randomness construction for non-adaptive combinatorial group testing.
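A minimal sketch of locating a single altered value with non-adaptive group testing, using a bit-index test design over hashed groups (the design and names are illustrative, not the paper's reduced-randomness construction):

```python
import hashlib

def digest(values):
    return hashlib.sha256(repr(values).encode()).hexdigest()

def make_tests(n):
    # Non-adaptive design for one defective: test t collects the indices
    # whose binary expansion has bit t set, plus one test over everything.
    bits = max(1, (n - 1).bit_length())
    tests = [[i for i in range(n) if i >> t & 1] for t in range(bits)]
    tests.append(list(range(n)))   # catches a change at index 0 as well
    return tests

def fingerprint(data, tests):
    return [digest([data[i] for i in t]) for t in tests]

def locate_change(data, tests, prints):
    # The set of failing tests spells out the altered index in binary.
    fails = [digest([data[i] for i in t]) != p for t, p in zip(tests, prints)]
    if not fails[-1]:
        return None   # the overall test passes: nothing changed
    return sum(1 << t for t, bad in enumerate(fails[:-1]) if bad)

data = [10, 20, 30, 40, 50, 60, 70, 80]
tests = make_tests(len(data))
prints = fingerprint(data, tests)
data[5] = 99                       # tamper with one cell
assert locate_change(data, tests, prints) == 5
```

Only O(log n) digests are stored, yet any single altered cell is pinpointed exactly, which is the forensic guarantee the abstract describes for its constructions.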
Super-efficient verification of dynamic outsourced databases
In RSA Conference (Crypto Track), 2008
Abstract

Cited by 13 (5 self)
We develop new algorithmic and cryptographic techniques for authenticating the results of queries over databases that are outsourced to an untrusted responder. We depart from previous approaches by considering super-efficient answer verification, where answers to queries are validated in time asymptotically less than the time spent to produce them, using lightweight cryptographic operations. We achieve this property by decoupling query answering from answer verification in a way designed for queries related to range search. Our techniques allow for efficient updates of the database and protect against replay attacks performed by the responder. One such technique uses an offline audit mechanism: the data source and the user keep digests of the sequence of operations, yet are able to jointly audit the responder to determine whether a replay attack has occurred since the last audit.
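The offline audit mechanism can be sketched as a hash chain over the operation sequence (illustrative only; the paper's digests and audit protocol are more involved):

```python
import hashlib

def extend(digest, op):
    # Fold the next operation into a running hash chain.
    return hashlib.sha256(digest + op.encode()).hexdigest().encode()

# Source and user independently chain the operations they issued/observed;
# at audit time, equal digests mean the responder served a consistent history.
ops_source = ["put k1 v1", "put k2 v2", "del k1"]
ops_user = ["put k1 v1", "put k2 v2", "del k1"]

d_source = d_user = b"genesis"
for op in ops_source:
    d_source = extend(d_source, op)
for op in ops_user:
    d_user = extend(d_user, op)
assert d_source == d_user          # audit passes

# A responder that replays a stale history (here, suppressing the delete)
# cannot produce the matching digest.
d_replay = b"genesis"
for op in ["put k1 v1", "put k2 v2"]:
    d_replay = extend(d_replay, op)
assert d_replay != d_user
```

Each party stores a single constant-size digest between audits, which fits the lightweight-operations goal stated above.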
Statistically hiding sets
In Proceedings of the Cryptographers' Track at the RSA Conference (CT-RSA 2009)
Abstract

Cited by 9 (0 self)
Zero-knowledge sets are a primitive introduced by Micali, Rabin, and Kilian (FOCS 2003) that enables a prover to commit a set to a verifier without revealing even the size of the set. Later the prover can give zero-knowledge proofs to convince the verifier of membership or non-membership of elements in the committed set. We present a new primitive called Statistically Hiding Sets (SHS), similar to zero-knowledge sets but providing an information-theoretic hiding guarantee rather than one based on efficient simulation. This is comparable to relaxing zero-knowledge proofs to witness-independent proofs. More precisely, we continue to use the simulation paradigm for our definition, but do not require the simulator (nor the distinguisher) to be efficient. We present a new scheme for statistically hiding sets that does not fit into the "Merkle-tree/mercurial-commitment" paradigm that has been used for all zero-knowledge set constructions so far. This not only provides efficiency gains compared to the best schemes in that paradigm, but also lets us provide statistical hiding; previous approaches required the prover to maintain growing amounts of state with each new proof to achieve this.
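The statistical-hiding notion can be illustrated with a toy Pedersen commitment, which hides perfectly even against unbounded adversaries (tiny parameters for illustration; the paper's SHS construction is different and commits to whole sets):

```python
import random

# Toy Pedersen commitment c = g^m * h^r in the order-q subgroup of Z_p^*.
# With r uniform in Z_q, c is uniformly distributed whatever m is, so an
# unbounded distinguisher learns nothing about m: statistical hiding.
p, q = 1019, 509                     # p = 2q + 1, both prime (toy sizes)
g, h = pow(2, 2, p), pow(3, 2, p)    # two subgroup generators; in a real
                                     # setup no one may know log_g(h)

def commit(m, r=None):
    r = random.randrange(q) if r is None else r
    return pow(g, m, p) * pow(h, r, p) % p, r

def verify(c, m, r):
    return c == pow(g, m, p) * pow(h, r, p) % p

c, r = commit(7)
assert verify(c, 7, r)
# The same c also opens to any other m' via a suitable r'; finding that r'
# requires log_g(h), which is why binding is only computational.
```

This hiding/binding trade-off (statistical hiding, computational binding) is exactly the flavor of guarantee SHS asks for at the level of sets.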
Concise Mercurial Vector Commitments and Independent Zero-Knowledge Sets with Short Proofs
Abstract

Cited by 6 (1 self)
Introduced by Micali, Rabin and Kilian (MRK), the basic primitive of zero-knowledge sets (ZKS) allows a prover to commit to a secret set S so as to be able to prove statements such as x ∈ S or x ∉ S. Chase et al. showed that ZKS protocols are underlain by a cryptographic primitive termed mercurial commitment. A (trapdoor) mercurial commitment has two commitment procedures. At committing time, the committer can choose not to commit to a specific message and instead generate a dummy value which it will later be able to softly open to any message, without being able to completely open it. Hard commitments, on the other hand, can be hard- or soft-opened to only one specific message. At Eurocrypt 2008, Catalano, Fiore and Messina (CFM) introduced an extension called trapdoor q-mercurial commitment (qTMC), which allows committing to a vector of q messages. These qTMC schemes are interesting since their openings w.r.t. specific vector positions can be short (ideally, the opening length should not depend on q), which yields zero-knowledge sets with much shorter proofs when such a commitment is combined with a Merkle tree of arity q. The CFM construction notably features short proofs of non-membership, as it makes use of a qTMC scheme with short soft openings. A problem left open is that hard openings still have size O(q), which prevents proofs of membership from being as compact as proofs of non-membership. In this paper, we solve this open problem and describe a new qTMC scheme where both hard and soft position-wise openings have constant size. We then show how our scheme is amenable to constructing independent zero-knowledge sets (i.e., ZKS schemes that prevent adversaries from correlating their set with the sets of honest provers, as defined by Gennaro and Micali). Our solution retains the short-proof property for this important primitive as well. Keywords: zero-knowledge databases, mercurial commitments, efficiency, independence.
Polynomial Commitments
Abstract

Cited by 4 (4 self)
We introduce and formally define polynomial commitment schemes, and provide two efficient constructions. A polynomial commitment scheme allows a committer to commit to a polynomial with a short string that can be used by a verifier to confirm claimed evaluations of the committed polynomial. Although the homomorphic commitment schemes in the literature can be used to achieve this goal, the sizes of their commitments are linear in the degree of the committed polynomial. In contrast, polynomial commitments in our schemes are of constant size (single elements). The overhead of opening a commitment is also constant; even opening multiple evaluations requires only a constant amount of communication overhead. Therefore, our schemes are useful tools for reducing the communication cost in cryptographic protocols. On that front, we apply our polynomial commitment schemes to four problems in cryptography: verifiable secret sharing, zero-knowledge sets, credentials, and content extraction signatures.
Certification and Authentication of Data Structures
Abstract

Cited by 4 (2 self)
We study query authentication schemes: algorithmic and cryptographic constructions that provide efficient and secure protocols for verifying the results of queries over structured data in untrusted or adversarial data-distribution environments. We formally define the problem in a new data query and authentication setting that involves general query types answered in the RAM model of computation, and put forward a new approach for designing secure query authentication schemes that, through the new concept of query certification, aims to authenticate the validity of the answer rather than the entire process that generates the answer. Our main results state that this new authentication framework achieves generality, namely that any query type admits a secure query authentication scheme, and also supports an important type of modularity, namely that the authentication of general queries based on the evaluation of relations over the data elements reduces to the authentication of set-membership queries. Thus, in addition to general possibility results under general assumptions and characterization results using existing cryptographic techniques, we contribute a clear separation between algorithmics and cryptography in data-authentication protocol design, and sufficient conditions for achieving super-efficient answer verification in time asymptotically less than the time needed to answer the query.