Results 1–10 of 10
Constant-Size Commitments to Polynomials and Their Applications
In Proceedings of ASIACRYPT 2010
Cited by 24 (9 self)
Abstract. We introduce and formally define polynomial commitment schemes, and provide two efficient constructions. A polynomial commitment scheme allows a committer to commit to a polynomial with a short string that can be used by a verifier to confirm claimed evaluations of the committed polynomial. Although the homomorphic commitment schemes in the literature can be used to achieve this goal, the sizes of their commitments are linear in the degree of the committed polynomial. On the other hand, polynomial commitments in our schemes are of constant size (single elements). The overhead of opening a commitment is also constant; even opening multiple evaluations requires only a constant amount of communication overhead. Therefore, our schemes are useful tools to reduce the communication cost in cryptographic protocols. On that front, we apply our polynomial commitment schemes to four problems in cryptography: verifiable secret sharing, zero-knowledge sets, credentials, and content extraction signatures.
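The constant-size evaluation witnesses in such schemes rest on a simple algebraic fact: φ(x) − φ(i) is always divisible by (x − i), and the quotient ψ(x) acts as the witness for the claimed value φ(i). A minimal sketch of just this arithmetic core (a toy prime field, not the actual pairing-based construction; all names here are illustrative):

```python
# Toy demonstration of the divisibility fact behind polynomial
# commitment witnesses: phi(x) - phi(i) = psi(x) * (x - i).
# Coefficient lists are lowest-degree first; arithmetic is mod P.

P = 97  # small prime modulus for the toy field

def poly_eval(coeffs, x):
    """Evaluate a polynomial at x via Horner's rule, mod P."""
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % P
    return acc

def witness_poly(coeffs, i):
    """Synthetic division by (x - i): return psi with
    phi(x) - phi(i) = psi(x) * (x - i)."""
    quotient = []
    acc = 0
    for c in reversed(coeffs):      # high-degree coefficients first
        acc = (acc * i + c) % P
        quotient.append(acc)
    quotient.pop()                  # the final value is the remainder phi(i)
    return list(reversed(quotient))

phi = [3, 1, 4, 1, 5]   # phi(x) = 3 + x + 4x^2 + x^3 + 5x^4
i = 7
psi = witness_poly(phi, i)

# Check the identity at an arbitrary point z:
z = 23
lhs = poly_eval(phi, z)
rhs = (poly_eval(psi, z) * (z - i) + poly_eval(phi, i)) % P
print(lhs == rhs)  # True
```

In the real scheme this identity is checked "in the exponent" with a bilinear pairing, so the verifier never sees ψ itself, only a single group element.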
Concise Mercurial Vector Commitments and Independent Zero-Knowledge Sets with Short Proofs
Cited by 10 (1 self)
Abstract. Introduced by Micali, Rabin and Kilian (MRK), the basic primitive of zero-knowledge sets (ZKS) allows a prover to commit to a secret set S so as to be able to prove statements such as x ∈ S or x ∉ S. Chase et al. showed that ZKS protocols are underlain by a cryptographic primitive termed mercurial commitment. A (trapdoor) mercurial commitment has two commitment procedures. At committing time, the committer can choose not to commit to a specific message and rather generate a dummy value which it will be able to softly open to any message without being able to completely open it. Hard commitments, on the other hand, can be hardly or softly opened to only one specific message. At Eurocrypt 2008, Catalano, Fiore and Messina (CFM) introduced an extension called trapdoor q-mercurial commitment (qTMC), which allows committing to a vector of q messages. These qTMC schemes are interesting since their openings w.r.t. specific vector positions can be short (ideally, the opening length should not depend on q), which provides zero-knowledge sets with much shorter proofs when such a commitment is combined with a Merkle tree of arity q. The CFM construction notably features short proofs of non-membership as it makes use of a qTMC scheme with short soft openings. A problem left open is that hard openings still have size O(q), which prevents proofs of membership from being as compact as those of non-membership. In this paper, we solve this open problem and describe a new qTMC scheme where hard and soft position-wise openings both have constant size. We then show how our scheme is amenable to constructing independent zero-knowledge sets (i.e., ZKS schemes that prevent adversaries from correlating their set to the sets of honest provers, as defined by Gennaro and Micali). Our solution retains the short-proof property for this important primitive as well. Keywords. Zero-knowledge databases, mercurial commitments, efficiency, independence.
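The proof-length savings from a tree of arity q come from its depth: a q-ary tree over n leaves has ⌈log_q n⌉ levels, and each level contributes q − 1 sibling values to a membership proof. A plain (non-mercurial) q-ary Merkle tree sketch illustrating only this structural point, with SHA-256 and zero-padding as assumed, illustrative choices:

```python
import hashlib

def h(*parts):
    """Hash a sequence of byte strings together with SHA-256."""
    m = hashlib.sha256()
    for p in parts:
        m.update(p)
    return m.digest()

def build_tree(leaves, q):
    """Build a q-ary Merkle tree; return the list of levels, leaves first."""
    level = [h(x) for x in leaves]
    levels = [level]
    while len(level) > 1:
        # zero-pad so the level splits evenly into groups of q
        padded = level + [b"\x00" * 32] * ((-len(level)) % q)
        level = [h(*padded[i:i + q]) for i in range(0, len(padded), q)]
        levels.append(level)
    return levels

def prove(levels, idx, q):
    """Membership proof: at each level, the q-1 sibling hashes plus
    the position of our node within its group."""
    proof = []
    for level in levels[:-1]:
        padded = level + [b"\x00" * 32] * ((-len(level)) % q)
        group = (idx // q) * q
        siblings = [padded[group + j] for j in range(q) if group + j != idx]
        proof.append((idx % q, siblings))
        idx //= q
    return proof

def verify(root, leaf, proof):
    """Recompute the root from a leaf and its sibling path."""
    node = h(leaf)
    for pos, siblings in proof:
        children = siblings[:pos] + [node] + siblings[pos:]
        node = h(*children)
    return node == root

leaves = [bytes([i]) for i in range(9)]
levels = build_tree(leaves, q=3)
root = levels[-1][0]
pf = prove(levels, 4, q=3)
print(verify(root, leaves[4], pf))  # True
```

A qTMC with position-wise openings that do not grow with q plays the role of the per-node hash here, which is what lets the arity be raised (shrinking the depth) without inflating each level's contribution to the proof.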
Theory and Application of Extractable Functions
2009
Cited by 7 (0 self)
We propose a new cryptographic primitive, called extractable functions. An extractable function guarantees that any machine that manages to output a point in the range of this function knows a corresponding preimage. We capture knowledge of preimage by way of algorithmic extraction. We formulate two main variants of extractability, namely non-interactive and interactive. The non-interactive variant can be regarded as a generalization from specific knowledge assumptions to a notion that is formulated in general computational terms. Indeed, we show how to realize it under several different assumptions. On the other hand, interactive extraction can be realized from certain perfectly one-way (POW) functions or verifiable secret-sharing (VSS) schemes. We then initiate a more general study of extractable functions aimed at understanding the concept of extractability in and of itself. In particular, we demonstrate that a weak notion of extraction implies a strong one, and make rigorous the intuition that extraction and obfuscation are complementary notions. We demonstrate the usefulness of the new primitive in two quite different settings.
Polynomial Commitments
Cited by 4 (4 self)
We introduce and formally define polynomial commitment schemes, and provide two efficient constructions. A polynomial commitment scheme allows a committer to commit to a polynomial with a short string that can be used by a verifier to confirm claimed evaluations of the committed polynomial. Although the homomorphic commitment schemes in the literature can be used to achieve this goal, the sizes of their commitments are linear in the degree of the committed polynomial. On the other hand, polynomial commitments in our schemes are of constant size (single elements). The overhead of opening a commitment is also constant; even opening multiple evaluations requires only a constant amount of communication overhead. Therefore, our schemes are useful tools to reduce the communication cost in cryptographic protocols. On that front, we apply our polynomial commitment schemes to four problems in cryptography: verifiable secret sharing, zero-knowledge sets, credentials, and content extraction signatures.
On Constant-Round Concurrent Zero-Knowledge from a Knowledge Assumption
2012
Cited by 2 (0 self)
In this work, we consider the long-standing open question of constructing constant-round concurrent zero-knowledge protocols in the plain model. Resolving this question is known to require non-black-box techniques. We consider non-black-box techniques for zero-knowledge based on knowledge assumptions, a line of thinking initiated by the work of Hada and Tanaka (CRYPTO 1998). Prior to our work, it was not known whether knowledge assumptions could be used for achieving security in the concurrent setting, due to a number of significant limitations that we discuss here. Nevertheless, we obtain the following results: 1. We obtain the first constant-round concurrent zero-knowledge argument for NP in the plain model based on a new variant of the knowledge-of-exponent assumption. Furthermore, our construction avoids the inefficiency inherent in previous non-black-box techniques such as those of Barak (FOCS 2001); we obtain our result through an efficient protocol compiler. 2. Unlike Hada and Tanaka, we do not require a knowledge assumption to argue the soundness
Primary-Secondary-Resolver Membership Proof Systems
2014
Cited by 1 (0 self)
We consider Primary-Secondary-Resolver Membership Proof Systems (PSR for short) and show different constructions of that primitive. A PSR system is a 3-party protocol with a primary, a trusted party that commits to a set of members and their values and then generates public and secret keys, so that secondaries (provers with knowledge of both keys) and resolvers (verifiers who only know the public key) can engage in interactive proof sessions regarding elements in the universe and their values. The motivation for such systems is constructing a secure Domain Name System (DNSSEC) that does not reveal any unnecessary information to its clients. We require our systems to be complete, so honest executions will result in correct conclusions by the resolvers; sound, so malicious secondaries cannot cheat resolvers; and zero-knowledge, so resolvers will not learn additional information about elements they did not query explicitly. Providing proofs of membership is easy, as the primary can simply precompute signatures over all the members of the set. Providing proofs of non-membership, i.e. a denial-of-existence mechanism, is trickier and is the main issue in constructing PSR systems.
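The classic denial-of-existence idea in DNSSEC (the NSEC approach) is for the trusted party to sort the set and sign each gap between consecutive names; absence of x is then proved by exhibiting the signed gap that straddles x. A toy sketch of just that mechanism, not of the paper's construction; HMAC stands in for a real signature scheme and all names are illustrative:

```python
import hmac, hashlib

KEY = b"primary-signing-key"   # hypothetical key held by the primary

def sign_gap(lo, hi):
    """Stand-in 'signature' over one gap (a real scheme would use
    public-key signatures so resolvers need no secret)."""
    return hmac.new(KEY, lo + b"|" + hi, hashlib.sha256).digest()

def precompute(names):
    """Primary: sort the names and sign every adjacent gap,
    with minimal/maximal sentinels closing the chain."""
    chain = [b"\x00"] + sorted(names) + [b"\xff"]
    return [(chain[i], chain[i + 1], sign_gap(chain[i], chain[i + 1]))
            for i in range(len(chain) - 1)]

def prove_absent(gaps, x):
    """Secondary: find the signed gap with lo < x < hi."""
    for lo, hi, sig in gaps:
        if lo < x < hi:
            return lo, hi, sig
    return None  # x is a member; no denial proof exists

def verify_absent(x, proof):
    """Resolver: check the signature and that x lies strictly inside."""
    lo, hi, sig = proof
    return hmac.compare_digest(sig, sign_gap(lo, hi)) and lo < x < hi

gaps = precompute([b"alpha", b"delta", b"zeta"])
proof = prove_absent(gaps, b"beta")
print(verify_absent(b"beta", proof))  # True
print(prove_absent(gaps, b"delta"))   # None: "delta" is a member
```

Note that each gap proof reveals the two neighboring names, which is exactly the kind of unnecessary disclosure the zero-knowledge requirement above rules out; avoiding that leakage is what makes PSR constructions subtler than this sketch.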
On Invertible Sampling and Adaptive Security
Abstract. Secure multiparty computation (MPC) is one of the most general and well-studied problems in cryptography. We focus on MPC protocols that are required to be secure even when the adversary can adaptively corrupt parties during the protocol, and under the assumption that honest parties cannot reliably erase their secrets prior to corruption. Previous feasibility results for adaptively secure MPC in this setting applied either to deterministic functionalities or to randomized functionalities which satisfy a certain technical requirement. The question whether adaptive security is possible for all functionalities was left open. We provide the first convincing evidence that the answer to this question is negative, namely that some (randomized) functionalities cannot be realized with adaptive security. We obtain this result by studying the following related invertible sampling problem: given an efficient sampling algorithm A, obtain another sampling algorithm B such that the output of B is computationally indistinguishable from the output of A, but B can be efficiently inverted (even if A cannot). This invertible sampling problem is independently motivated by other cryptographic applications. We show, under strong but well-studied assumptions, that there exist efficient sampling algorithms A for which invertible sampling as above is impossible. At the same time, we show that a general feasibility result for adaptively secure MPC implies that invertible sampling is possible for every A, thereby reaching a contradiction and establishing our main negative result.
Recursive composition and bootstrapping . . .
2012
Succinct non-interactive arguments of knowledge (SNARKs), and their generalization to distributed computations by proof-carrying data (PCD), are powerful tools for enforcing the correctness of computations in dynamic networks with multiple mutually untrusting parties, with essentially minimal computational overhead. Current constructions achieve only variants with expensive setup, restricted functionality, or oracles. We present recursive composition and bootstrapping techniques that: 1. Transform any SNARK with an expensive preprocessing phase into a SNARK without such a phase. 2. Transform any SNARK into a PCD system for constant-depth distributed computations. 3. Transform any PCD system for constant-depth distributed computations into a PCD system for distributed computation over paths of fixed polynomial length. Our transformations apply to both the public- and private-verification settings, and assume the existence of CRHs (and FHE, for the private-verification setting). By plugging into our transformations the NIZKs of [Groth, ASIACRYPT ’10], whose security is based on a Knowledge of Exponent assumption in bilinear groups, we obtain the first publicly-verifiable
Recursive composition and bootstrapping for SNARKs . . .
2012
Succinct non-interactive arguments of knowledge (SNARKs), and their generalization to distributed computations by proof-carrying data (PCD), are powerful tools for enforcing the correctness of dynamically evolving computations among multiple mutually untrusting parties. We present recursive composition and bootstrapping techniques that: 1. Transform any SNARK with an expensive preprocessing phase into a SNARK without such a phase. 2. Transform any SNARK into a PCD system for constant-depth distributed computations. 3. Transform any PCD system for constant-depth distributed computations into a PCD system for distributed computation over paths of fixed polynomial length. Our transformations apply to both the public- and private-verification settings, and assume the existence of CRHs; for the private-verification setting, we additionally assume FHE. By applying our transformations to the NIZKs of [Groth, ASIACRYPT ’10], whose security is based on a Knowledge of Exponent assumption in bilinear groups, we obtain the first publicly-verifiable SNARKs and PCD without preprocessing in the plain model. (Previous constructions were either in the random-oracle model [Micali, FOCS ’94] or in a signature-oracle model [Chiesa and Tromer, ICS ’10].) Interestingly,