Results 1–10 of 51
SNARKs for C: Verifying program executions succinctly and in zero knowledge
 In Proceedings of CRYPTO 2013, LNCS
Abstract

Cited by 28 (1 self)
An argument system for NP is a proof system that allows efficient verification of NP statements, given proofs produced by an untrusted yet computationally-bounded prover. Such a system is non-interactive and publicly-verifiable if, after a trusted party publishes a proving key and a verification key, anyone can use the proving key to generate non-interactive proofs for adaptively-chosen NP statements, and proofs can be verified by anyone by using the verification key. We present an implementation of a publicly-verifiable non-interactive argument system for NP. The system, moreover, is a zero-knowledge proof-of-knowledge. It directly proves correct executions of programs on TinyRAM, a random-access machine tailored for efficient verification of nondeterministic computations. Given a program P and time bound T, the system allows for proving correct execution of P, on any input x, for up to T steps, after a one-time setup requiring Õ(|P| · T) cryptographic operations. An honest prover requires Õ(|P| · T) cryptographic operations to generate such a proof, while proof verification can be performed with only O(|x|) cryptographic operations. This system can be used to prove the correct execution of C programs, using our TinyRAM port of the GCC compiler. This yields a zero-knowledge Succinct Non-interactive ARgument of Knowledge (zk-SNARK) for ...
Public Key Cryptography from Different Assumptions
, 2008
Abstract

Cited by 22 (4 self)
We construct a new public-key encryption scheme based on two assumptions: 1. One can obtain a pseudorandom generator with small locality by connecting the outputs to the inputs using any sufficiently good unbalanced expander. 2. It is hard to distinguish between a random graph that is such an expander and a random graph where a (planted) random logarithmic-sized subset S of the outputs is connected to fewer than |S| inputs. The validity and strength of the assumptions raise interesting new algorithmic and pseudorandomness questions, and we explore their relation to the current state of the art.
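Assumption 1's small-locality generator can be sketched in a few lines: each output bit depends on only d input bits, selected by a bipartite graph. The XOR predicate, the sizes, and the use of an uncertified random graph are our illustrative choices; real instantiations need a graph with certified expansion and a suitable predicate.

```python
import random

def make_graph(n_inputs, n_outputs, d, seed=0):
    """Random bipartite graph: each output is wired to d distinct inputs.
    A random graph is merely *likely* to be a good unbalanced expander."""
    rng = random.Random(seed)
    return [rng.sample(range(n_inputs), d) for _ in range(n_outputs)]

def local_prg(graph, x):
    """Locality-d generator: each output bit is the XOR of its d neighbors."""
    return [sum(x[i] for i in nbrs) % 2 for nbrs in graph]

n, m, d = 32, 64, 5          # m > n: the generator stretches its seed
g = make_graph(n, m, d)
rng = random.Random(1)
seed_bits = [rng.randrange(2) for _ in range(n)]
out = local_prg(g, seed_bits)
assert len(out) == m and set(out) <= {0, 1}
```

Assumption 2 then says that planting a small output subset with too few neighbors is hard to detect, which is what the encryption scheme exploits.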
Succinct non-interactive arguments via linear . . .
, 2012
Abstract

Cited by 19 (2 self)
Succinct non-interactive arguments (SNARGs) enable verifying NP statements with lower complexity than required for classical NP verification. Traditionally, the focus has been on minimizing the length of such arguments; nowadays researchers have focused also on minimizing verification time, drawing motivation from the problem of delegating computation. A common relaxation is a preprocessing SNARG, which allows the verifier to conduct an expensive offline phase that is independent of the statement to be proven later. Recent constructions of preprocessing SNARGs have achieved attractive features: they are publicly-verifiable, proofs consist of only O(1) encrypted (or encoded) field elements, and verification is via arithmetic circuits of size linear in the NP statement. Additionally, these constructions seem to have “escaped the hegemony” of probabilistically-checkable proofs (PCPs) as a basic building block of succinct arguments. We present ...
Analytical Approach to Parallel Repetition
, 2013
Abstract

Cited by 12 (1 self)
We propose an analytical framework for studying parallel repetition, a basic product operation for one-round two-player games. In this framework, we consider a relaxation of the value of a game, val+, and prove that for projection games, it is both multiplicative (under parallel repetition) and a good approximation for the true value. These two properties imply a parallel repetition bound, as val(G^{⊗k}) ≈ val+(G^{⊗k}) = val+(G)^k ≈ val(G)^k. Using this framework, we can also give a short proof of the NP-hardness of Label-Cover(1, δ) for all δ > 0, starting from the basic PCP theorem. We prove the following new results: – A parallel repetition bound for projection games with low soundness. Previously, it was not known whether parallel repetition decreases the value of such games. This result implies stronger inapproximability bounds for Set Cover and Label Cover.
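The objects in this abstract are concrete enough to brute-force for a tiny game. The CHSH-style predicate below is our toy choice (it is not a projection game, so the paper's bounds do not literally apply); the code computes the value of the game and of its 2-fold repetition, and checks only the generic sandwich val(G)² ≤ val(G ⊗ G) ≤ val(G).

```python
from itertools import product

QS = [0, 1]               # questions to each player (uniform over pairs)
AS = [0, 1]               # answers

def pred(u, v, a, b):
    # CHSH-style winning predicate: a XOR b == u AND v.
    return (a ^ b) == (u & v)

def value(qs, ans, predicate):
    """Exact game value by exhausting all deterministic strategy pairs."""
    idx = {q: i for i, q in enumerate(qs)}
    pairs = [(u, v) for u in qs for v in qs]
    best = 0.0
    strategies = list(product(ans, repeat=len(qs)))
    for fa in strategies:
        for fb in strategies:
            wins = sum(predicate(u, v, fa[idx[u]], fb[idx[v]]) for u, v in pairs)
            best = max(best, wins / len(pairs))
    return best

# 2-fold parallel repetition: questions/answers become pairs,
# and the players must win both coordinates simultaneously.
qs2 = [(u1, u2) for u1 in QS for u2 in QS]
as2 = [(a1, a2) for a1 in AS for a2 in AS]
def pred2(u, v, a, b):
    return pred(u[0], v[0], a[0], b[0]) and pred(u[1], v[1], a[1], b[1])

v1 = value(QS, AS, pred)
v2 = value(qs2, as2, pred2)
assert v1 ** 2 <= v2 <= v1   # val(G)^2 <= val(G x G) <= val(G)
```

The framework's contribution is showing how fast the upper end of this sandwich decays with k for projection games, which brute force cannot reveal asymptotically.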
Sound 3-query PCPPs are long
, 2008
Abstract

Cited by 9 (3 self)
We initiate the study of the trade-off between the length of a probabilistically checkable proof of proximity (PCPP) and the maximal soundness that can be guaranteed by a 3-query verifier with oracle access to the proof. Our main observation is that a verifier limited to querying a short proof cannot obtain the same soundness as that obtained by a verifier querying a long proof. Moreover, we quantify the soundness deficiency as a function of the proof length and show that any verifier obtaining “best possible” soundness must query an exponentially long proof. In terms of techniques, we focus on the special class of inspective verifiers that read at most 2 proof bits per invocation. For such verifiers we prove exponential length–soundness trade-offs that are later used to imply our main results for the case of general (i.e., not necessarily inspective) verifiers. To prove the exponential trade-off for inspective verifiers we show a connection between PCPP proof length and property-testing query complexity that may be of independent interest. The connection is that any linear property that can be verified with proofs of length ℓ by linear inspective verifiers must be testable with query complexity ≈ log ℓ.
Independent set, induced matching, and pricing: connections and tight (subexponential time) approximation hardnesses
 CoRR abs/1308.2617
, 2013
Sub-exponential and FPT-time inapproximability of independent set and related problems
 In IPEC
, 2013
Fixed-Parameter and Approximation Algorithms: A New Look
Abstract

Cited by 4 (1 self)
A Fixed-Parameter Tractable (FPT) ρ-approximation algorithm for a minimization (resp. maximization) parameterized problem P is an FPT algorithm that, given an instance (x, k) ∈ P, computes a solution of cost at most k · ρ(k) (resp. k/ρ(k)) if a solution of cost at most (resp. at least) k exists; otherwise the output can be arbitrary. For well-known intractable problems such as the W[1]-hard Clique and W[2]-hard Set Cover problems, the natural question is whether we can get any FPT approximation. It is widely believed that both Clique and Set Cover admit no FPT ρ-approximation algorithm, for any increasing function ρ. However, to the best of our knowledge, there has been no progress towards proving this conjecture. Assuming standard conjectures such as the Exponential Time Hypothesis (ETH) [18] and the Projection Games Conjecture (PGC) [27], we make the first progress towards proving this conjecture by showing that – Under the ETH and PGC, there exist constants F1, F2 > 0 such that the Set Cover problem does not admit an FPT approximation algorithm with ratio k^{F1} in 2^{k^{F2}} · poly(N, M) time, where N is the size of the universe and M is the number of sets. – Unless NP ⊆ SUBEXP, for every 0 < δ < 1 there exists a constant F(δ) > 0 such that Clique has no FPT cost approximation with ratio k^{1−δ} in 2^{k^{F}} · poly(n) time, where n is the number of vertices in the graph. In the second part of the paper we consider various W[1]-hard problems ...
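The definition in this abstract can be made concrete with a problem that, unlike Clique or Set Cover, does admit such an algorithm: the classic maximal-matching 2-approximation for Vertex Cover, wrapped (our framing, for illustration) as an FPT ρ-approximation with the constant function ρ(k) = 2.

```python
def fpt_approx_vertex_cover(edges, k):
    """Given (G, k), return a vertex cover of size <= 2k whenever a cover
    of size <= k exists; otherwise the output may be arbitrary (here None)."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))   # take both endpoints of a matching edge
    # The greedy edges form a matching, so |cover| <= 2*opt <= 2k when opt <= k.
    return cover if len(cover) <= 2 * k else None

edges = [(1, 2), (2, 3), (3, 4), (4, 5)]   # path graph; opt = {2, 4}, cost 2
c = fpt_approx_vertex_cover(edges, k=2)
assert c is not None and all(u in c or v in c for u, v in edges)
```

The abstract's point is precisely that no analogue of this wrapper, with any increasing ρ, is believed to exist for Clique or Set Cover.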
New direct-product testers and 2-query PCPs
 In Proceedings of the Forty-First Annual ACM Symposium on Theory of Computing
, 2009
Abstract

Cited by 4 (1 self)
The “direct product code” of a function f gives its values on all k-tuples (f(x1), ..., f(xk)). This basic construct underlies “hardness amplification” in cryptography, circuit complexity and PCPs. Goldreich and Safra [GS00] pioneered its local testing and its PCP application. A recent result by Dinur and Goldenberg [DG08] enabled for the first time testing proximity to this important code in the “list-decoding” regime. In particular, they give a 2-query test which works for polynomially small success probability 1/k^α, and show that no such test works below success probability 1/k. Our main result is a 3-query test which works for exponentially small success probability exp(−k^α). Our techniques (based on recent simplified decoding algorithms for the same code [IJKW08]) also allow us to considerably simplify the analysis of the 2-query test of [DG08]. We then show how to derandomize their test, achieving a code of polynomial rate, independent of k, and success probability 1/k^α. Finally we show the applicability of the new tests to PCPs. Starting with a 2-query PCP over an alphabet Σ and with soundness error 1 − δ, Rao [Rao08] (building on Raz’s (k-fold) ...
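The 2-query consistency test at the heart of this line of work can be sketched under our own simplifications (an honest oracle and a fixed overlap size): query the purported direct-product table on two overlapping k-tuples and accept iff the answers agree on the overlap. The hard part, which this toy does not capture, is the soundness analysis against adversarial tables.

```python
import random

def dp_encode(f):
    """Honest direct-product oracle: answers any tuple with f's true values."""
    return lambda subset: tuple(f(x) for x in subset)

def two_query_test(oracle, universe, k, overlap, rng):
    """Pick two k-tuples sharing `overlap` points; accept iff the two
    answers agree on the shared coordinates."""
    common = rng.sample(universe, overlap)
    rest = [x for x in universe if x not in common]
    s1 = tuple(common) + tuple(rng.sample(rest, k - overlap))
    s2 = tuple(common) + tuple(rng.sample(rest, k - overlap))
    a1, a2 = oracle(s1), oracle(s2)
    return a1[:overlap] == a2[:overlap]

rng = random.Random(0)
universe = list(range(50))
oracle = dp_encode(lambda x: (x * x + 1) % 7)
# An honest encoding passes every invocation of the test.
assert all(two_query_test(oracle, universe, 5, 2, rng) for _ in range(100))
```

A cheating table can answer different tuples inconsistently; bounding how often it can still pass, down to 1/k^α or exp(−k^α) acceptance probability, is exactly what these papers analyze.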
On the Optimality of Semidefinite Relaxations for Average-Case and Generalized Constraint
, 2012
Abstract

Cited by 3 (0 self)
This work studies several questions about the optimality of semidefinite programming (SDP) for constraint satisfaction problems (CSPs). First we propose the hypothesis that the well-known Basic SDP relaxation is actually optimal for random instances of constraint satisfaction problems for every predicate. This unifies several conjectures proposed in the past, and suggests a unifying principle for the average-case complexity of CSPs. We provide several types of indirect evidence for the truth of this hypothesis, and also show that it (and its variants) imply several conjectures in hardness of approximation, including polynomial-factor hardness for the Densest k-Subgraph problem and hard instances for the Sliding Scale Conjecture of Bellare, Goldwasser, Lund and Russell (1993). Second, we observe that for every predicate P, the Basic SDP relaxation achieves the same approximation guarantee for the CSP for P and for a more general problem (involving not just Boolean but constrained vector ...
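For concreteness, on Max-Cut (a Boolean 2-ary CSP) the Basic SDP relaxation specializes to the familiar Goemans–Williamson program over unit vectors; the Max-Cut instantiation is our example, not one singled out by this abstract:

```latex
\max \;\; \frac{1}{|E|} \sum_{(i,j) \in E} \frac{1 - \langle v_i, v_j \rangle}{2}
\qquad \text{s.t.} \quad \|v_i\|_2 = 1 \;\; \forall i \in V,\; v_i \in \mathbb{R}^{|V|}.
```

The hypothesis above asserts that, on random instances, no polynomial-time algorithm beats the guarantee of this kind of relaxation, for any predicate in place of the cut predicate.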