Results 1–10 of 61
Proof verification and hardness of approximation problems
 In Proc. 33rd Ann. IEEE Symp. on Foundations of Computer Science, 1992
Abstract

Cited by 770 (38 self)
We show that every language in NP has a probabilistic verifier that checks membership proofs for it using a logarithmic number of random bits and by examining a constant number of bits in the proof. If a string is in the language, then there exists a proof such that the verifier accepts with probability 1 (i.e., for every choice of its random string). For strings not in the language, the verifier rejects every provided "proof" with probability at least 1/2. Our result builds upon and improves a recent result of Arora and Safra [6], whose verifiers examine a nonconstant number of bits in the proof (though this number is a very slowly growing function of the input length). As a consequence we prove that no MAX SNP-hard problem has a polynomial-time approximation scheme, unless NP = P. The class MAX SNP was defined by Papadimitriou and Yannakakis [82], and hard problems for this class include vertex cover, maximum satisfiability, maximum cut, metric TSP, Steiner trees, and shortest superstring. We also improve upon the clique hardness results of Feige, Goldwasser, Lovász, Safra and Szegedy [42], and Arora and Safra [6], and show that there exists a positive ε such that approximating the maximum clique size in an N-vertex graph to within a factor of N^ε is NP-hard.
Property Testing and its connection to Learning and Approximation
Abstract

Cited by 469 (67 self)
We study the question of determining whether an unknown function has a particular property or is ε-far from any function with that property. A property testing algorithm is given a sample of the values of the function on instances drawn according to some distribution, and possibly may query the function on instances of its choice. First, we establish some connections between property testing and problems in learning theory. Next, we focus on testing graph properties, and devise algorithms to test whether a graph has properties such as being k-colorable or having a ρ-clique (a clique of density ρ w.r.t. the vertex set). Our graph property testing algorithms are probabilistic and make assertions which are correct with high probability, utilizing only poly(1/ε) edge queries into the graph, where ε is the distance parameter. Moreover, the property testing algorithms can be used to efficiently (i.e., in time linear in the number of vertices) construct partitions of the graph which corre...
Free Bits, PCPs and Non-Approximability: Towards Tight Results
1996
Abstract

Cited by 212 (39 self)
This paper continues the investigation of the connection between proof systems and approximation. The emphasis is on proving tight non-approximability results via consideration of measures like the "free bit complexity" and the "amortized free bit complexity" of proof systems.
Testing Monotonicity
1999
Abstract

Cited by 78 (16 self)
We present a (randomized) test for monotonicity of Boolean functions. Namely, given the ability to query an unknown function f : {0,1}^n → {0,1} at arguments of its choice, the test always accepts a monotone f, and rejects f with high probability if it is ε-far from being monotone (i.e., every monotone function differs from f on more than an ε fraction of the domain).
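To make the flavor of such a tester concrete, here is a minimal sketch of the standard "edge test" idea: sample a random point, flip one random coordinate, and check the resulting comparable pair for a monotonicity violation. This is only an illustration of the kind of test the abstract describes, not the paper's exact algorithm; the name `edge_test` and the trial count are our own.

```python
import random

def edge_test(f, n, trials=100):
    """Sketch of a monotonicity edge test for f : {0,1}^n -> {0,1}.

    Samples random comparable pairs differing in one coordinate.
    A monotone f is always accepted; a function far from monotone
    is rejected with probability growing in the number of trials.
    """
    for _ in range(trials):
        x = [random.randint(0, 1) for _ in range(n)]
        i = random.randrange(n)           # coordinate to vary
        lo, hi = list(x), list(x)
        lo[i], hi[i] = 0, 1               # comparable pair: lo <= hi
        if f(tuple(lo)) > f(tuple(hi)):   # violation: f(lo)=1, f(hi)=0
            return False                  # reject
    return True                           # accept

# Example: OR of the bits is monotone and always passes.
monotone_or = lambda x: max(x)
assert edge_test(monotone_or, 5)
```

With n = 1 the single comparable pair is queried on every trial, so a function such as `lambda x: 1 - x[0]` is rejected with certainty.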
Combinatorial Property Testing (a survey)
 In: Randomization Methods in Algorithm Design, 1998
Abstract

Cited by 51 (2 self)
We consider the question of determining whether a given object has a predetermined property or is "far" from any object having the property. Specifically, objects are modeled by functions, and distance between functions is measured as the fraction of the domain on which the functions differ. We consider (randomized) algorithms which may query the function at arguments of their choice, and seek algorithms which query the function at relatively few places. We focus on combinatorial properties, and specifically on graph properties. The two standard representations of graphs, by adjacency matrices and by incidence lists, yield two different models for testing graph properties. In the first model, most appropriate for dense graphs, distance between N-vertex graphs is measured as the fraction of edges on which the graphs disagree over N^2. In the second model, most appropriate for bounded-degree graphs, distance between N-vertex degree-d graphs is measured as the fraction of edges on ...
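The dense-graph distance measure described above is easy to state in code. The following sketch (our own helper, `dense_distance`, not from the survey) computes the fraction of adjacency-matrix entries on which two N-vertex graphs disagree, normalized by N^2:

```python
def dense_distance(A, B):
    """Distance between two N-vertex graphs in the adjacency-matrix
    (dense) model: the fraction of the N^2 matrix entries on which
    the two graphs disagree."""
    n = len(A)
    diff = sum(A[i][j] != B[i][j] for i in range(n) for j in range(n))
    return diff / (n * n)

# Two 3-vertex graphs differing in one edge (two symmetric entries):
A = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]  # triangle
B = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]  # path, edge (0,2) removed
```

Here `dense_distance(A, B)` is 2/9, since one undirected edge corresponds to two symmetric matrix entries out of nine.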
Improved Testing Algorithms for Monotonicity
1999
Abstract

Cited by 48 (12 self)
We present improved algorithms for testing monotonicity of functions. Namely, given the ability to query an unknown function f : Σ^n → Ξ, where Σ and Ξ are finite ordered sets, the test always accepts a monotone f, and rejects f with high probability if it is ε-far from being monotone (i.e., every monotone function differs from f on more than an ε fraction of the domain). For any ε > 0, the query and time complexities of the test are O((n/ε) · log|Σ| · log|Ξ|). The previous best known bound was ...
Testing low-degree polynomials over GF(2)
 In Proceedings of RANDOM-APPROX, 2003
Abstract

Cited by 45 (8 self)
We describe an efficient randomized algorithm to test if a given binary function f : {0, 1}^n → {0, 1} is a low-degree polynomial (that is, a sum of low-degree monomials). For a given integer k ≥ 1 and a given real ε > 0, the algorithm queries f at O(1/ε + k·4^k) points. If f is a polynomial of degree at most k, the algorithm always accepts, and if the value of f has to be modified on at least an ε fraction of all inputs in order to transform it to such a polynomial, then the algorithm rejects with probability at least 2/3. Our result is essentially tight: any algorithm for testing degree-k polynomials over GF(2) must perform Ω(1/ε + 2^k) queries.
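The test rests on a standard fact: a polynomial over GF(2) of degree at most k has vanishing (k+1)-st finite derivatives, so the sum of f over a random (k+1)-dimensional flat must be 0 mod 2. The sketch below checks exactly that condition; it is a simplified illustration of this kind of test under our own naming (`degree_k_test`), not the paper's exact procedure or its query-count optimization.

```python
import random
from itertools import combinations

def degree_k_test(f, n, k, trials=20):
    """Sketch of a degree-k test over GF(2) for f : {0,1}^n -> {0,1}.

    Each trial picks a random base point x and directions y_1..y_{k+1},
    then checks that the XOR of f over all 2^(k+1) points
    x + sum_{i in S} y_i (S ranging over subsets of {1..k+1}) is zero,
    which holds identically when deg(f) <= k.
    """
    for _ in range(trials):
        x = [random.randint(0, 1) for _ in range(n)]
        ys = [[random.randint(0, 1) for _ in range(n)]
              for _ in range(k + 1)]
        total = 0
        for r in range(k + 2):
            for subset in combinations(range(k + 1), r):
                point = list(x)
                for i in subset:  # shift x along chosen directions
                    point = [a ^ b for a, b in zip(point, ys[i])]
                total ^= f(tuple(point))
        if total != 0:
            return False   # a nonzero derivative witnesses deg(f) > k
    return True
```

For example, `f(x) = x[0] AND x[1]` has degree 2 and always passes with k = 2, while with k = 1 a random trial detects its nonzero second derivative with constant probability.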
Making argument systems for outsourced computation practical (sometimes)
 In NDSS, 2012
Abstract

Cited by 36 (7 self)
This paper describes the design, implementation, and evaluation of a system for performing verifiable outsourced computation. It has long been known that (1) this problem can be solved in theory using probabilistically checkable proofs (PCPs) coupled with modern cryptographic tools, and (2) these solutions have wholly impractical performance, according to the conventional (and well-founded) wisdom. Our goal is to challenge (2), with a built system that implements an argument system based on PCPs. We describe a general-purpose system that builds on work of Ishai et al. (CCC ’07) and incorporates new theoretical work to improve performance by 20 orders of magnitude. The system is (arguably) practical in some cases, suggesting that, as a tool for building secure systems, PCPs are not a lost cause.
Taking proofbased verified computation a few steps closer to practicality
 In USENIX Security, 2012
Abstract

Cited by 27 (6 self)
We describe GINGER, a built system for unconditional, general-purpose, and nearly practical verification of outsourced computation. GINGER is based on PEPPER, which uses the PCP theorem and cryptographic techniques to implement an efficient argument system (a kind of interactive protocol). GINGER slashes the query size and costs via theoretical refinements that are of independent interest; broadens the computational model to include (primitive) floating-point fractions, inequality comparisons, logical operations, and conditional control flow; and includes a parallel GPU-based implementation that dramatically reduces latency.