Calibrating noise to sensitivity in private data analysis (2006)

by Cynthia Dwork, Frank McSherry, Kobbi Nissim, Adam Smith
Venue: Proceedings of the 3rd Theory of Cryptography Conference
Citations: 645 (60 self)

BibTeX

@INPROCEEDINGS{Dwork06calibratingnoise,
    author = {Cynthia Dwork and Frank McSherry and Kobbi Nissim and Adam Smith},
    title = {Calibrating noise to sensitivity in private data analysis},
    booktitle = {Proceedings of the 3rd Theory of Cryptography Conference},
    year = {2006},
    pages = {265--284},
    publisher = {Springer}
}


Abstract

Abstract. We continue a line of research initiated in [10, 11] on privacy-preserving statistical databases. Consider a trusted server that holds a database of sensitive information. Given a query function f mapping databases to reals, the so-called true answer is the result of applying f to the database. To protect privacy, the true answer is perturbed by the addition of random noise generated according to a carefully chosen distribution, and this response, the true answer plus noise, is returned to the user. Previous work focused on the case of noisy sums, in which f = Σ_i g(x_i), where x_i denotes the ith row of the database and g maps database rows to [0, 1]. We extend the study to general functions f, proving that privacy can be preserved by calibrating the standard deviation of the noise according to the sensitivity of the function f. Roughly speaking, this is the amount that any single argument to f can change its output. The new analysis shows that for several particular applications substantially less noise is needed than was previously understood to be the case. The first step is a very clean characterization of privacy in terms of indistinguishability of transcripts. Additionally, we obtain separation results showing the increased value of interactive sanitization mechanisms over non-interactive ones.

1 Introduction

We continue a line of research initiated in [10, 11] on privacy in statistical databases. A statistic is a quantity computed from a sample. Intuitively, if the database is a representative sample of an underlying population, the goal of a privacy-preserving statistical database is to enable the user to learn properties of the population as a whole while protecting the privacy of the individual contributors.
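The noisy-sum mechanism described in the abstract can be sketched in a few lines. This is an illustrative sketch, not the paper's implementation: the function names (`sample_laplace`, `noisy_sum`) and the use of Laplace-distributed noise with scale sensitivity/ε are assumptions for the example; the paper's formal treatment of sensitivity and the privacy parameter is more general.

```python
import math
import random

def sample_laplace(scale):
    """Draw from Laplace(0, scale) by inverse-CDF sampling."""
    u = random.random() - 0.5  # uniform on (-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def noisy_sum(database, g, epsilon):
    """Answer f(x) = sum_i g(x_i) with noise calibrated to f's sensitivity.

    Because g maps each row into [0, 1], changing any single row moves
    the sum by at most 1, so the sensitivity of f is 1 and the noise
    scale is sensitivity / epsilon.
    """
    sensitivity = 1.0
    true_answer = sum(g(row) for row in database)
    return true_answer + sample_laplace(sensitivity / epsilon)
```

The returned value is the "true answer plus noise" response: unbiased around f(x), with the spread of the noise growing as ε shrinks (stronger privacy) and as the sensitivity of f grows.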

