Privacy in Data Mining Using Formal Methods

by Stan Matwin, Amy Felty, István Hernádvölgyi, Venanzio Capretta

BibTeX

@MISC{Matwin_privacyin,
    author = {Stan Matwin and Amy Felty and István Hernádvölgyi and Venanzio Capretta},
    title = {Privacy in Data Mining Using Formal Methods},
    year = {}
}


Abstract

There is growing public concern about personal data collected by both the private and public sectors. People have very little control over what kinds of data are stored and how those data are used. Moreover, the ability to infer new knowledge from existing data is increasing rapidly with advances in database and data mining technologies. We describe a solution that allows people to take control by specifying constraints on the ways in which their data can be used. User constraints are represented in formal logic, and organizations that want to use the data provide formal proofs that the software they use to process it meets these constraints. Checking the proof with an independent verifier demonstrates whether user constraints are respected by this software. Our notion of “privacy correctness” differs from general software correctness in two ways. First, the properties of interest are simpler, so their proofs should be easier to automate. Second, this kind of correctness is stricter: in addition to showing that a certain relation between input and output is realized, we must also show that only operations respecting the privacy constraints are applied during execution. We therefore have an intensional notion of correctness, rather than the usual extensional one. We discuss how our mechanism can be put into practice, and we present the technical aspects via an example. Our example shows how users can exercise control when their data is to be used as input to a decision tree learning algorithm. We have formalized the example and the proof of preservation of privacy constraints in Coq.
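The paper's formalization is in Coq, but the intensional idea in the abstract — checking that only permitted operations are ever applied to a user's data, independent of the input/output relation — can be sketched in Python. This is an illustrative sketch only; the names (`PrivateValue`, `PrivacyViolation`, the `"compare"`/`"disclose"` operation labels) are assumptions of this example, not taken from the paper.

```python
# Sketch of intensional privacy checking: every field value carries
# the set of operations its owner permits, and each operation is
# checked against that set before it runs. All names here are
# illustrative, not from the paper's Coq development.
from dataclasses import dataclass, field


class PrivacyViolation(Exception):
    """Raised when software tries an operation the user did not permit."""


@dataclass
class PrivateValue:
    """A data value tagged with the operations its owner allows."""
    value: object
    allowed_ops: set = field(default_factory=set)

    def apply(self, op_name, fn):
        # Intensional check: the operation itself must be permitted,
        # regardless of what the output would reveal.
        if op_name not in self.allowed_ops:
            raise PrivacyViolation(f"operation {op_name!r} not permitted")
        return fn(self.value)


# A user allows their age to be used in threshold comparisons (as a
# decision-tree split test would) but not read out verbatim.
age = PrivateValue(37, allowed_ops={"compare"})

print(age.apply("compare", lambda v: v < 40))  # split test: permitted
try:
    age.apply("disclose", lambda v: v)         # raw read: rejected
except PrivacyViolation as e:
    print("blocked:", e)
```

A decision tree learner restricted to calling `apply("compare", ...)` on such values could then, in the paper's setting, be accompanied by a proof that no other operation occurs during execution — which is exactly the stricter, intensional correctness property the abstract describes.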

Keyphrases

data mining using formal methods, user constraints, privacy constraints, privacy correctness, general software correctness, intensional notion, extensional notion, independent verifier, formal proof, formal logic, decision tree, personal data, data mining technology, public sector, public concern, new knowledge, technical aspects
