
## Inferential or Differential: Privacy Laws Dictate

### Citations

647 | Differential privacy
- Dwork
- 2006
Citation Context ...l analysis on differential privacy with two results: (i) the differential privacy mechanism does not provide inferential privacy, (ii) the impossibility result about achieving Dalenius’s privacy goal [5] is based on an adversary simulated by a Turing machine, but a human adversary may behave differently; consequently, the practical implication of the impossibility result remains unclear. The second p...

645 | Calibrating noise to sensitivity in private data analysis
- Dwork, McSherry, et al.
Citation Context ...nformation about a target record from other records in the database. See [1] for a list of works in this field. One recent breakthrough in the study of privacy preservation is differential privacy [5][7]. In an “impossibility result”, the authors of [5][7] showed that it is impossible to achieve Dalenius’s absolute privacy goal for statistical databases: any ...

412 | Security-control methods for statistical databases
- Adam, Wortmann
- 1989
Citation Context ...igh accuracy. In this paper, inferential privacy refers to the requirement of limiting the statistical inference of sensitive information about a target record from other records in the database. See [1] for a list of works in this field. One recent breakthrough in the study of privacy preservation is differential privacy [5][7]. In an “impossibility result”, the authors of [5][7] showed that it is i...

328 | t-closeness: Privacy beyond k-anonymity and l-diversity
- Li, Li, et al.
- 2007
Citation Context ...s. This problem was recently examined in the context of privacy preserving data publishing and some representative privacy models include ρ1-ρ2 privacy [9], ℓ-diversity principle [19], and t-closeness [16]. All of these works assume uniform sensitivity across all sensitive values. One exception is the personalized privacy in [23] where a record owner can specify his/her privacy threshold. Another excep...

303 | Incognito: Efficient full-domain K-anonymity
- LeFevre, DeWitt, et al.
- 2005
Citation Context ...ncy of the algorithms proposed in Section 5. For this purpose, we utilized the real data set CENSUS containing personal information of 500K American adults. This data set was previously used in [22], [15] and [19]. Table 5 shows the eight discrete attributes of the data. Two base tables were generated from CENSUS. The first table OCC has Occupation as SA and the 7 remaining attributes as the QI-attrib...

297 | Limiting privacy breaches in privacy preserving data mining
- Evfimievski, Gehrke, et al.
- 2003
Citation Context ...tatistical databases, see [1] for a list of works. This problem was recently examined in the context of privacy preserving data publishing and some representative privacy models include ρ1-ρ2 privacy [9], ℓ-diversity principle [19], and t-closeness [16]. All of these works assume uniform sensitivity across all sensitive values. One exception is the personalized privacy in [23] where a record owner can...

223 | Practical privacy: The suLQ framework
- Blum, Dwork, et al.
- 2005
Citation Context ...ed claim of the differential privacy mechanism is that it protects an individual’s information even if an attacker knows about all other individuals in the data. We quote the original discussion from [3] (p. 3): “If there is information about a row that can be learned from other rows, this information is not truly under the control of that row. Even if the row in question were to sequester itself awa...

218 | A learning theory approach to non-interactive database privacy
- Blum, Ligett, et al.
- 2008
Citation Context ...pant in the database, by producing noisy query answers such that the distribution of query answers changes very little when the database differs in any single record. The following definition is from [4]. Definition 1. A randomized function K gives ε-differential privacy if for all data sets T and T′ differing on at most one record, for all queries Q, and for all outputs x, Pr[K(T,Q) = x] ≤ exp(ε)·Pr...
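The ε-differential privacy definition quoted above is typically realized by the Laplace mechanism: add Laplace noise with scale b = 1/ε to the true answer of a sensitivity-1 query. A minimal sketch, assuming a simple count query; the data, predicate, and function name here are illustrative inventions, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def private_count(records, predicate, epsilon):
    # A count query has global sensitivity 1: adding or removing one
    # record changes the true answer by at most 1. Laplace noise with
    # scale b = 1/epsilon then satisfies epsilon-differential privacy
    # in the sense of Definition 1.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

# Hypothetical toy data: ages of database participants.
ages = [23, 35, 41, 29, 52, 47, 31]
noisy = private_count(ages, lambda a: a >= 40, epsilon=0.5)  # true answer is 3
```

Smaller ε means a larger noise scale and stronger privacy, at the cost of accuracy.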

217 | Kendall’s Advanced Theory of Statistics. Volume 2: Classical Inference and Relationships
- Stuart, Ord
- 1987
Citation Context ...be the answers returned by the ε-differential privacy mechanism. E[Y/X] = (y/x)(1 + 2b²/x²) and var[Y/X] = (2b²/x²)(1 + (y/x)²), where b = 1/ε. Proof. Using the Taylor expansion technique [8][20], the mean E[Y/X] and variance var[Y/X] of Y/X can be approximated as follows: E[Y/X] ≈ E[Y]/E[X] + cov[X,Y]/E[X]² + var[X]E[Y]/E[X]³ and var[Y/X] ≈ var[Y]/E[X]² − (2E[Y]/E[X]³)cov[X,Y] + E[Y...
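The closed forms in this context (mean and variance of a ratio Y/X of two Laplace-noised answers) can be sanity-checked by simulation. A sketch with illustrative values of x, y, and ε chosen by us (not from the paper), using the fact that Laplace(b) noise has variance 2b²:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative true answers and privacy budget.
x, y, eps = 100.0, 40.0, 1.0
b = 1.0 / eps                         # Laplace scale; var of Laplace(b) is 2*b**2

n = 1_000_000
X = x + rng.laplace(scale=b, size=n)  # noisy answers for the two queries
Y = y + rng.laplace(scale=b, size=n)
ratios = Y / X

# Predictions from the Taylor-expansion result quoted above.
mean_pred = (y / x) * (1 + 2 * b**2 / x**2)
var_pred = (2 * b**2 / x**2) * (1 + (y / x) ** 2)
```

With these values mean_pred ≈ 0.40008 and var_pred ≈ 2.32e-4, and the empirical mean and variance of `ratios` land close to them.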

192 | Anatomy: Simple and effective privacy preservation
- Xiao, Tao
Citation Context ...Census data (Section 7) having the minimum and maximum frequency of 0.18% and 7.5%, the maximum ℓ-diversity [19] that can be provided is 13-diversity because of the eligibility requirement 1/ℓ ≥ 7.5% [22]. Therefore, it is impossible to protect the infrequent items at the tail of the distribution or more sensitive items by a larger ℓ-diversity, say 50-diversity, which is more than 10 times the prior 0...
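The eligibility bound in this context is simple arithmetic: ℓ-diversity requires every sensitive value's frequency to be at most 1/ℓ, so the most frequent value caps the achievable ℓ. A sketch (the function name is ours):

```python
import math

def max_l_diversity(max_frequency):
    # Eligibility requirement: 1/l >= max_frequency, so the largest
    # feasible l is floor(1 / max_frequency).
    return math.floor(1.0 / max_frequency)

# Occupation attribute in CENSUS: most frequent value covers 7.5% of records.
print(max_l_diversity(0.075))  # -> 13
```

This is why a skewed sensitive-attribute distribution (0.18% to 7.5% here) makes a uniformly large ℓ, such as 50, infeasible.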

164 | Top-Down Specialization for Information and Privacy Preservation.
- Fung, Wang, et al.
- 2005
Citation Context ... sensitive information in a controlled manner and there are scenarios where it is possible to protect inferential privacy while retaining a reasonable level of data utility. For example, the study in [10] shows that the anonymized data is useful for training a classifier because the training does not depend on detailed personal information. Another scenario is when the utility metric is different from...

139 | Survival models and data analysis
- Elandt-Johnson, Johnson
- 1980
Citation Context ...nd Y be the answers returned by the ε-differential privacy mechanism. E[Y/X] = (y/x)(1 + 2b²/x²) and var[Y/X] = (2b²/x²)(1 + (y/x)²), where b = 1/ε. Proof. Using the Taylor expansion technique [8][20], the mean E[Y/X] and variance var[Y/X] of Y/X can be approximated as follows: E[Y/X] ≈ E[Y]/E[X] + cov[X,Y]/E[X]² + var[X]E[Y]/E[X]³ and var[Y/X] ≈ var[Y]/E[X]² − (2E[Y]/E[X]³)cov[X,Y] ...

132 | Personalized privacy preservation
- Xiao, Tao
- 2006

94 | Privacy, accuracy, and consistency too: A holistic solution to contingency table release
- Barak, et al.
- 2007
Citation Context ... and so far there is little satisfactory solution. There have been a great deal of works in differential privacy since the pioneer work [7][5]. This includes, among others, contingency table releases [2], estimating the degree distribution of social networks [11], histogram queries [12] and the number of permissible queries [24]. These works are concerned with applications of differential privacy in ...

78 | No free lunch in data privacy
- Kifer, Machanavajjhala
- 2011

64 | Attacks on privacy and deFinetti’s theorem
- Kifer
- 2009
Citation Context ...uch ℓ-diversity, enforcing ℓ-diversity with a large ℓ across all sensitive values leads to a large information loss. Finally, previous solutions are vulnerable to additional auxiliary information [21][13][17]. We address these issues in three steps. • (Section 3) To address the first two limitations in the above, we consider a sensitive attribute with domain values x1, · · · , xm such that each xi has...

46 | Accurate estimation of the degree distribution of private networks
- Hay, Li, et al.
- 2009
Citation Context ...ve been a great deal of works in differential privacy since the pioneer work [7][5]. This includes, among others, contingency table releases [2], estimating the degree distribution of social networks [11], histogram queries [12] and the number of permissible queries [24]. These works are concerned with applications of differential privacy in various scenarios. Unlike previous works, the authors of [14...

40 | Boosting the accuracy of differentially-private queries through consistency
- Hay, Rastogi, et al.
- 2009
Citation Context ...works in differential privacy since the pioneer work [7][5]. This includes, among others, contingency table releases [2], estimating the degree distribution of social networks [11], histogram queries [12] and the number of permissible queries [24]. These works are concerned with applications of differential privacy in various scenarios. Unlike previous works, the authors of [14] argue that hiding the ...

29 | On anti-corruption privacy preserving publication
- Tao, Xiao, et al.
- 2008
Citation Context ...ve such ℓ-diversity, enforcing ℓ-diversity with a large ℓ across all sensitive values leads to a large information loss. Finally, previous solutions are vulnerable to additional auxiliary information [21][13][17]. We address these issues in three steps. • (Section 3) To address the first two limitations in the above, we consider a sensitive attribute with domain values x1, · · · , xm such that each xi...

20 | An ad omnia approach to defining and achieving private data analysis. In PinKDD
- Dwork
- 2007
Citation Context ...evaluating the usefulness of information, may behave differently. Let us explain this point by the Terry Gross example that was originally used to capture the intuition of the impossibility result in [6]. In the Terry Gross example, the exact height is considered private, thus, useful to an adversary, whereas the auxiliary information of being two inches shorter than an unknown average is considered ...

13 | Modeling and Integrating Background Knowledge in Data Anonymization
- Li, Li, et al.
- 2009
Citation Context ...ℓ-diversity, enforcing ℓ-diversity with a large ℓ across all sensitive values leads to a large information loss. Finally, previous solutions are vulnerable to additional auxiliary information [21][13][17]. We address these issues in three steps. • (Section 3) To address the first two limitations in the above, we consider a sensitive attribute with domain values x1, · · · , xm such that each xi has a d...

13 | Output perturbation with query relaxation.
- Xiao, Tao
- 2008
Citation Context ...neer work [7][5]. This includes, among others, contingency table releases [2], estimating the degree distribution of social networks [11], histogram queries [12] and the number of permissible queries [24]. These works are concerned with applications of differential privacy in various scenarios. Unlike previous works, the authors of [14] argue that hiding the evidence of participation, instead of the p...

7 | On Optimal Anonymization for l+-Diversity
- Liu, Wang
- 2010

3 | l-diversity: privacy beyond k-anonymity
- Machanavajjhala, Kifer, et al.
- 2006
Citation Context ...ewed distribution and varied sensitivity. For example, with the Occupation attribute in the Census data (Section 7) having the minimum and maximum frequency of 0.18% and 7.5%, the maximum ℓ-diversity [19] that can be provided is 13-diversity because of the eligibility requirement 1/ℓ ≥ 7.5% [22]. Therefore, it is impossible to protect the infrequent items at the tail of the distribution or more sensit...