Results 1 - 10 of 161
Secure Multiparty Computation for Privacy-Preserving Data Mining
, 2008
"... In this paper, we survey the basic paradigms and notions of secure multiparty computation and discuss their relevance to the field of privacy-preserving data mining. In addition to reviewing definitions and constructions for secure multiparty computation, we discuss the issue of efficiency and demon ..."
Abstract - Cited by 92 (0 self)
In this paper, we survey the basic paradigms and notions of secure multiparty computation and discuss their relevance to the field of privacy-preserving data mining. In addition to reviewing definitions and constructions for secure multiparty computation, we discuss the issue of efficiency and demonstrate the difficulties involved in constructing highly efficient protocols. We also present common errors that are prevalent in the literature when secure multiparty computation techniques are applied to privacy-preserving data mining. Finally, we discuss the relationship between secure multiparty computation and privacy-preserving data mining, and show which problems it solves and which problems it does not.
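As background for the paradigms this survey reviews, the standard simulation-based security notion can be sketched as follows; the notation below is the usual textbook form, not quoted from the paper. A protocol \(\pi\) securely computes a two-party functionality \(f\) if for every efficient real-world adversary \(\mathcal{A}\) there exists an ideal-world simulator \(\mathcal{S}\) such that

\[
\{\mathrm{IDEAL}_{f,\mathcal{S}}(x,y,\kappa)\}_{x,y,\kappa} \;\stackrel{c}{\equiv}\; \{\mathrm{REAL}_{\pi,\mathcal{A}}(x,y,\kappa)\}_{x,y,\kappa},
\]

i.e., whatever an adversary can learn or affect in a real execution it could also learn or affect in an ideal execution where a trusted party computes \(f\) on the parties' inputs.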
Location privacy via private proximity testing
- In NDSS
, 2011
"... We study privacy-preserving tests for proximity: Alice can test if she is close to Bob without either party revealing any other information about their location. We describe several secure protocols that support private proximity testing at various levels of granularity. We study the use of “locatio ..."
Abstract - Cited by 53 (1 self)
We study privacy-preserving tests for proximity: Alice can test if she is close to Bob without either party revealing any other information about their location. We describe several secure protocols that support private proximity testing at various levels of granularity. We study the use of “location tags” generated from the physical environment in order to strengthen the security of proximity testing. We implemented our system on the Android platform and report on its effectiveness. Our system uses a social network (Facebook) to manage user public keys.
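For intuition about the granularity-based reduction, the sketch below quantizes locations into overlapping grid cells, so that proximity testing reduces to testing equality of cell identifiers. It is only the plaintext skeleton: the grid offsets, cell size, and function names are illustrative, and in the actual protocols the final comparison is done with a cryptographic private equality test rather than in the clear.

    import math

    def grid_cell(x, y, cell_size, offset=0.0):
        """Id of the grid cell containing point (x, y) for one shifted grid."""
        return (math.floor((x + offset) / cell_size),
                math.floor((y + offset) / cell_size))

    def cell_ids(x, y, cell_size):
        """Cell ids under three overlapping grids, shifted to reduce boundary misses."""
        offsets = (0.0, cell_size / 3.0, 2.0 * cell_size / 3.0)
        return [(i, grid_cell(x, y, cell_size, off)) for i, off in enumerate(offsets)]

    # Two users count as nearby if their cell ids coincide in at least one grid.
    # In the real protocols this comparison is a private equality test, so
    # neither user ever sees the other's cell id in the clear.
    def nearby_in_the_clear(loc_a, loc_b, cell_size=250.0):
        return any(ca == cb for ca, cb in zip(cell_ids(*loc_a, cell_size),
                                              cell_ids(*loc_b, cell_size)))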
Private Set Intersection: Are Garbled Circuits Better than Custom Protocols?
, 2012
"... Cryptographic protocols for Private Set Intersection (PSI) are the basis for many important privacy-preserving applications. Over the past few years, intensive research has been devoted to designing custom protocols for PSI based on homomorphic encryption and other public-key techniques, apparently ..."
Abstract - Cited by 49 (7 self)
Cryptographic protocols for Private Set Intersection (PSI) are the basis for many important privacy-preserving applications. Over the past few years, intensive research has been devoted to designing custom protocols for PSI based on homomorphic encryption and other public-key techniques, apparently due to the belief that solutions using generic approaches would be impractical. This paper explores the validity of that belief. We develop three classes of protocols targeted to different set sizes and domains, all based on Yao’s generic garbled-circuit method. We then compare the performance of our protocols to the fastest custom PSI protocols in the literature. Our results show that a careful application of garbled circuits leads to solutions that can run on million-element sets on typical desktops, and that can be competitive with the fastest custom protocols. Moreover, generic protocols like ours can be used directly for performing more complex secure computations, something we demonstrate by adding a simple information-auditing mechanism to our PSI protocols.
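One way to see why a generic circuit can compute PSI efficiently is the sort-compare structure used in this line of work: concatenate both sets, sort, and compare adjacent elements. The plaintext analogue below shows what such a circuit computes; in the protocol this logic is expressed as a Boolean sorting and comparison network evaluated under Yao's garbled circuits, with an extra shuffle so the positions of matches are hidden. This is an illustrative reading, not the paper's exact circuit.

    def psi_sort_compare(set_a, set_b):
        """Plaintext analogue of sort-compare intersection (elements within each
        input set are assumed distinct): sort the concatenation of both inputs
        and report every value appearing in two adjacent positions."""
        merged = sorted(list(set_a) + list(set_b))
        return {merged[i] for i in range(len(merged) - 1) if merged[i] == merged[i + 1]}

    print(psi_sort_compare({3, 7, 19}, {7, 19, 42}))   # -> {19, 7}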
SEPIA: Privacy-Preserving Aggregation of Multi-Domain Network Events and Statistics
- USENIX SECURITY SYMPOSIUM
, 2010
"... Secure multiparty computation (MPC) allows joint privacy-preserving computations on data of multiple parties. Although MPC has been studied substantially, building solutions that are practical in terms of computation and communication cost is still a major challenge. In this paper, we investigate th ..."
Abstract - Cited by 48 (2 self)
Secure multiparty computation (MPC) allows joint privacy-preserving computations on data of multiple parties. Although MPC has been studied substantially, building solutions that are practical in terms of computation and communication cost is still a major challenge. In this paper, we investigate the practical usefulness of MPC for multi-domain network security and monitoring. We first optimize MPC comparison operations for processing high volume data in near real-time. We then design privacy-preserving protocols for event correlation and aggregation of network traffic statistics, such as addition of volume metrics, computation of feature entropy, and distinct item count. Optimizing performance of parallel invocations, we implement our protocols along with a complete set of basic operations in a library called SEPIA. We evaluate the running time and bandwidth requirements of our protocols in realistic settings on a local cluster as well as on PlanetLab and show that they work in near real-time for up to 140 input providers and 9 computation nodes. Compared to implementations using existing general-purpose MPC frameworks, our protocols are significantly faster, requiring, for example, 3 minutes for a task that takes 2 days with general-purpose frameworks. This improvement paves the way for new applications of MPC in the area of networking. Finally, we run SEPIA’s protocols on real traffic traces of 17 networks and show how they provide new possibilities for distributed troubleshooting and early anomaly detection.
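The aggregation tasks mentioned in the abstract (e.g., adding volume metrics across providers) map naturally onto secret sharing, because shares of a sum are sums of shares. The sketch below uses simple additive sharing over a prime field to illustrate that homomorphism; it is a toy model of how input providers and computation nodes could interact, not SEPIA's actual protocol or API, and the field size and names are illustrative.

    import secrets

    PRIME = 2**61 - 1  # illustrative field size

    def share(value, num_nodes):
        """Split a value into additive shares over GF(PRIME), one per computation node."""
        shares = [secrets.randbelow(PRIME) for _ in range(num_nodes - 1)]
        shares.append((value - sum(shares)) % PRIME)
        return shares

    def reconstruct(shares):
        return sum(shares) % PRIME

    # Each input provider shares its local counter; every computation node adds
    # the shares it received; the per-node sums reconstruct the global total
    # without any node seeing an individual provider's value.
    providers = [17, 4, 42]          # local volume counters (illustrative)
    num_nodes = 3
    node_sums = [0] * num_nodes
    for counter in providers:
        for node, s in enumerate(share(counter, num_nodes)):
            node_sums[node] = (node_sums[node] + s) % PRIME
    assert reconstruct(node_sums) == sum(providers)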
Efficient robust private set intersection
- IN: ACNS
, 2009
"... Computing Set Intersection privately and efficiently between two mutually mistrusting parties is an important basic procedure in the area of private data mining. Assuring robustness, namely, coping with potentially arbitrarily misbehaving (i.e., malicious) parties, while retaining protocol efficien ..."
Abstract - Cited by 46 (1 self)
Computing Set Intersection privately and efficiently between two mutually mistrusting parties is an important basic procedure in the area of private data mining. Assuring robustness, namely, coping with potentially arbitrarily misbehaving (i.e., malicious) parties, while retaining protocol efficiency (rather than employing costly generic techniques) is an open problem. In this work the first solution to this problem is presented.
Practical Private Set Intersection Protocols with Linear Computational and Bandwidth Complexity
, 2010
"... Increasing dependence on anytime-anywhere availability of data and the commensurately increasing fear of losing privacy motivate the need for privacy-preserving techniques. One interesting and common problem occurs when two parties need to privately compute an intersection of their respective sets o ..."
Abstract - Cited by 38 (11 self)
Increasing dependence on anytime-anywhere availability of data and the commensurately increasing fear of losing privacy motivate the need for privacy-preserving techniques. One interesting and common problem occurs when two parties need to privately compute an intersection of their respective sets of data. In doing so, one or both parties must obtain the intersection (if one exists), while neither should learn anything about the other's set. Although prior work has yielded a number of effective and elegant Private Set Intersection (PSI) techniques, the quest for efficiency is still underway. This paper explores some PSI variations and constructs several secure protocols that are appreciably more efficient than the state-of-the-art.
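To see how PSI can have cost linear in the set sizes, a classic Diffie-Hellman-style construction is sketched below: each element is hashed, blinded under both parties' secret exponents, and matched. This is a well-known textbook construction shown only for intuition; it is not this paper's protocol (which builds on different primitives), both parties are simulated in one function for brevity, and the modulus is a toy choice rather than a real group.

    import hashlib
    import secrets

    P = 2**127 - 1  # illustrative Mersenne prime; a real protocol would use a
                    # prime-order elliptic-curve group and a proper hash-to-group map

    def h(element):
        return int.from_bytes(hashlib.sha256(element.encode()).digest(), "big") % P

    def dh_psi(alice_set, bob_set):
        alice, bob = list(alice_set), list(bob_set)
        a = secrets.randbelow(P - 2) + 2          # Alice's secret exponent
        b = secrets.randbelow(P - 2) + 2          # Bob's secret exponent
        alice_msg = [pow(h(x), a, P) for x in alice]   # Alice -> Bob: H(x)^a
        bob_msg = [pow(h(y), b, P) for y in bob]       # Bob -> Alice: H(y)^b
        alice_view = {pow(m, a, P) for m in bob_msg}   # Alice computes H(y)^(ab)
        bob_reply = [pow(m, b, P) for m in alice_msg]  # Bob -> Alice: H(x)^(ab)
        # Alice learns exactly which of her elements also appear in Bob's set.
        return [x for x, m in zip(alice, bob_reply) if m in alice_view]

    print(dh_psi({"ann", "bob", "eve"}, {"bob", "eve", "zoe"}))

Each party performs a number of exponentiations linear in the set sizes and sends one group element per element, which is where the linear computation and bandwidth claims for this style of protocol come from.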
Computational Differential Privacy
"... The definition of differential privacy has recently emerged as a leading standard of privacy guarantees for algorithms on statistical databases. We offer several relaxations of the definition which require privacy guarantees to hold only against efficient—i.e., computationallybounded—adversaries. W ..."
Abstract - Cited by 32 (0 self)
The definition of differential privacy has recently emerged as a leading standard of privacy guarantees for algorithms on statistical databases. We offer several relaxations of the definition which require privacy guarantees to hold only against efficient (i.e., computationally bounded) adversaries. We establish various relationships among these notions, and in doing so, we observe their close connection with the theory of pseudodense sets by Reingold et al. [1]. We extend the dense model theorem of Reingold et al. to demonstrate equivalence between two definitions (indistinguishability- and simulatability-based) of computational differential privacy. Our computational analogues of differential privacy seem to allow for more accurate constructions than the standard information-theoretic analogues. In particular, in the context of private approximation of the distance between two vectors, we present a differentially-private protocol for computing the approximation, and contrast it with a substantially more accurate protocol that is only computationally differentially private.
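For orientation, the indistinguishability-based relaxation mentioned in the abstract can be stated roughly as follows (paraphrased in standard notation, not quoted from the paper): a family of mechanisms \(\{M_\kappa\}\) satisfies \(\varepsilon\)-IND-CDP if for every probabilistic polynomial-time adversary \(\mathcal{A}\), every security parameter \(\kappa\), and all adjacent databases \(D, D'\),

\[
\Pr[\mathcal{A}(M_\kappa(D)) = 1] \;\le\; e^{\varepsilon}\cdot\Pr[\mathcal{A}(M_\kappa(D')) = 1] \;+\; \mathrm{negl}(\kappa).
\]

Dropping the negligible term and quantifying over unbounded adversaries recovers, roughly, the standard information-theoretic definition of \(\varepsilon\)-differential privacy.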
Access Control to Information in Pervasive Computing Environments.
- In Proceedings of 9th Workshop on Hot Topics in Operating Systems (HotOS IX),
, 2003
"... Abstract Pervasive computing envisions a world in which our environment is full of embedded devices that gather and share vast amounts of information about people, such as their location, activity, or even their feelings. Some of this information is confidential and should not be released to just a ..."
Abstract - Cited by 32 (3 self)
Pervasive computing envisions a world in which our environment is full of embedded devices that gather and share vast amounts of information about people, such as their location, activity, or even their feelings. Some of this information is confidential and should not be released to just anyone. In this thesis, I show how existing solutions for controlling access to information are not sufficient for pervasive computing because of four challenges: First, there will be many information services, potentially offering the same information, run by different organizations, even in a single social environment. Second, there will be complex types of information, such as a person's calendar entry, which reveal other kinds of information, such as the person's current location. Third, there will be services that derive specific information, such as a person's activity, from raw information, such as a video stream, and that become attractive targets for intruders. Fourth, an individual's ability to access information could be constrained based on confidential information about the individual's context. This thesis presents a distributed access-control architecture for pervasive computing that supports complex and derived information and confidential context-sensitive constraints. In particular, the thesis makes the following contributions: First, I introduce a distributed access-control architecture, in which a client proves to a service that the client is authorized to access requested information. Second, I show how to incorporate the semantics of complex information as a first-class citizen into this architecture, based on information relationships. Third, I propose derivation-constrained access control, which reduces the influence of intruders by making a service prove that the service is accessing information on behalf of an authorized client. Fourth, I study the kinds of information leaks that context-sensitive constraints can cause. I introduce access-rights graphs and hidden constraints for avoiding these leaks. Fifth, I show how pervasive computing makes it difficult for a client to prove that the client is authorized to access complex confidential information. I propose a cryptographic solution based on an extension of hierarchical identity-based encryption. Sixth, as an alternative approach, I introduce an encryption-based access-control architecture for pervasive computing, in which a service gives information to any client, but only in an encrypted form. I present a formal model for my contributions based on Lampson et al.'s theory of authentication. All of my contributions have been implemented in an actual pervasive computing environment. A performance analysis of my implementation demonstrates the feasibility of my design.
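The derivation-constrained access control contribution can be pictured with a small toy model: a service that derives information (say, activity from a camera feed) is given the raw data only if it can show it is acting on behalf of a client who is authorized for the derived information. The model below is purely illustrative, with hypothetical names and no cryptographic proofs or certificates; it is not the thesis's formal architecture, which is built on Lampson et al.'s authentication logic.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Request:
        service: str        # service asking for the raw information
        raw_item: str       # e.g. "camera_feed/room_123"
        on_behalf_of: str   # client the service claims to act for
        derived_item: str   # e.g. "activity/alice"

    def authorized(policy, principal, item):
        return item in policy.get(principal, set())

    def grant_raw_access(policy, derivations, req):
        """Grant raw-data access only if (1) the derived item really is derived
        from this raw item and (2) the client is authorized for the derived item.
        A real system would also require a verifiable delegation proof."""
        return (req.raw_item in derivations.get(req.derived_item, set())
                and authorized(policy, req.on_behalf_of, req.derived_item))

    policy = {"bob": {"activity/alice"}}
    derivations = {"activity/alice": {"camera_feed/room_123"}}
    req = Request("activity_service", "camera_feed/room_123", "bob", "activity/alice")
    assert grant_raw_access(policy, derivations, req)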
Secure and efficient protocols for iris and fingerprint identification
- ESORICS. LECTURE NOTES IN COMPUTER SCIENCE
, 2011
"... Recent advances in biometric recognition and the increasing use of biometric data prompt significant privacy challenges associated with the possible misuse, loss or theft, of biometric data. Biometric matching isoftenperformedbytwomutuallysuspiciousparties, one ofwhichholdsone biometric image while ..."
Abstract - Cited by 30 (10 self)
Recent advances in biometric recognition and the increasing use of biometric data prompt significant privacy challenges associated with the possible misuse, loss, or theft of biometric data. Biometric matching is often performed by two mutually suspicious parties, one of which holds one biometric image while the other owns a possibly large biometric collection. Due to privacy and liability considerations, neither party is willing to share its data. This gives rise to the need to develop secure computation techniques over biometric data where no information is revealed to the parties except the outcome of the comparison or search. To address the problem, in this work we develop and implement the first privacy-preserving identification protocol for iris codes. We also design and implement a secure protocol for fingerprint identification based on FingerCodes with a substantial improvement in performance compared to existing solutions. We show that new techniques and optimizations employed in this work allow us to achieve particularly efficient protocols suitable for large data sets and obtain a notable performance gain compared to the state-of-the-art prior work.
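In the clear, iris-code matching is typically a masked, normalized Hamming distance followed by a threshold test, and protocols in this line of work evaluate essentially that computation without revealing the codes. The sketch below shows only the plaintext computation on 0/1 bit lists; the threshold value is illustrative and the cryptographic protocol itself is not shown.

    def iris_match(code_a, mask_a, code_b, mask_b, threshold=0.32):
        """Return True if two iris codes match under their occlusion masks."""
        valid = [ma & mb for ma, mb in zip(mask_a, mask_b)]            # bits usable in both codes
        diff = [(a ^ b) & v for a, b, v in zip(code_a, code_b, valid)]  # masked disagreements
        n_valid = sum(valid)
        if n_valid == 0:
            return False
        return sum(diff) / n_valid <= threshold   # normalized Hamming distance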