Results 1 - 10 of 80
De-anonymizing social networks
2009
"... Operators of online social networks are increasingly sharing potentially sensitive information about users and their relationships with advertisers, application developers, and data-mining researchers. Privacy is typically protected by anonymization, i.e., removing names, addresses, etc. We present ..."
Abstract
-
Cited by 216 (6 self)
- Add to MetaCart
(Show Context)
Operators of online social networks are increasingly sharing potentially sensitive information about users and their relationships with advertisers, application developers, and data-mining researchers. Privacy is typically protected by anonymization, i.e., removing names, addresses, etc. We present a framework for analyzing privacy and anonymity in social networks and develop a new re-identification algorithm targeting anonymized social-network graphs. To demonstrate its effectiveness on real-world networks, we show that a third of the users who can be verified to have accounts on both Twitter, a popular microblogging service, and Flickr, an online photo-sharing site, can be re-identified in the anonymous Twitter graph with only a 12% error rate. Our de-anonymization algorithm is based purely on the network topology, does not require creation of a large number of dummy “sybil” nodes, is robust to noise and all existing defenses, and works even when the overlap between the target network and the adversary’s auxiliary information is small.
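The re-identification algorithm described above is topology-based and seeded with a small set of known correspondences. As an illustration of the general idea only (not the authors' actual algorithm), the following minimal Python sketch greedily extends a seed mapping between an auxiliary graph and an anonymized target graph by matching nodes whose already-mapped neighbours overlap most; the graphs, scoring rule, and threshold are illustrative assumptions.

# Minimal, illustrative sketch of seed-based re-identification by network topology alone.
# This simplification is not the paper's algorithm: real propagation also uses degree
# normalization, eccentricity checks, and revisits earlier decisions.
def propagate(aux_adj, target_adj, seed_mapping, min_score=1):
    """aux_adj / target_adj: dict mapping each node to the set of its neighbours.
    seed_mapping: aux node -> target node pairs assumed known to the adversary."""
    mapping = dict(seed_mapping)
    changed = True
    while changed:
        changed = False
        for a in aux_adj:
            if a in mapping:
                continue
            scores = {}
            for neighbour in aux_adj[a]:
                t = mapping.get(neighbour)
                if t is None:
                    continue
                # Credit every unmapped target node adjacent to the image of this neighbour.
                for cand, cand_neighbours in target_adj.items():
                    if t in cand_neighbours and cand not in mapping.values():
                        scores[cand] = scores.get(cand, 0) + 1
            if scores:
                best = max(scores, key=scores.get)
                if scores[best] >= min_score:
                    mapping[a] = best
                    changed = True
    return mapping

# Toy example: two isomorphic triangles and a single seed pair.
aux = {"alice": {"bob", "carol"}, "bob": {"alice", "carol"}, "carol": {"alice", "bob"}}
anon = {1: {2, 3}, 2: {1, 3}, 3: {1, 2}}
print(propagate(aux, anon, {"alice": 1}))   # e.g. {'alice': 1, 'bob': 2, 'carol': 3}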
Persona: An Online Social Network with User-Defined Privacy
"... Online social networks (OSNs) are immensely popular, with some claiming over 200 million users [10]. Users share private content, such as personal information or photographs, using OSN applications. Users must trust the OSN service to protect personal information even as the OSN provider benefits fr ..."
Abstract
-
Cited by 145 (4 self)
- Add to MetaCart
(Show Context)
Online social networks (OSNs) are immensely popular, with some claiming over 200 million users [10]. Users share private content, such as personal information or photographs, using OSN applications. Users must trust the OSN service to protect personal information even as the OSN provider benefits from examining and sharing that information. We present Persona, an OSN where users dictate who may access their information. Persona hides user data with attribute-based encryption (ABE), allowing users to apply fine-grained policies over who may view their data. Persona provides an effective means of creating applications in which users, not the OSN, define policy over access to private data. We demonstrate new cryptographic mechanisms that enhance the general applicability of ABE. We show how Persona provides the functionality of existing online social networks with additional privacy benefits. We describe an implementation of Persona that replicates Facebook applications and show that Persona provides acceptable performance when browsing privacy-enhanced web pages, even on mobile devices.
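Persona's core mechanism is attribute-based encryption, which requires pairing-based cryptography. The toy Python sketch below only illustrates the access model: data encrypted under a symmetric key that is released only to readers whose attributes satisfy the publisher's policy. The ToyABEStore class, the set-based policy, and the use of Fernet are illustrative assumptions; in real ABE the policy is enforced cryptographically rather than by a trusted key holder.

# Toy illustration of attribute-gated access in the spirit of Persona's model.
# NOT attribute-based encryption: here a trusted key holder checks the policy,
# whereas ABE makes decryption itself impossible without satisfying attributes.
from cryptography.fernet import Fernet   # pip install cryptography

class ToyABEStore:
    def __init__(self):
        self._keys = {}                          # data_id -> (required attributes, data key)

    def publish(self, data_id, plaintext, required_attributes):
        key = Fernet.generate_key()
        self._keys[data_id] = (set(required_attributes), key)
        return Fernet(key).encrypt(plaintext.encode())    # ciphertext can live on the OSN

    def request_key(self, data_id, reader_attributes):
        required, key = self._keys[data_id]
        if required <= set(reader_attributes):            # reader satisfies the policy
            return key
        raise PermissionError("attributes do not satisfy the policy")

store = ToyABEStore()
ciphertext = store.publish("photo-1", "hiking photo metadata", {"friend"})
key = store.request_key("photo-1", {"friend", "coworker"})
print(Fernet(key).decrypt(ciphertext).decode())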
Privacy wizards for social networking sites
- In WWW ’10: Proceedings of the 19th International World Wide Web Conference, 2010
"... Privacy is an enormous problem in online social networking sites. While sites such as Facebook allow users fine-grained control over who can see their profiles, it is difficult for average users to specify this kind of detailed policy. In this paper, we propose a template for the design of a social ..."
Abstract
-
Cited by 95 (2 self)
- Add to MetaCart
(Show Context)
Privacy is an enormous problem in online social networking sites. While sites such as Facebook allow users fine-grained control over who can see their profiles, it is difficult for average users to specify this kind of detailed policy. In this paper, we propose a template for the design of a social networking privacy wizard. The intuition for the design comes from the observation that real users conceive their privacy preferences (which friends should be able to see which information) based on an implicit set of rules. Thus, with a limited amount of user input, it is usually possible to build a machine learning model that concisely describes a particular user’s preferences, and then use this model to configure the user’s privacy settings automatically. As an instance of this general framework, we have built a wizard based on an active learning paradigm called uncertainty sampling. The wizard iteratively asks the user to assign privacy “labels” to selected (“informative”) friends, and it uses this input to construct a classifier, which can in turn be used to automatically assign privileges to the rest of the user’s (unlabeled) friends. To evaluate our approach, we collected detailed privacy preference data from 45 real Facebook users. Our study revealed two important things. First, real users tend to conceive their privacy preferences in terms of communities, which can easily be extracted from a social network graph using existing techniques. Second, our active learning wizard, using communities as features, is able to recommend high-accuracy privacy settings using less user input than existing policy-specification tools.
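The wizard's loop (label a few friends, fit a classifier on community features, ask about the friend the model is least certain about, repeat) can be sketched in a few lines. The synthetic friend-by-community matrix, the logistic-regression model, and the question budget below are illustrative assumptions rather than the paper's exact setup.

# Illustrative sketch of an uncertainty-sampling privacy wizard.
# Assumes binary community-membership features per friend; the data here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression   # pip install scikit-learn

rng = np.random.default_rng(0)
n_friends, n_communities = 200, 6
X = rng.integers(0, 2, size=(n_friends, n_communities))       # friend x community matrix
true_allow = (X[:, 0] | X[:, 2]).astype(int)                   # hidden "true" preference rule

# Start with one labelled friend from each class, then ask about the least certain friend.
labeled = [int(np.argmax(true_allow == 0)), int(np.argmax(true_allow == 1))]
for _ in range(10):                                            # question budget
    clf = LogisticRegression().fit(X[labeled], true_allow[labeled])
    confidence = np.abs(clf.predict_proba(X)[:, 1] - 0.5)
    confidence[labeled] = np.inf                               # never re-ask a labelled friend
    labeled.append(int(np.argmin(confidence)))                 # most uncertain friend is asked next

clf = LogisticRegression().fit(X[labeled], true_allow[labeled])
proposed = clf.predict(X)                                      # suggested allow/deny per friend
print("agreement with the hidden preferences:", float((proposed == true_allow).mean()))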
FaceCloak: An architecture for user privacy on social networking sites
- In Proceedings of 2009 IEEE International Conference on Privacy, Security, Risk and Trust (PASSAT-09), 2009
"... Abstract—Social networking sites, such as MySpace, Facebook andFlickr,aregainingmore andmorepopularityamongInternet users. As users are enjoying this new style of networking, privacy concerns are also attracting increasing public attention due to reports about privacy breaches on social networking s ..."
Abstract
-
Cited by 54 (1 self)
- Add to MetaCart
(Show Context)
Social networking sites, such as MySpace, Facebook and Flickr, are gaining more and more popularity among Internet users. As users are enjoying this new style of networking, privacy concerns are also attracting increasing public attention due to reports about privacy breaches on social networking sites. We propose FaceCloak, an architecture that protects user privacy on a social networking site by shielding a user’s personal information from the site and from other users that were not explicitly authorized by the user. At the same time, FaceCloak seamlessly maintains usability of the site’s services. FaceCloak achieves these goals by providing fake information to the social networking site and by storing sensitive information in encrypted form on a separate server. We implemented our solution as a Firefox browser extension for the Facebook platform. Our experiments show that our solution successfully conceals a user’s personal information, while allowing the user and her friends to explore Facebook pages and services as usual.
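The substitution idea (fake data on the site, real data encrypted elsewhere) can be sketched as below. The in-memory dictionaries standing in for Facebook and the third-party server, and the use of Fernet plus an HMAC-derived index, are illustrative assumptions rather than FaceCloak's actual key and index derivation.

# Sketch of FaceCloak-style substitution: the social site stores a plausible fake value,
# while the real value is encrypted and stored on a separate server under a keyed-hash index.
import hashlib
import hmac
from cryptography.fernet import Fernet   # pip install cryptography

enc_key = Fernet.generate_key()          # in FaceCloak, keys are shared out of band with friends
idx_key = b"demo-index-key"              # derives lookup indices on the separate server
fernet = Fernet(enc_key)

social_site = {}                         # what the social networking site sees
third_party = {}                         # encrypted real values on the separate server

def publish(field, real_value, fake_value):
    social_site[field] = fake_value                                      # fake data to the OSN
    index = hmac.new(idx_key, field.encode(), hashlib.sha256).hexdigest()
    third_party[index] = fernet.encrypt(real_value.encode())             # real data, encrypted

def read(field):
    index = hmac.new(idx_key, field.encode(), hashlib.sha256).hexdigest()
    token = third_party.get(index)
    return fernet.decrypt(token).decode() if token else social_site[field]

publish("birthday", "1990-05-17", "1985-01-01")
print("site sees:", social_site["birthday"], "| friend with keys sees:", read("birthday"))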
EASiER: Encryption-based access control in social networks with efficient revocation
- In Proc. ASIACCS, 2011
"... A promisingapproachto mitigatetheprivacyrisks in Online Social Networks (OSNs) is to shift access control enforcement from the OSN provider to the user by means of encryption. However, this createsthechallengeof key management to support complex policies involved in OSNs and dynamic groups. To addre ..."
Abstract
-
Cited by 42 (4 self)
- Add to MetaCart
(Show Context)
A promising approach to mitigate the privacy risks in Online Social Networks (OSNs) is to shift access control enforcement from the OSN provider to the user by means of encryption. However, this creates the challenge of key management to support complex policies involved in OSNs and dynamic groups. To address this, we propose EASiER, an architecture that supports fine-grained access control policies and dynamic group membership by using attribute-based encryption. A key and novel feature of our architecture, however, is that it is possible to remove access from a user without issuing new keys to other users or re-encrypting existing ciphertexts. We achieve this by creating a proxy that participates in the decryption process and enforces revocation constraints. The proxy is minimally trusted and cannot decrypt ciphertexts or provide access to previously revoked users. We describe the EASiER architecture and construction, provide a performance evaluation, and prototype an application of our approach on Facebook.
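The trust structure (a minimally trusted proxy whose participation is required for every decryption, so revoking a user needs no re-keying of everyone else) can be illustrated with a simple key split. The XOR sharing below is only a conceptual stand-in for the paper's attribute-based construction and proxy re-keying.

# Conceptual stand-in for proxy-assisted decryption with revocation.
# In EASiER the proxy performs a partial ABE decryption; here the data key is merely
# XOR-split so that neither the user share nor the proxy share alone reveals it.
import os

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

data_key = os.urandom(16)
proxy_share = os.urandom(16)
user_share = xor(data_key, proxy_share)       # handed to each authorized user

revoked = set()

def proxy_assist(user_id):
    if user_id in revoked:
        raise PermissionError("user has been revoked")
    return proxy_share                         # a real proxy would return a transformed ciphertext instead

def recover_key(user_id, share):
    return xor(share, proxy_assist(user_id))

assert recover_key("alice", user_share) == data_key
revoked.add("alice")                           # revocation: no new keys for others, no re-encryption
try:
    recover_key("alice", user_share)
except PermissionError as err:
    print("decryption refused:", err)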
Measuring Privacy Risk in Online Social Networks
"... Measuring privacy risk in online social networks is a challenging task. One of the fundamental difficulties is quantifying the amount of information revealed unintentionally. We present PrivAware, a tool to detect and report unintended information loss in online social networks. Our goal is to provi ..."
Abstract
-
Cited by 35 (0 self)
- Add to MetaCart
(Show Context)
Measuring privacy risk in online social networks is a challenging task. One of the fundamental difficulties is quantifying the amount of information revealed unintentionally. We present PrivAware, a tool to detect and report unintended information loss in online social networks. Our goal is to provide a rudimentary framework to identify privacy risk and provide solutions to reduce information loss. The first instance of the software is focused on information loss attributed to social circles. In subsequent releases we intend to incorporate additional capabilities to capture ancillary threat models. From our initial results, we quantify the privacy risk attributed to friend relationships in Facebook. We show that for each user in our study a majority of their personal attributes can be derived from social contacts. Moreover, we present results denoting the number of friends contributing to a correctly inferred attribute. We also provide similar results for different demographics of users. The intent of PrivAware is to not only report information loss but to recommend user actions to mitigate privacy risk. The actions provide users with the steps necessary to improve their overall privacy measurement. One obvious, but not ideal, solution is to remove risky friends. Another approach is to group risky friends and apply access controls to the group to limit visibility. In summary, our goal is to provide a unique tool to quantify information loss and provide features to reduce privacy risk.
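The kind of inference PrivAware measures (deriving a user's hidden attribute from what their friends expose, and counting how many friends contributed) can be sketched with a simple majority vote. The toy profiles and the plain vote below are illustrative assumptions, not PrivAware's actual inference model.

# Sketch of friend-based attribute inference and its "contributing friends" count.
from collections import Counter

friends_attrs = {                      # attribute values visible on each friend's profile
    "f1": {"hometown": "Springfield", "employer": "ACME"},
    "f2": {"hometown": "Springfield"},
    "f3": {"hometown": "Shelbyville", "employer": "ACME"},
    "f4": {"employer": "ACME"},
}

def infer(attribute):
    """Return the majority value among friends and the number of friends contributing."""
    values = [attrs[attribute] for attrs in friends_attrs.values() if attribute in attrs]
    if not values:
        return None, 0
    value, count = Counter(values).most_common(1)[0]
    return value, count

for attr in ("hometown", "employer"):
    print(attr, "->", infer(attr))      # e.g. hometown -> ('Springfield', 2)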
Vis-à-vis: Privacy-preserving online social networking via virtual individual servers
- In COMSNETS, 2011
"... Abstract—Online social networks (OSNs) are immensely popular, but their centralized control of user data raises important privacy concerns. This paper presents Vis-à-Vis, a decentralized framework for OSNs based on the privacy-preserving notion of a Virtual Individual Server (VIS). A VIS is a person ..."
Abstract
-
Cited by 33 (1 self)
- Add to MetaCart
(Show Context)
Online social networks (OSNs) are immensely popular, but their centralized control of user data raises important privacy concerns. This paper presents Vis-à-Vis, a decentralized framework for OSNs based on the privacy-preserving notion of a Virtual Individual Server (VIS). A VIS is a personal virtual machine running in a paid compute utility. In Vis-à-Vis, a person stores her data on her own VIS, which arbitrates access to that data by others. VISs self-organize into overlay networks corresponding to social groups. This paper focuses on preserving the privacy of location information. Vis-à-Vis uses distributed location trees to provide efficient and scalable operations for sharing location information within social groups. We have evaluated our Vis-à-Vis prototype using hundreds of virtual machines running in the Amazon EC2 compute utility. Our results demonstrate that Vis-à-Vis represents an attractive complement to today’s centralized OSNs.
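The location-tree idea (members of a group register under a hierarchy of regions so that queries touch only populated subtrees) can be sketched in memory. In Vis-à-Vis the subtrees are distributed across the members' own virtual individual servers, so the single-process class and region names below are illustrative simplifications.

# Minimal in-memory sketch of a location tree for one social group: members register the
# region path where they currently are, and queries descend only into populated subtrees.
class LocationTree:
    def __init__(self):
        self.members = set()     # members located somewhere under this node
        self.children = {}       # region name -> subtree

    def register(self, member, path):
        self.members.add(member)
        if path:
            child = self.children.setdefault(path[0], LocationTree())
            child.register(member, path[1:])

    def query(self, path):
        """Return the group members currently inside the given region."""
        node = self
        for region in path:
            node = node.children.get(region)
            if node is None:
                return set()
        return set(node.members)

tree = LocationTree()
tree.register("alice", ["US", "NC", "Durham"])
tree.register("bob", ["US", "CA", "San Francisco"])
print(tree.query(["US"]))              # {'alice', 'bob'}
print(tree.query(["US", "NC"]))        # {'alice'}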
The privacy jungle: On the market for data protection in social networks
- In The Eighth Workshop on the Economics of Information Security (WEIS 2009), 2009
"... We have conducted the first thorough analysis of the market for privacy practices and policies in online social networks. From an evaluation of 45 social networking sites using 260 criteria we find that many popular assumptions regarding privacy and social networking need to be revisited when consid ..."
Abstract
-
Cited by 30 (2 self)
- Add to MetaCart
(Show Context)
We have conducted the first thorough analysis of the market for privacy practices and policies in online social networks. From an evaluation of 45 social networking sites using 260 criteria, we find that many popular assumptions regarding privacy and social networking need to be revisited when considering the entire ecosystem instead of only a handful of well-known sites. Contrary to the common perception of an oligopolistic market, we find evidence of vigorous competition for new users. Despite observing many poor security practices, there is evidence that social network providers are making efforts to implement privacy-enhancing technologies, with substantial diversity in the amount of privacy control offered. However, privacy is rarely used as a selling point, and even then only as an auxiliary, non-decisive feature. Sites also failed to promote their existing privacy controls within the site. We similarly found great diversity in the length and content of formal privacy policies, but found an opposite promotional trend: though almost all policies are not accessible to ordinary users due to obfuscating legal jargon, they conspicuously vaunt the sites’ privacy practices. We conclude that the market for privacy in social networks is dysfunctional in that there is significant variation in sites’ privacy controls, data collection requirements, and legal privacy policies, but this is not …
Privacy-Enabling Social Networking Over Untrusted Networks
- In the Second ACM SIGCOMM Workshop on Social Network Systems (WOSN ’09), 2009
"... Current social networks require users to place absolute faith in their operators, and the inability of operators to protect users from malicious agents has led to sensitive private information being made public. We propose an architecture for social networking that protects users ’ social informatio ..."
Abstract
-
Cited by 23 (3 self)
- Add to MetaCart
(Show Context)
Current social networks require users to place absolute faith in their operators, and the inability of operators to protect users from malicious agents has led to sensitive private information being made public. We propose an architecture for social networking that protects users’ social information from both the operator and other network users. This architecture builds a social network out of smart clients and an untrusted central server in a way that removes the need for faith in network operators and gives users control of their privacy.
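The division of labour described above (smart clients, untrusted central server) can be sketched as clients that encrypt and sign content before uploading, with the server reduced to storing opaque blobs. The Fernet group key and Ed25519 signatures below are illustrative stand-ins, not the paper's concrete protocol.

# Sketch of a smart-client / untrusted-server split: the server stores only opaque,
# authenticated blobs and never sees plaintext social information.
from cryptography.fernet import Fernet                     # pip install cryptography
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

group_key = Fernet.generate_key()                          # shared among the poster's friends out of band
signing_key = Ed25519PrivateKey.generate()                 # poster's identity key
verify_key = signing_key.public_key()                      # known to the poster's friends

server_store = []                                          # the untrusted operator sees only this list

def client_post(message):
    blob = Fernet(group_key).encrypt(message.encode())
    server_store.append((blob, signing_key.sign(blob)))    # upload ciphertext plus signature

def client_read():
    posts = []
    for blob, signature in server_store:
        verify_key.verify(signature, blob)                 # raises InvalidSignature if tampered with
        posts.append(Fernet(group_key).decrypt(blob).decode())
    return posts

client_post("status: at the beach")
print(client_read())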
Unfriendly: Multi-party privacy risks in social networks
- In Proceedings of the 10th International Conference on Privacy, 2010
"... Abstract. As the popularity of social networks expands, the information users expose to the public has potentially dangerous implications for individual privacy. While social networks allow users to restrict access to their personal data, there is currently no mechanism to enforce privacy concerns o ..."
Abstract
-
Cited by 22 (0 self)
- Add to MetaCart
(Show Context)
As the popularity of social networks expands, the information users expose to the public has potentially dangerous implications for individual privacy. While social networks allow users to restrict access to their personal data, there is currently no mechanism to enforce privacy concerns over content uploaded by other users. As group photos and stories are shared by friends and family, personal privacy goes beyond the discretion of what a user uploads about himself and becomes an issue of what every network participant reveals. In this paper, we examine how the lack of joint privacy controls over content can inadvertently reveal sensitive information about a user, including preferences, relationships, conversations, and photos. Specifically, we analyze Facebook to identify scenarios where conflicting privacy settings between friends will reveal information that at least one user intended to remain private. By aggregating the information exposed in this manner, we demonstrate how a user’s private attributes can be inferred from simply being listed as a friend or mentioned in a story. To mitigate this threat, we show how Facebook’s privacy model can be adapted to enforce multi-party privacy. We present a proof-of-concept application built into Facebook that automatically ensures mutually acceptable privacy restrictions are enforced on group content.
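The mutual-privacy rule the proof-of-concept enforces can be stated compactly: content involving several users is visible to a viewer only if every involved user's setting permits it, i.e. the effective audience is the intersection of the individual audiences. The set-based audience model in the sketch below is a simplifying assumption.

# Sketch of multi-party privacy enforcement as an intersection of audiences.
def effective_audience(audiences):
    """audiences: one set of permitted viewers per user appearing in the content."""
    return set.intersection(*audiences) if audiences else set()

alice_allows = {"bob", "carol", "dave"}     # alice uploaded the photo
bob_allows = {"alice", "carol"}             # bob is tagged in it
print(effective_audience([alice_allows, bob_allows]))   # {'carol'}: the only mutually acceptable viewer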