Results 1–6 of 6
Social resilience in online communities: The autopsy of Friendster
In Proceedings of the First ACM Conference on Online Social Networks, COSN ’13, 2013
Abstract

Cited by 13 (5 self)
We empirically analyze five online communities: Friendster, LiveJournal, Facebook, Orkut, and MySpace, to identify causes for the decline of social networks. We define social resilience as the ability of a community to withstand changes. We do not argue about the cause of such changes, but concentrate on their impact. Changes may cause users to leave, which may trigger further departures of others who lost connection to their friends. This may lead to cascades of users leaving. A social network is said to be resilient if the size of such cascades can be limited. To quantify resilience, we use k-core analysis to identify subsets of the network in which all users have at least k friends. These connections generate benefits (b) for each user, which have to outweigh the costs (c) of being a member of the network. If this difference is not positive, users leave. After all cascades, the remaining network is the k-core of the original network determined by the cost-to-benefit (c/b) ratio. By analysing the cumulative distribution of k-cores we are able to calculate the number of users remaining in each community. This allows us to infer the impact of the c/b ratio on the resilience of these online communities. We find that the different online communities have different k-core distributions. Consequently, similar changes in the c/b ratio have a different impact on the number of active users. As a case study, we focus on the evolution of Friendster. We identify time periods when new users entering the network observed an insufficient c/b ratio. This measure can be seen as a precursor of the later collapse of the community. Our analysis can be applied to estimate the impact of changes in the user interface, which may temporarily increase the c/b ratio, thus posing a threat for the community to shrink, or even to collapse.
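The k-core the abstract relies on is the maximal subgraph in which every remaining user has at least k friends; it is obtained by iteratively removing low-degree vertices until the cascade stops. A minimal illustrative sketch (not the authors' code) of that peeling process:

```python
def k_core(adjacency, k):
    """Return the vertex set of the k-core: the maximal induced subgraph
    in which every vertex has at least k neighbors (iterative peeling)."""
    adj = {v: set(ns) for v, ns in adjacency.items()}
    # Repeatedly remove vertices of degree < k; removals can cascade,
    # mirroring the leave-cascades described in the abstract.
    queue = [v for v, ns in adj.items() if len(ns) < k]
    while queue:
        v = queue.pop()
        if v not in adj:
            continue
        for u in adj.pop(v):
            adj[u].discard(v)
            if len(adj[u]) < k:
                queue.append(u)
    return set(adj)

# Toy network: a triangle with a pendant vertex attached.
g = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
print(k_core(g, 2))  # {0, 1, 2}: the pendant vertex 3 is peeled away
```

Running this for increasing k over a snapshot of a network yields the cumulative k-core distribution the authors use to estimate how many users remain for a given c/b ratio.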
Preventing unraveling in social networks gets harder
In AAAI ’13, 2013
Abstract

Cited by 2 (1 self)
The behavior of users in social networks is often observed to be affected by the actions of their friends. Bhawalkar et al. [2] introduced a formal mathematical model for user engagement in social networks where each individual derives a benefit proportional to the number of its friends which are engaged. Given a threshold degree k, the equilibrium for this model is a maximal subgraph whose minimum degree is ≥ k. However, the dropping out of individuals with degrees less than k might lead to a cascading effect of iterated withdrawals such that the size of the equilibrium subgraph becomes very small. To overcome this, some special vertices called “anchors” are introduced: these vertices need not have large degree. Bhawalkar et al. [2] considered the Anchored k-Core problem: Given a graph G and integers b, k, and p, do there exist sets of vertices B ⊆ H ⊆ V(G) such that |B| ≤ b, |H| ≥ p, and every vertex v ∈ H \ B has degree at least k in the induced subgraph G[H]? They showed that the problem is NP-hard for k ≥ 2 and gave some inapproximability and fixed-parameter intractability results. In this paper we give improved hardness results for this problem. In particular, we show that the Anchored k-Core problem is W[1]-hard parameterized by p, even for k = 3. This improves the result of Bhawalkar et al. [2] (who show W[2]-hardness parameterized by b), as our parameter is always bigger since p ≥ b. Then we answer a question of Bhawalkar et al. [2] by showing that the Anchored k-Core problem remains NP-hard on planar graphs for all k ≥ 3, even if the maximum degree of the graph is k + 2. Finally, we show that the problem is FPT on planar graphs parameterized by b for all k ≥ 7.
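The decision version asks whether some B ⊆ H ⊆ V(G) with |B| ≤ b and |H| ≥ p satisfies the degree condition. While finding such sets is hard, *verifying* a candidate solution is straightforward; a small illustrative checker (hypothetical helper names, not from the paper):

```python
def is_anchored_k_core(adj, H, B, k):
    """Check that in the induced subgraph G[H], every vertex of
    H \\ B (i.e., every non-anchor) has at least k neighbors in H."""
    H, B = set(H), set(B)
    return all(
        sum(1 for u in adj[v] if u in H) >= k
        for v in H - B
    )

# Path a-b-c-d: without anchors its 2-core unravels completely, but
# anchoring the two endpoints keeps the whole path engaged at k = 2.
path = {'a': ['b'], 'b': ['a', 'c'], 'c': ['b', 'd'], 'd': ['c']}
print(is_anchored_k_core(path, {'a', 'b', 'c', 'd'}, {'a', 'd'}, 2))  # True
print(is_anchored_k_core(path, {'a', 'b', 'c', 'd'}, set(), 2))       # False
```

The path example is exactly the unraveling phenomenon: with no anchors, the degree-1 endpoints drop out, which drops b and c below threshold in turn, leaving an empty equilibrium.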
Parameterized Complexity of the Anchored k-Core Problem for Directed Graphs
In FSTTCS, 2013
Abstract

Cited by 2 (1 self)
Motivated by the study of unraveling processes in social networks, Bhawalkar, Kleinberg, Lewi, Roughgarden, and Sharma [ICALP 2012] introduced the Anchored k-Core problem, where the task is, for a given graph G and integers b, k, and p, to find an induced subgraph H with at least p vertices (the core) such that all but at most b vertices (called anchors) of H are of degree at least k. In this paper, we extend the notion of k-core to directed graphs and provide a number of new algorithmic and complexity results for the directed version of the problem. We show that:
• The decision version of the problem is NP-complete for every k ≥ 1, even if the input graph is restricted to be a planar directed acyclic graph of maximum degree at most k + 2.
• The problem is fixed-parameter tractable (FPT) parameterized by the size of the core p for k = 1, and W[1]-hard for k ≥ 2.
• When the maximum degree of the graph is at most ∆, the problem is FPT parameterized by p + ∆ if k ≥ ∆/2.
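For intuition on the directed extension, peeling works the same way once a degree notion is fixed. The sketch below assumes the k-core condition is on in-degree, which is one natural reading (an assumption of this illustration, not necessarily the paper's exact definition):

```python
def directed_k_core(out_adj, k):
    """Peel a digraph down to the maximal induced subgraph in which
    every vertex has in-degree >= k (in-degree is an assumption here)."""
    # Build in-neighbor lists from the out-adjacency representation.
    in_adj = {v: set() for v in out_adj}
    for v, outs in out_adj.items():
        for u in outs:
            in_adj[u].add(v)
    alive = set(out_adj)
    queue = [v for v in alive if len(in_adj[v]) < k]
    while queue:
        v = queue.pop()
        if v not in alive:
            continue
        alive.discard(v)
        for u in out_adj[v]:       # v's removal lowers its successors'
            if u in alive:         # in-degrees, possibly cascading
                in_adj[u].discard(v)
                if len(in_adj[u]) < k:
                    queue.append(u)
    return alive

# A directed 3-cycle survives at k = 1; an acyclic chain unravels,
# matching the hardness focus on DAGs in the abstract.
print(directed_k_core({0: [1], 1: [2], 2: [0]}, 1))  # {0, 1, 2}
print(directed_k_core({0: [1], 1: [2], 2: []}, 1))   # set()
```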
OLAK: An Efficient Algorithm to Prevent Unraveling in Social Networks
Abstract
In this paper, we study the problem of the anchored k-core. Given a graph G, an integer k, and a budget b, we aim to identify b vertices in G so that we can determine the largest induced subgraph J in which every vertex, except the b vertices, has at least k neighbors in J. This problem was introduced by Bhawalkar, Kleinberg, et al. in the context of user engagement in social networks, where a user may leave a community if he/she has fewer than k friends engaged. The problem has been shown to be NP-hard and inapproximable. A polynomial-time algorithm for graphs with bounded treewidth has been proposed. However, this assumption usually does not hold in real-life graphs, and their techniques cannot be extended to handle general graphs. Motivated by this, we propose an efficient algorithm, namely onion-layer based anchored k-core (OLAK), for the anchored k-core problem on large-scale graphs. To facilitate computation of the anchored k-core, we design an onion layer structure, which is generated by a simple onion-peeling-like algorithm against a small set of vertices in the graph. We show that computation of the best anchor can simply be conducted upon the vertices on the onion layers, which significantly reduces the search space. Based on the well-organized layer structure, we develop efficient candidate exploration, early termination, and pruning techniques to further speed up computation. Comprehensive experiments on 10 real-life graphs demonstrate the effectiveness and efficiency of our proposed methods.
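The onion layers can be pictured as the rounds of peeling: layer 0 holds the vertices already below degree k, layer 1 those that drop below k once layer 0 is gone, and so on. A simplified sketch of that layering pass (an illustration of the general idea, not the OLAK algorithm itself):

```python
def onion_layers(adjacency, k):
    """Group the vertices outside the k-core into peeling rounds.
    Returns (layers, core): layers[i] is the i-th peeling round,
    core is the surviving k-core."""
    adj = {v: set(ns) for v, ns in adjacency.items()}
    layers = []
    while True:
        # All vertices currently below threshold form the next layer.
        peel = [v for v, ns in adj.items() if len(ns) < k]
        if not peel:
            break
        layers.append(sorted(peel))
        for v in peel:
            for u in adj.pop(v):
                if u in adj:
                    adj[u].discard(v)
    return layers, set(adj)

# A path 0-1 hanging off a triangle 2-3-4 peels in two rounds at k = 2.
g = {0: [1], 1: [0, 2], 2: [1, 3, 4], 3: [2, 4], 4: [2, 3]}
layers, core = onion_layers(g, 2)
print(layers)  # [[0], [1]]
print(core)    # {2, 3, 4}
```

Anchoring a vertex on an outer layer can keep later layers from peeling, which is why restricting the anchor search to these layer vertices shrinks the search space.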
Fixed Points of Graph Peeling
2013
Abstract
Degree peeling is used to study complex networks. It corresponds to a decomposition of the graph into vertex groups of increasing minimum degree. However, the peeling value of a vertex is non-local in this context, since it relies on the connections the vertex has to groups above it. We explore a different way to decompose a network into edge layers such that the local peeling value of the vertices on each layer does not depend on their non-local connections with the other layers. This corresponds to the decomposition of a graph into subgraphs that are invariant with respect to degree peeling, i.e., they are fixed points. We introduce in this context a method to partition the edges of a graph into fixed points of degree peeling, called the iterative-edge-core decomposition. Information from this decomposition is used to formulate a notion of vertex diversity based on Shannon’s entropy. We illustrate the usefulness of this decomposition in social network analysis. Our method can be used for community detection and graph visualization.
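The peeling value mentioned here is the standard core number: the largest k at which a vertex still survives peeling. A compact sketch of its computation (a hypothetical helper for intuition, not the paper's decomposition code):

```python
def peeling_values(adjacency):
    """Compute each vertex's peeling (core) value: the largest k for
    which the vertex still belongs to the k-core."""
    adj = {v: set(ns) for v, ns in adjacency.items()}
    value = {}
    k = 0
    while adj:
        low = [v for v, ns in adj.items() if len(ns) <= k]
        if not low:
            k += 1        # minimum degree rose above k: next level
            continue
        for v in low:     # peel this batch; cascades at the same level
            value[v] = k  # are caught on the next pass of the loop
            for u in adj.pop(v):
                if u in adj:
                    adj[u].discard(v)
    return value

# Triangle with a pendant vertex: the pendant peels at level 1,
# while the triangle survives until level 2.
g = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
print(peeling_values(g))
```

The non-locality the abstract points to is visible here: vertex 2's value depends on its connections into the triangle "above" the pendant, which is what the edge-layer decomposition is designed to factor out.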
Hardness of Peeling with Stashes
2014
Abstract
The analysis of several algorithms and data structures can be framed as a peeling process on a random hypergraph: vertices with degree less than k and their adjacent edges are removed until no vertices of degree less than k are left. Often the question is whether the remaining hypergraph, the k-core, is empty or not. In some settings, it may be possible to remove either vertices or edges from the hypergraph before peeling, at some cost. For example, in hashing applications where keys correspond to edges and buckets to vertices, one might use an additional side data structure, commonly referred to as a stash, to separately handle some keys in order to avoid collisions. The natural question in such cases is to find the minimum number of edges (or vertices) that need to be stashed in order to realize an empty k-core. We show that both these problems are NP-complete for all k ≥ 2 on graphs and regular hypergraphs, with the sole exception being that the edge variant of stashing is solvable in polynomial time for k = 2 on standard (2-uniform) graphs.
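For the edge variant on plain graphs, the question is which edges to delete so that subsequent peeling empties the graph. Given the NP-completeness result for k ≥ 2, only brute force is illustrated below, a tiny-instance sketch (hypothetical helper, not from the paper):

```python
from itertools import combinations

def min_edge_stash(edges, vertices, k):
    """Brute-force the smallest edge set to stash (delete) so that the
    remaining graph's k-core is empty. Exponential time, so suitable
    only for tiny instances."""
    def core_nonempty(edge_set):
        adj = {v: set() for v in vertices}
        for a, b in edge_set:
            adj[a].add(b)
            adj[b].add(a)
        queue = [v for v in adj if len(adj[v]) < k]
        while queue:           # standard peeling to the k-core
            v = queue.pop()
            if v not in adj:
                continue
            for u in adj.pop(v):
                adj[u].discard(v)
                if len(adj[u]) < k:
                    queue.append(u)
        return bool(adj)

    for size in range(len(edges) + 1):
        for stash in combinations(edges, size):
            remaining = [e for e in edges if e not in stash]
            if not core_nonempty(remaining):
                return list(stash)

# A triangle has a non-empty 2-core; stashing any single edge drops the
# remaining vertices below degree 2 and the whole graph peels away.
tri = [(0, 1), (1, 2), (0, 2)]
print(len(min_edge_stash(tri, [0, 1, 2], 2)))  # 1
```

Note the contrast with the abstract's positive result: for k = 2 on 2-uniform graphs this edge variant is actually polynomial-time solvable, so the brute force is only for illustration.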