Results 1–10 of 194
Network-based marketing: Identifying likely adopters via consumer networks
 Statistical Science
"... Abstract. Networkbased marketing refers to a collection of marketing techniques that take advantage of links between consumers to increase sales. We concentrate on the consumer networks formed using direct interactions (e.g., communications) between consumers. We survey the diverse literature on su ..."
Abstract

Cited by 68 (11 self)
 Add to MetaCart
Abstract. Network-based marketing refers to a collection of marketing techniques that take advantage of links between consumers to increase sales. We concentrate on the consumer networks formed using direct interactions (e.g., communications) between consumers. We survey the diverse literature on such marketing with an emphasis on the statistical methods used and the data to which these methods have been applied. We also provide a discussion of challenges and opportunities for this burgeoning research topic. Our survey highlights a gap in the literature. Because of inadequate data, prior studies have not been able to provide direct, statistical support for the hypothesis that network linkage can directly affect product/service adoption. Using a new data set that represents the adoption of a new telecommunications service, we show very strong support for the hypothesis. Specifically, we show three main results: (1) “Network neighbors”—those consumers linked to a prior customer—adopt the service at a rate 3–5 times greater than baseline groups selected by the best practices of the firm’s marketing team. In addition, analyzing the network allows the firm to acquire new customers who otherwise would have fallen through the cracks, because they would not have been identified based on traditional attributes. (2) Statistical models, built with a very large amount of geographic, demographic and prior purchase data, are significantly and substantially improved by including network information. (3) More detailed network information allows the ranking of the network neighbors so as to permit the selection of small sets of individuals with very high probabilities of adoption. Key words and phrases: Viral marketing, word of mouth, targeted marketing, network analysis, classification, statistical relational learning.
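The "network neighbor" idea above can be sketched in a few lines: flag every non-customer directly linked to a prior adopter, then rank them by how many adopter neighbors they have — a crude stand-in for the paper's more detailed network ranking in result (3). The graph, names, and edge list are invented for illustration.

```python
from collections import defaultdict

def network_neighbors(edges, adopters):
    """Return non-adopters ranked by their number of adopter neighbors."""
    graph = defaultdict(set)
    for a, b in edges:
        graph[a].add(b)
        graph[b].add(a)
    counts = {}
    for node, nbrs in graph.items():
        if node in adopters:
            continue
        k = len(nbrs & adopters)  # how many prior customers this node touches
        if k > 0:
            counts[node] = k
    # Highest adopter-degree first: the "best" network neighbors.
    return sorted(counts, key=lambda n: -counts[n])

# Hypothetical consumer network with two prior adopters.
edges = [("ann", "bob"), ("bob", "eve"), ("eve", "dan"),
         ("ann", "dan"), ("carl", "eve")]
adopters = {"ann", "eve"}
ranked = network_neighbors(edges, adopters)
```

In the paper's setting, the top of such a ranking would form the small, high-probability target set; here it is simply the nodes with the most adopter neighbors.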
Efficient weight learning for Markov logic networks
 In Proceedings of the Eleventh European Conference on Principles and Practice of Knowledge Discovery in Databases
, 2007
"... Abstract. Markov logic networks (MLNs) combine Markov networks and firstorder logic, and are a powerful and increasingly popular representation for statistical relational learning. The stateoftheart method for discriminative learning of MLN weights is the voted perceptron algorithm, which is ess ..."
Abstract

Cited by 59 (7 self)
 Add to MetaCart
Abstract. Markov logic networks (MLNs) combine Markov networks and first-order logic, and are a powerful and increasingly popular representation for statistical relational learning. The state-of-the-art method for discriminative learning of MLN weights is the voted perceptron algorithm, which is essentially gradient descent with an MPE approximation to the expected sufficient statistics (true clause counts). Unfortunately, these can vary widely between clauses, causing the learning problem to be highly ill-conditioned, and making gradient descent very slow. In this paper, we explore several alternatives, from per-weight learning rates to second-order methods. In particular, we focus on two approaches that avoid computing the partition function: diagonal Newton and scaled conjugate gradient. In experiments on standard SRL datasets, we obtain order-of-magnitude speedups, or more accurate models given comparable learning times.
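The per-weight alternatives the abstract mentions can be illustrated with a diagonal Newton step, where each weight is scaled by its own curvature estimate. The quadratic objective below is a toy stand-in for the MLN likelihood, chosen only to show why one global learning rate struggles when clause curvatures differ by orders of magnitude.

```python
def diagonal_newton_step(w, grad, hess_diag, damping=1e-6):
    """One per-weight Newton update: w_i <- w_i - g_i / (H_ii + damping)."""
    return [wi - gi / (hi + damping)
            for wi, gi, hi in zip(w, grad, hess_diag)]

# Minimize f(w) = 0.5 * (100*w0^2 + w1^2): the curvatures differ by 100x,
# so gradient descent with a single rate is slow, but per-weight scaling
# brings both coordinates near the optimum in one step.
w = [1.0, 1.0]
grad = [100.0 * w[0], 1.0 * w[1]]   # analytic gradient of the toy objective
hess_diag = [100.0, 1.0]            # diagonal of the (here exact) Hessian
w = diagonal_newton_step(w, grad, hess_diag)
```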
To join or not to join: the illusion of privacy in social networks with mixed public and private user profiles
 In WWW
, 2009
"... In order to address privacy concerns, many social media websites allow users to hide their personal profiles from the public. In this work, we show how an adversary can exploit an online social network with a mixture of public and private user profiles to predict the private attributes of users. We ..."
Abstract

Cited by 51 (3 self)
 Add to MetaCart
In order to address privacy concerns, many social media websites allow users to hide their personal profiles from the public. In this work, we show how an adversary can exploit an online social network with a mixture of public and private user profiles to predict the private attributes of users. We map this problem to a relational classification problem and we propose practical models that use friendship and group membership information (which is often not hidden) to infer sensitive attributes. The key novel idea is that in addition to friendship links, groups can be carriers of significant information. We show that on several well-known social media sites, we can easily and accurately recover the information of private-profile users. To the best of our knowledge, this is the first work that uses link-based and group-based classification to study privacy implications in social networks with mixed public and private user profiles.
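A minimal sketch of group-based classification in this spirit: predict the hidden attribute of a private profile from the labels of public users who share groups with it, weighting each voter by the number of shared groups. The users, groups, and labels below are invented; the paper's actual models are richer relational classifiers.

```python
from collections import Counter

def infer_from_groups(user, memberships, public_labels):
    """memberships: user -> set of groups; public_labels: user -> label.

    Returns the majority label among public users, weighted by the number
    of groups they share with `user`, or None if no one shares a group.
    """
    votes = Counter()
    for other, groups in memberships.items():
        if other == user or other not in public_labels:
            continue
        shared = len(memberships[user] & groups)
        if shared:
            votes[public_labels[other]] += shared
    return votes.most_common(1)[0][0] if votes else None

# "target" keeps its profile private; u1-u3 are public.
memberships = {
    "target": {"chess", "jazz"},
    "u1": {"chess"}, "u2": {"chess", "jazz"}, "u3": {"hiking"},
}
public_labels = {"u1": "A", "u2": "A", "u3": "B"}
guess = infer_from_groups("target", memberships, public_labels)
```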
Event modeling and recognition using markov logic networks
 In ECCV
, 2008
"... We address the problem of visual event recognition in surveillance where noise and missing observations are serious problems. Common sense domain knowledge is exploited to overcome them. The knowledge is represented as firstorder logic production rules with associated weights to indicate their con ..."
Abstract

Cited by 43 (3 self)
 Add to MetaCart
We address the problem of visual event recognition in surveillance where noise and missing observations are serious problems. Common sense domain knowledge is exploited to overcome them. The knowledge is represented as first-order logic production rules with associated weights to indicate their confidence. These rules are used in combination with a relaxed deduction algorithm to construct a network of grounded atoms, the Markov Logic Network. The network is used to perform probabilistic inference for input queries about events of interest. The system’s performance is demonstrated on a number of videos from a parking lot domain that contains complex interactions of people and vehicles.
Relational learning via latent social dimensions
 In Proceedings of the 15th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD '09)
, 2009
"... Social media such as blogs, Facebook, Flickr, etc., presents data in a network format rather than classical IID distribution. To address the interdependency among data instances, relational learning has been proposed, and collective inference based on network connectivity is adopted for prediction. ..."
Abstract

Cited by 40 (16 self)
 Add to MetaCart
Social media such as blogs, Facebook and Flickr present data in a network format rather than as classical i.i.d. samples. To address the interdependency among data instances, relational learning has been proposed, and collective inference based on network connectivity is adopted for prediction. However, the connections in social media are often multidimensional. An actor can connect to another actor for different reasons, e.g., being alumni or colleagues, living in the same city, or sharing similar interests. Collective inference normally does not differentiate these connections. In this work, we propose to first extract latent social dimensions based on network information, and then utilize them as features for discriminative learning. These social dimensions describe different affiliations of social actors hidden in the network, and the subsequent discriminative learning can automatically determine which affiliations are better aligned with the class labels. Such a scheme is preferred when multiple diverse relations are associated with the same network. We conduct extensive experiments on social media data (one set from a real-world blog site and the other from a popular content sharing site). Our model outperforms representative relational learning methods based on collective inference, especially when few labeled data are available. The sensitivity of this model and its connection to existing methods are also carefully examined.
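The two-stage scheme can be sketched with a drastic simplification: real social dimensions come from methods such as spectral clustering, but here the feature for each latent affiliation is just the fraction of a node's neighbors that are known seed members of that affiliation, and the "discriminative learning" is a pick-the-largest rule. Graph, seeds, and labels are all invented.

```python
def affiliation_features(graph, seed_groups):
    """graph: node -> neighbor set; one feature per latent affiliation."""
    feats = {}
    for node, nbrs in graph.items():
        total = len(nbrs) or 1
        feats[node] = [len(nbrs & seeds) / total for seeds in seed_groups]
    return feats

# Two loosely connected communities in a hypothetical network.
graph = {
    "a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b", "d"},
    "d": {"c", "e", "f"}, "e": {"d", "f"}, "f": {"d", "e"},
}
seeds = [{"a", "b"}, {"e", "f"}]   # known members of two affiliations
feats = affiliation_features(graph, seeds)

# Stand-in for the discriminative step: label each node by the
# affiliation with the strongest feature value.
labels = {n: ("blog" if f[0] >= f[1] else "photo")
          for n, f in feats.items()}
```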
Statistical predicate invention
 In Z. Ghahramani (Ed.), Proceedings of the 24th Annual International Conference on Machine Learning (ICML 2007)
, 2007
"... We propose statistical predicate invention as a key problem for statistical relational learning. SPI is the problem of discovering new concepts, properties and relations in structured data, and generalizes hidden variable discovery in statistical models and predicate invention in ILP. We propose an ..."
Abstract

Cited by 35 (10 self)
 Add to MetaCart
We propose statistical predicate invention as a key problem for statistical relational learning. SPI is the problem of discovering new concepts, properties and relations in structured data, and generalizes hidden variable discovery in statistical models and predicate invention in ILP. We propose an initial model for SPI based on second-order Markov logic, in which predicates as well as arguments can be variables, and the domain of discourse is not fully known in advance. Our approach iteratively refines clusters of symbols based on the clusters of the symbols they appear in atoms with (e.g., it clusters relations by the clusters of the objects they relate). Since different clusterings are better for predicting different subsets of the atoms, we allow multiple cross-cutting clusterings. We show that this approach outperforms Markov logic structure learning and the recently introduced infinite relational model on a number of relational datasets.
Learning Markov logic network structure via hypergraph lifting
 In Proceedings of the 26th International Conference on Machine Learning (ICML-09)
, 2009
"... Markov logic networks (MLNs) combine logic and probability by attaching weights to firstorder clauses, and viewing these as templates for features of Markov networks. Learning MLN structure from a relational database involves learning the clauses and weights. The stateoftheart MLN structure lear ..."
Abstract

Cited by 31 (3 self)
 Add to MetaCart
Markov logic networks (MLNs) combine logic and probability by attaching weights to first-order clauses, and viewing these as templates for features of Markov networks. Learning MLN structure from a relational database involves learning the clauses and weights. The state-of-the-art MLN structure learners all involve some element of greedily generating candidate clauses, and are susceptible to local optima. To address this problem, we present an approach that directly utilizes the data in constructing candidates. A relational database can be viewed as a hypergraph with constants as nodes and relations as hyperedges. We find paths of true ground atoms in the hypergraph that are connected via their arguments. To make this tractable (there are exponentially many paths in the hypergraph), we lift the hypergraph by jointly clustering the constants to form higher-level concepts, and find paths in it. We variabilize the ground atoms in each path, and use them to form clauses, which are evaluated using a pseudo-likelihood measure. In our experiments on three real-world datasets, we find that our algorithm outperforms the state-of-the-art approaches.
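The candidate-generation step can be sketched without the lifting: treat each true ground atom as a hyperedge over its constant arguments and collect short paths of atoms that share a constant. Variabilizing the paths into clauses and the pseudo-likelihood scoring are omitted, and the three-atom database is invented.

```python
def atom_paths(atoms, max_len=2):
    """Return sequences of atoms where consecutive atoms share a constant.

    Each atom is a tuple (Predicate, arg1, arg2, ...); a path is grown by
    appending any unused atom whose arguments overlap the last atom's.
    """
    paths = []

    def extend(path, used):
        if 1 < len(path) <= max_len:
            paths.append(tuple(path))
        if len(path) == max_len:
            return
        last_args = set(path[-1][1:])
        for atom in atoms:
            if atom not in used and last_args & set(atom[1:]):
                extend(path + [atom], used | {atom})

    for atom in atoms:
        extend([atom], {atom})
    return paths

# Hypothetical relational database of true ground atoms.
atoms = [
    ("Advises", "pete", "sam"),
    ("TAs", "sam", "cs101"),
    ("Teaches", "pete", "cs101"),
]
paths = atom_paths(atoms)
```

Each length-2 path here would be variabilized into a clause candidate, e.g. Advises(x, y) linked to TAs(y, z) through the shared student.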
Counting belief propagation
 In Proc. UAI-09
, 2009
"... A major benefit of graphical models is that most knowledge is captured in the model structure. Many models, however, produce inference problems with a lot of symmetries not reflected in the graphical structure and hence not exploitable by efficient inference techniques such as belief propagation (BP ..."
Abstract

Cited by 29 (13 self)
 Add to MetaCart
A major benefit of graphical models is that most knowledge is captured in the model structure. Many models, however, produce inference problems with many symmetries not reflected in the graphical structure and hence not exploitable by efficient inference techniques such as belief propagation (BP). In this paper, we present a new and simple BP algorithm, called counting BP, that exploits such additional symmetries. Starting from a given factor graph, counting BP first constructs a compressed factor graph of cluster-nodes and cluster-factors, corresponding to sets of nodes and factors that are indistinguishable given the evidence. Then it runs a modified BP algorithm on the compressed graph that is equivalent to running BP on the original factor graph. Our experiments show that counting BP is applicable to a variety of important AI tasks such as (dynamic) relational models and Boolean model counting, and that significant efficiency gains are obtainable, often by orders of magnitude.
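The compression step can be sketched as a color-passing loop: every node starts with a color determined by its evidence, repeatedly absorbs the multiset of its neighbors' colors, and nodes that end up with identical colors are merged into one cluster-node. The four-node chain below is invented, and the modified message passing on the compressed graph is omitted.

```python
def compress(graph, evidence, rounds=3):
    """graph: node -> neighbor set; evidence: node -> observation or absent.

    Returns groups of nodes that are indistinguishable given the evidence
    and the local graph structure (the cluster-nodes of counting BP).
    """
    color = {n: evidence.get(n) for n in graph}
    for _ in range(rounds):
        # New color = old color plus the sorted multiset of neighbor colors.
        color = {
            n: (color[n], tuple(sorted(str(color[m]) for m in graph[n])))
            for n in graph
        }
    clusters = {}
    for n, c in color.items():
        clusters.setdefault(c, set()).add(n)
    return list(clusters.values())

# A 4-node chain with no evidence: the two end nodes are symmetric,
# and so are the two middle nodes.
graph = {"a": {"b"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"c"}}
clusters = compress(graph, evidence={})
```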
Hybrid Markov Logic Networks
"... Markov logic networks (MLNs) combine firstorder logic and Markov networks, allowing us to handle the complexity and uncertainty of realworld problems in a single consistent framework. However, in MLNs all variables and features are discrete, while most realworld applications also contain continuo ..."
Abstract

Cited by 27 (1 self)
 Add to MetaCart
Markov logic networks (MLNs) combine first-order logic and Markov networks, allowing us to handle the complexity and uncertainty of real-world problems in a single consistent framework. However, in MLNs all variables and features are discrete, while most real-world applications also contain continuous ones. In this paper we introduce hybrid MLNs, in which continuous properties (e.g., the distance between two objects) and functions over them can appear as features. Hybrid MLNs have all distributions in the exponential family as special cases (e.g., multivariate Gaussians), and allow much more compact modeling of non-i.i.d. data than propositional representations like hybrid Bayesian networks. We also introduce inference algorithms for hybrid MLNs, by extending the MaxWalkSAT and MC-SAT algorithms to continuous domains. Experiments in a mobile robot mapping domain—involving joint classification, clustering and regression—illustrate the power of hybrid MLNs as a modeling language, and the accuracy and efficiency of the inference algorithms.
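The modeling idea can be sketched as a log-linear score in which logical and continuous features coexist: each feature contributes weight times value to the log of the unnormalized probability, and a continuous feature such as negated squared distance rewards worlds where two objects are close. The feature names, weights, and worlds below are invented.

```python
def log_unnormalized(world, features):
    """features: list of (weight, fn) pairs; fn maps a world to a real.

    Returns sum_i w_i * f_i(world), the log of the unnormalized probability
    in a log-linear (hybrid MLN-style) model.
    """
    return sum(w * fn(world) for w, fn in features)

features = [
    # Logical feature: 1 if the two objects are in the same room, else 0.
    (1.5, lambda s: 1.0 if s["same_room"] else 0.0),
    # Continuous feature: negated squared distance between the objects.
    (2.0, lambda s: -(s["x1"] - s["x2"]) ** 2),
]

near = {"same_room": True, "x1": 0.1, "x2": 0.2}
far = {"same_room": True, "x1": 0.0, "x2": 3.0}
score_near = log_unnormalized(near, features)
score_far = log_unnormalized(far, features)
```

Worlds in which the continuous feature is small in magnitude get exponentially more probability mass, which is how hybrid features bias inference toward, e.g., consistent object positions.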