Results 1 - 6 of 6
Latent topic networks: A versatile probabilistic programming framework for topic models.
In International Conference on Machine Learning, 2015.
"... Abstract Topic models have become increasingly prominent text-analytic machine learning tools for research in the social sciences and the humanities. In particular, custom topic models can be developed to answer specific research questions. The design of these models requires a nontrivial amount of ..."
Abstract
-
Cited by 2 (1 self)
- Add to MetaCart
(Show Context)
Abstract: Topic models have become increasingly prominent text-analytic machine learning tools for research in the social sciences and the humanities. In particular, custom topic models can be developed to answer specific research questions. The design of these models requires a nontrivial amount of effort and expertise, motivating general-purpose topic modeling frameworks. In this paper we introduce latent topic networks, a flexible class of richly structured topic models designed to facilitate applied research. Custom models can straightforwardly be developed in our framework with an intuitive first-order logical probabilistic programming language. Latent topic networks admit scalable training via a parallelizable EM algorithm which leverages ADMM in the M-step. We demonstrate the broad applicability of the models with case studies on modeling influence in citation networks and on U.S. Presidential State of the Union addresses.
Joint prediction for entity/event-level sentiment analysis using probabilistic soft logic models.
In Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP).
"... Abstract In this work, we build an entity/event-level sentiment analysis system, which is able to recognize and infer both explicit and implicit sentiments toward entities and events in the text. We design Probabilistic Soft Logic models that integrate explicit sentiments, inference rules, and +/-e ..."
Abstract
-
Cited by 1 (1 self)
- Add to MetaCart
(Show Context)
Abstract: In this work, we build an entity/event-level sentiment analysis system, which is able to recognize and infer both explicit and implicit sentiments toward entities and events in the text. We design Probabilistic Soft Logic models that integrate explicit sentiments, inference rules, and +/-effect event information (events that positively or negatively affect entities). The experiments show that the method is able to greatly improve over baseline accuracies in recognizing entity/event-level sentiments.
Understanding Influence in Online Professional Networks
"... Social networks have become part and parcel of our lives. With social networks, users have access to tremendous amount of information that influence many as-pects of their lives such as daily activities, habits, and decisions. Recently, there has been a growing interest in understanding influence in ..."
Abstract
- Add to MetaCart
(Show Context)
Abstract: Social networks have become part and parcel of our lives. With social networks, users have access to a tremendous amount of information that influences many aspects of their lives, such as daily activities, habits, and decisions. Recently, there has been a growing interest in understanding influence in social networks. Previous work in this area characterizes influence as the propagation of actions in the social network; however, typically only a single action type is considered when characterizing influence. In this paper, we present a holistic model to jointly represent different user actions and their respective propagations in the social network. Our model captures node features, such as user seniority in the social network, and edge features, such as connection strength, to characterize influence. It is capable of representing and combining the different kinds of information users assimilate in the social network, and it computes pairwise influence values that take the different types of actions into account. We evaluate our models on data from LinkedIn and show the effectiveness of the inferred influence scores in predicting user actions. We further demonstrate that modeling different user actions together with node and edge relationships between people leads to around a 20% increase in precision at top k for predicting user actions, compared to a model based only on the General Threshold Model.
Paired-Dual Learning for Fast Training of Latent Variable Hinge-Loss MRFs: Appendices A. Probabilistic Soft Logic
"... In this supplement, we describe the models used in our ex-periments using probabilistic soft logic (PSL) (Bach et al., 2015), a language for defining hinge-loss potential tem-plates. PSL’s variables are logical atoms, and its rules use logical operators such as conjunction and implication to define ..."
Abstract
- Add to MetaCart
(Show Context)
In this supplement, we describe the models used in our experiments using probabilistic soft logic (PSL) (Bach et al., 2015), a language for defining hinge-loss potential templates. PSL's variables are logical atoms, and its rules use logical operators such as conjunction and implication to define dependencies between these variables. All variables are continuous in the [0, 1] interval. The conjunction of Boolean variables X ∧ Y is generalized to continuous variables using the hinge function max{X + Y − 1, 0}, which is known as the Łukasiewicz t-norm. Disjunction X ∨ Y is relaxed to min{X + Y, 1}, and negation ¬X is relaxed to 1 − X. To define a model, PSL rules are grounded out with all possible substitutions for logical terms. The groundings define hinge-loss potentials that share the same weight, and whose values are the ground rule's distance to satisfaction.
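To make the relaxations above concrete, the following is a minimal sketch (not taken from the supplement; the function names are illustrative) of the Łukasiewicz operators and the distance-to-satisfaction potential of a ground rule body -> head:

def l_and(x, y):
    # Lukasiewicz conjunction: max{x + y - 1, 0}
    return max(x + y - 1.0, 0.0)

def l_or(x, y):
    # Lukasiewicz disjunction: min{x + y, 1}
    return min(x + y, 1.0)

def l_not(x):
    # Lukasiewicz negation: 1 - x
    return 1.0 - x

def distance_to_satisfaction(body, head):
    # The implication body -> head relaxes to min{1 - body + head, 1},
    # so the corresponding hinge-loss potential is max{body - head, 0}.
    return max(body - head, 0.0)

# Example: a ground rule whose body has truth value 0.9 and whose head
# has truth value 0.4 contributes a penalty of 0.5 (times the rule weight).
print(distance_to_satisfaction(0.9, 0.4))  # 0.5

Each such potential is multiplied by its rule's weight and summed into the hinge-loss MRF energy that MAP inference minimizes.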
Statistical Relational Learning with Soft Quantifiers
"... Abstract. Quantification in statistical relational learning (SRL) is either existen-tial or universal, however humans might be more inclined to express knowledge using soft quantifiers, such as “most ” and “a few”. In this paper, we define the syntax and semantics of PSLQ, a new SRL framework that s ..."
Abstract
- Add to MetaCart
(Show Context)
Abstract: Quantification in statistical relational learning (SRL) is either existential or universal; however, humans may be more inclined to express knowledge using soft quantifiers, such as “most” and “a few”. In this paper, we define the syntax and semantics of PSLQ, a new SRL framework that supports reasoning with soft quantifiers, and present its most probable explanation (MPE) inference algorithm. To the best of our knowledge, PSLQ is the first SRL framework that combines soft quantifiers with first-order logic rules for modeling uncertain relational data. Our experimental results for link prediction in social trust networks demonstrate that the use of soft quantifiers not only allows for a natural and intuitive formulation of domain knowledge, but also improves the accuracy of inferred results.
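The abstract does not spell out how PSLQ relaxes a soft quantifier, so the sketch below is an assumption only: following common fuzzy-logic treatments, a quantifier such as "most" can be modeled as a non-decreasing map from the fraction of satisfied groundings to a truth value in [0, 1]. The thresholds and the function name are hypothetical, not the paper's definition.

def most(fraction_satisfied, lower=0.3, upper=0.8):
    # Hypothetical piecewise-linear soft quantifier: truth 0 below `lower`,
    # truth 1 above `upper`, linear in between.
    if fraction_satisfied <= lower:
        return 0.0
    if fraction_satisfied >= upper:
        return 1.0
    return (fraction_satisfied - lower) / (upper - lower)

# E.g. "most of U's friends trust V": if 70% of the ground friend atoms
# satisfy trusts(F, V), the quantified body gets truth value of about 0.8.
print(most(0.7))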
HyPER: A Flexible and Extensible Probabilistic Framework for Hybrid Recommender Systems
"... As the amount of recorded digital information increases, there is a growing need for flexible recommender systems which can incorporate richly structured data sources to im-prove recommendations. In this paper, we show how a re-cently introduced statistical relational learning framework can be used ..."
Abstract
- Add to MetaCart
(Show Context)
Abstract: As the amount of recorded digital information increases, there is a growing need for flexible recommender systems which can incorporate richly structured data sources to improve recommendations. In this paper, we show how a recently introduced statistical relational learning framework can be used to develop a generic and extensible hybrid recommender system. Our hybrid approach, HyPER (HYbrid Probabilistic Extensible Recommender), incorporates and reasons over a wide range of information sources. Such sources include multiple user-user and item-item similarity measures, content, and social information. HyPER automatically learns to balance these different information signals when making predictions. We build our system using a powerful and intuitive probabilistic programming language called probabilistic soft logic [1], which enables efficient and accurate prediction by formulating our custom recommender systems with a scalable class of graphical models known as hinge-loss Markov random fields. We experimentally evaluate our approach on two popular recommendation datasets, showing that HyPER can effectively combine multiple information types for improved performance, and can significantly outperform existing state-of-the-art approaches.
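As a rough illustration (the rule names, observed values, and weights below are assumed, not HyPER's published model), a hybrid PSL-style recommender combines each information source as a weighted hinge-loss penalty on a predicted rating in [0, 1], and MAP inference minimizes the convex sum of these penalties:

def hinge(x):
    return max(x, 0.0)

def objective(pred, obs, w):
    # Each ground rule "body -> rating(U, I)" adds w * max{body - pred, 0},
    # where the body is a Lukasiewicz conjunction of observed atoms.
    total = 0.0
    # similar_users(U1, U2) & rating(U1, I) -> rating(U2, I)
    total += w["user_sim"] * hinge(max(obs["user_sim"] + obs["neighbor_rating"] - 1.0, 0.0) - pred)
    # similar_items(I1, I2) & rating(U, I1) -> rating(U, I2)
    total += w["item_sim"] * hinge(max(obs["item_sim"] + obs["other_item_rating"] - 1.0, 0.0) - pred)
    # content_match(U, I) -> rating(U, I)
    total += w["content"] * hinge(obs["content_match"] - pred)
    return total

obs = {"user_sim": 0.9, "neighbor_rating": 0.8, "item_sim": 0.6,
       "other_item_rating": 0.9, "content_match": 0.7}
w = {"user_sim": 1.0, "item_sim": 1.0, "content": 0.5}

# A crude grid search stands in for PSL's ADMM-based MAP inference; it
# settles near 0.7, where all three ground rules are (essentially) satisfied.
best = min((p / 100 for p in range(101)), key=lambda p: objective(p, obs, w))
print(best)

In the actual system, the rule weights are not fixed by hand as above; per the abstract, HyPER learns how to balance the different information signals from training data.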