Hinge-loss Markov random fields and probabilistic soft logic. arXiv:1505.04406 [cs.LG], 2015

by S. H. Bach, M. Broecheler, B. Huang, L. Getoor

Results 1 - 6 of 6

Latent topic networks: A versatile probabilistic programming framework for topic models

by James Foulds, Shachi H. Kumar, Lise Getoor - In International Conference on Machine Learning, 2015
"... Abstract Topic models have become increasingly prominent text-analytic machine learning tools for research in the social sciences and the humanities. In particular, custom topic models can be developed to answer specific research questions. The design of these models requires a nontrivial amount of ..."
Abstract - Cited by 2 (1 self) - Add to MetaCart
Abstract Topic models have become increasingly prominent text-analytic machine learning tools for research in the social sciences and the humanities. In particular, custom topic models can be developed to answer specific research questions. The design of these models requires a nontrivial amount of effort and expertise, motivating general-purpose topic modeling frameworks. In this paper we introduce latent topic networks, a flexible class of richly structured topic models designed to facilitate applied research. Custom models can straightforwardly be developed in our framework with an intuitive first-order logical probabilistic programming language. Latent topic networks admit scalable training via a parallelizable EM algorithm which leverages ADMM in the M-step. We demonstrate the broad applicability of the models with case studies on modeling influence in citation networks, and U.S. Presidential State of the Union addresses.

Citation Context

...ral-purpose topic modeling frameworks have been proposed, some of which use probabilistic programming systems (Andrzejewski et al., 2011) and/or represent problem-specific domain knowledge via posterior constraints (Mei et al., 2014), but none of these frameworks satisfy all of our desiderata (Table 1). To address these limitations, this article introduces a flexible probabilistic programming framework for designing custom topic models. Using the framework, an analyst can specify models using a declarative first-order logical probabilistic programming language called probabilistic soft logic (Bach et al., 2015). The resulting models, which we refer to as latent topic networks, directly generalize LDA, but add prior structure, dependency relationships, and additional latent and observed variables, using a tractable class of graphical models called hinge-loss Markov random fields (HL-MRFs) (Bach et al., 2013). We show how to fit latent topic networks using an EM algorithm which is highly parallelizable without approximation, leveraging an alternating direction method of multipliers (ADMM) (Boyd et al., 2011) algorithm in the M-step. We demonstrate the system with several case studies, including a mode...
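The M-step mentioned above relies on a consensus-style ADMM algorithm (Boyd et al., 2011). As a minimal, hedged sketch of that general pattern rather than the paper's actual M-step, the Python toy below runs consensus ADMM where each parallel subproblem is a simple quadratic; the data values, penalty parameter, and iteration count are illustrative assumptions.

    import numpy as np

    # Toy consensus ADMM: minimize sum_i (x_i - a_i)^2 / 2 subject to x_i = z.
    # Each term plays the role of one parallel subproblem; ADMM alternates
    # local updates, a consensus (averaging) update, and dual updates.
    a = np.array([1.0, 3.0, 5.0])   # illustrative local data
    rho = 1.0                       # ADMM penalty parameter
    x = np.zeros_like(a)            # local primal variables
    u = np.zeros_like(a)            # scaled dual variables
    z = 0.0                         # consensus variable

    for _ in range(50):
        # Local (parallelizable) updates:
        #   argmin_x (x - a_i)^2 / 2 + (rho / 2) * (x - z + u_i)^2
        x = (a + rho * (z - u)) / (1.0 + rho)
        # Consensus update: average of local estimates plus duals
        z = np.mean(x + u)
        # Dual updates penalize disagreement with the consensus
        u = u + x - z

    print(z)  # approaches mean(a) = 3.0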

Joint prediction for entity/event-level sentiment analysis using probabilistic soft logic models

by Lingjia Deng, Janyce Wiebe - In Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing (EMNLP)
"... Abstract In this work, we build an entity/event-level sentiment analysis system, which is able to recognize and infer both explicit and implicit sentiments toward entities and events in the text. We design Probabilistic Soft Logic models that integrate explicit sentiments, inference rules, and +/-e ..."
Abstract - Cited by 1 (1 self) - Add to MetaCart
Abstract In this work, we build an entity/event-level sentiment analysis system, which is able to recognize and infer both explicit and implicit sentiments toward entities and events in the text. We design Probabilistic Soft Logic models that integrate explicit sentiments, inference rules, and +/-effect event information (events that positively or negatively affect entities). The experiments show that the method is able to greatly improve over baseline accuracies in recognizing entity/event-level sentiments.

Citation Context

...ible and may not go through in context (Deng et al., 2014; Wiebe and Deng, 2014). In this work, we propose a more general set of inference rules and encode them in a probabilistic soft logic (PSL) framework (Bach et al., 2015). We chose PSL because it is designed to have efficient inference and, as similar methods in Statistical Relational Learning do, it allows probabilistic models to be specified in first-order logic, a...

Understanding Influence in Online Professional Networks

by Arti Ramesh, Mario Rodriguez, Lise Getoor
"... Social networks have become part and parcel of our lives. With social networks, users have access to tremendous amount of information that influence many as-pects of their lives such as daily activities, habits, and decisions. Recently, there has been a growing interest in understanding influence in ..."
Abstract - Add to MetaCart
Social networks have become part and parcel of our lives. With social networks, users have access to a tremendous amount of information that influences many aspects of their lives, such as daily activities, habits, and decisions. Recently, there has been a growing interest in understanding influence in social networks. Previous work in this area characterizes influence as propagation of actions in the social network. However, typically only a single action type is considered in characterizing influence. In this paper, we present a holistic model to jointly represent different user actions and their respective propagations in the social network. Our model captures node features such as user seniority in the social network, and edge features such as connection strength, to characterize influence. Our model is capable of representing and combining different kinds of information users assimilate in the social network and computing pairwise values of influence taking the different types of actions into account. We evaluate our models on data from LinkedIn and show the effectiveness of the inferred influence scores in predicting user actions. We further demonstrate that modeling different user actions and node and edge relationships between people leads to around a 20% increase in precision at top k in predicting user actions, when compared to a model based only on the General Threshold Model.

Citation Context

...To represent and combine different heterogeneous relationships between users, we propose a more powerful approach using HL-MRFs. HL-MRFs are a scalable class of continuous, conditional graphical models [2]. HL-MRFs have achieved state-of-the-art performance in many domains including knowledge graph identification [14], understanding engagement in MOOCs [15], biomedicine and multi-relational link predic...

Paired-Dual Learning for Fast Training of Latent Variable Hinge-Loss MRFs: Appendices A. Probabilistic Soft Logic

by unknown authors
"... In this supplement, we describe the models used in our ex-periments using probabilistic soft logic (PSL) (Bach et al., 2015), a language for defining hinge-loss potential tem-plates. PSL’s variables are logical atoms, and its rules use logical operators such as conjunction and implication to define ..."
Abstract - Add to MetaCart
In this supplement, we describe the models used in our experiments using probabilistic soft logic (PSL) (Bach et al., 2015), a language for defining hinge-loss potential templates. PSL's variables are logical atoms, and its rules use logical operators such as conjunction and implication to define dependencies between these variables. All variables are continuous in the [0, 1] interval. Conjunction of Boolean variables X ∧ Y is generalized to continuous variables using the hinge function max{X + Y − 1, 0}, which is known as the Lukasiewicz t-norm. Disjunction X ∨ Y is relaxed to min{X + Y, 1}, and negation ¬X is relaxed to 1 − X. To define a model, PSL rules are grounded out with all possible substitutions for logical terms. The groundings define hinge-loss potentials that share the same weight, and whose values are the ground rule's distance to satisfaction.
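As a small, hedged illustration of the relaxations just described (the friends/smokers example and the specific truth values below are made-up inputs, not taken from the paper), the Lukasiewicz operators and a ground rule's distance to satisfaction can be written directly in Python:

    # Lukasiewicz relaxations over truth values in [0, 1], as described above.
    def l_and(x: float, y: float) -> float:
        """Conjunction X AND Y relaxed to max{X + Y - 1, 0}."""
        return max(x + y - 1.0, 0.0)

    def l_or(x: float, y: float) -> float:
        """Disjunction X OR Y relaxed to min{X + Y, 1}."""
        return min(x + y, 1.0)

    def l_not(x: float) -> float:
        """Negation NOT X relaxed to 1 - X."""
        return 1.0 - x

    def distance_to_satisfaction(body: float, head: float) -> float:
        """Hinge-loss potential of a ground rule BODY -> HEAD:
        max{BODY - HEAD, 0}, i.e. zero when the implication is satisfied."""
        return max(body - head, 0.0)

    # Illustrative ground rule: Friends(a, b) AND Smokes(a) -> Smokes(b)
    friends_ab, smokes_a, smokes_b = 0.9, 0.8, 0.3
    body = l_and(friends_ab, smokes_a)                # 0.7
    print(distance_to_satisfaction(body, smokes_b))   # 0.4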

Statistical Relational Learning with Soft Quantifiers

by Golnoosh Farnadi, Stephen H. Bach, Marjon Blondeel, Marie-Francine Moens, Lise Getoor, Martine De Cock
"... Abstract. Quantification in statistical relational learning (SRL) is either existen-tial or universal, however humans might be more inclined to express knowledge using soft quantifiers, such as “most ” and “a few”. In this paper, we define the syntax and semantics of PSLQ, a new SRL framework that s ..."
Abstract - Add to MetaCart
Abstract. Quantification in statistical relational learning (SRL) is either existential or universal; however, humans might be more inclined to express knowledge using soft quantifiers, such as “most” and “a few”. In this paper, we define the syntax and semantics of PSL^Q, a new SRL framework that supports reasoning with soft quantifiers, and present its most probable explanation (MPE) inference algorithm. To the best of our knowledge, PSL^Q is the first SRL framework that combines soft quantifiers with first-order logic rules for modeling uncertain relational data. Our experimental results for link prediction in social trust networks demonstrate that the use of soft quantifiers not only allows for a natural and intuitive formulation of domain knowledge, but also improves the accuracy of inferred results.

Citation Context

...ntage of smokers among Bob’s friends. This increase is not necessarily linear; in fact, a common approach to compute the truth degree of soft quantified expressions is to map percentages to the scale [0, 1] using nondecreasing piecewise linear functions [29]. Previous SRL work (e.g., [19, 14, 22]) has considered hard quantifiers with thresholds such as at least k. Soft quantifiers, on the other hand, do...
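As a hedged sketch of the mapping described above (the breakpoints 0.3 and 0.8 and the quantifier name are illustrative assumptions, not values from the paper), a soft quantifier such as “most” can be modeled as a nondecreasing piecewise linear function from a percentage to a truth degree in [0, 1]:

    def most(fraction: float, lower: float = 0.3, upper: float = 0.8) -> float:
        """Truth degree of 'most X are Y' given the fraction of X that are Y:
        0 below `lower`, 1 above `upper`, and linear in between
        (a nondecreasing piecewise linear mapping to [0, 1])."""
        if fraction <= lower:
            return 0.0
        if fraction >= upper:
            return 1.0
        return (fraction - lower) / (upper - lower)

    # E.g., if 60% of Bob's friends smoke, "most of Bob's friends smoke"
    # gets truth degree (0.6 - 0.3) / (0.8 - 0.3) = 0.6.
    print(most(0.6))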

HyPER: A Flexible and Extensible Probabilistic Framework for Hybrid Recommender Systems

by Pigi Kouki, Shobeir Fakhraei, James Foulds, Magdalini Eirinaki, Lise Getoor
"... As the amount of recorded digital information increases, there is a growing need for flexible recommender systems which can incorporate richly structured data sources to im-prove recommendations. In this paper, we show how a re-cently introduced statistical relational learning framework can be used ..."
Abstract - Add to MetaCart
As the amount of recorded digital information increases, there is a growing need for flexible recommender systems which can incorporate richly structured data sources to improve recommendations. In this paper, we show how a recently introduced statistical relational learning framework can be used to develop a generic and extensible hybrid recommender system. Our hybrid approach, HyPER (HYbrid Probabilistic Extensible Recommender), incorporates and reasons over a wide range of information sources. Such sources include multiple user-user and item-item similarity measures, content, and social information. HyPER automatically learns to balance these different information signals when making predictions. We build our system using a powerful and intuitive probabilistic programming language called probabilistic soft logic [1], which enables efficient and accurate prediction by formulating our custom recommender systems with a scalable class of graphical models known as hinge-loss Markov random fields. We experimentally evaluate our approach on two popular recommendation datasets, showing that HyPER can effectively combine multiple information types for improved performance, and can significantly outperform existing state-of-the-art approaches.

Citation Context

...ly learns to balance these different information signals when making predictions. We build our system using a powerful and intuitive probabilistic programming language called probabilistic soft logic [1], which enables efficient and accurate prediction by formulating our custom recommender systems with a scalable class of graphical models known as hinge-loss Markov random fields. We experimentally ev...
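As a rough, hedged sketch of how such a hybrid model might combine signals (the two rules, predicate names, weights, and truth values below are assumptions for illustration, not HyPER's actual model), rules like “similar users give similar ratings” and “similar items receive similar ratings” can each be grounded into hinge-loss potentials whose weighted sum defines the HL-MRF energy:

    # Hedged sketch: two illustrative recommender rules grounded into
    # weighted hinge-loss potentials (all values below are made up).
    def hinge(x: float) -> float:
        return max(x, 0.0)

    # Rule 1: SimilarUsers(u1, u2) AND Rating(u1, i) -> Rating(u2, i)
    def user_rule(sim_users: float, r_u1_i: float, r_u2_i: float) -> float:
        body = hinge(sim_users + r_u1_i - 1.0)   # Lukasiewicz conjunction
        return hinge(body - r_u2_i)              # distance to satisfaction

    # Rule 2: SimilarItems(i1, i2) AND Rating(u, i1) -> Rating(u, i2)
    def item_rule(sim_items: float, r_u_i1: float, r_u_i2: float) -> float:
        body = hinge(sim_items + r_u_i1 - 1.0)
        return hinge(body - r_u_i2)

    # One tiny grounding, combined with (learned) rule weights:
    w_user, w_item = 2.0, 1.5
    energy = (w_user * user_rule(0.9, 0.8, 0.4)
              + w_item * item_rule(0.7, 0.9, 0.4))
    print(energy)  # lower energy = predictions more consistent with the rules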
