Results 1 - 5 of 5
Modeling Learner Engagement in MOOCs using Probabilistic Soft Logic
Abstract
Cited by 8 (2 self)
Massive open online courses (MOOCs) attract a large number of student registrations, but recent studies have shown that only a small fraction of these students complete their courses. Student dropouts are thus a major deterrent for the growth and success of MOOCs. We believe that understanding student engagement as a course progresses is essential for minimizing dropout rates. Formally defining student engagement in an online setting is challenging. In this paper, we leverage activity (such as posting in discussion forums, timely submission of assignments, etc.), linguistic features from forum content and structural features from forum interaction to identify two different forms of student engagement (passive and active) in MOOCs. We use probabilistic soft logic (PSL) to model student engagement by capturing domain knowledge about student interactions and performance. We test our models on MOOC data from Coursera and demonstrate that modeling engagement is helpful in predicting student performance.
Paired-Dual Learning for Fast Training of Latent Variable Hinge-Loss MRFs
- In Proceedings of the International Conference on Machine Learning, 2015
Abstract
Cited by 2 (2 self)
Latent variables allow probabilistic graphical models to capture nuance and structure in important domains such as network science, natural language processing, and computer vision. Naive approaches to learning such complex models can be prohibitively expensive—because they require repeated inferences to update beliefs about latent variables—so lifting this restriction for useful classes of models is an important problem. Hinge-loss Markov random fields (HL-MRFs) are graphical models that allow highly scalable inference and learning in structured domains, in part by representing structured problems with continuous variables. However, this representation leads to challenges when learning with latent variables. We introduce paired-dual learning, a framework that greatly speeds up training by using tractable entropy surrogates and avoiding repeated inferences. Paired-dual learning optimizes an objective with a pair of dual inference problems. This allows fast, joint optimization of parameters and dual variables. We evaluate on social-group detection, trust prediction in social networks, and image reconstruction, finding that paired-dual learning trains models as accurate as those trained by traditional methods in much less time, often before traditional methods make even a single parameter update.
Joint prediction for entity/event-level sentiment analysis using probabilistic soft logic models
- In Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Abstract
Cited by 1 (1 self)
In this work, we build an entity/event-level sentiment analysis system, which is able to recognize and infer both explicit and implicit sentiments toward entities and events in the text. We design Probabilistic Soft Logic models that integrate explicit sentiments, inference rules, and +/-effect event information (events that positively or negatively affect entities). The experiments show that the method is able to greatly improve over baseline accuracies in recognizing entity/event-level sentiments.
Probabilistic Soft Logic for Social Good
Abstract
As governments, non-profit organizations, researchers, and corporations collect data on social phenomena, opportunities have emerged for data science applications that can benefit society. However, modeling these types of complex, real-world phenomena requires new tools to address inherent computational challenges. Social data is intrinsically relational, noisy, partially observed, and large scale, and it is composed of both continuous and discrete information. Probabilistic soft logic (PSL) [3, 5] is a general-purpose framework we are developing to solve these challenges. Since the value of social data is in the networks of relationships they describe, models for social data should be rich enough to capture the intricate dependency structures among unknown variables. These models should be probabilistic so that they are robust to the inconsistencies caused …
Paired-Dual Learning for Fast Training of Latent Variable Hinge-Loss MRFs: Appendices A. Probabilistic Soft Logic
Abstract
In this supplement, we describe the models used in our experiments using probabilistic soft logic (PSL) (Bach et al., 2015), a language for defining hinge-loss potential templates. PSL's variables are logical atoms, and its rules use logical operators such as conjunction and implication to define dependencies between these variables. All variables are continuous in the [0, 1] interval. Conjunction of Boolean variables X ∧ Y is generalized to continuous variables using the hinge function max{X + Y − 1, 0}, which is known as the Łukasiewicz t-norm. Disjunction X ∨ Y is relaxed to min{X + Y, 1}, and negation ¬X is relaxed to 1 − X. To define a model, PSL rules are grounded out with all possible substitutions for logical terms. The groundings define hinge-loss potentials that share the same weight, and whose values are the ground rule's distance to …
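The Łukasiewicz relaxations described in this abstract can be sketched in a few lines of Python. This is an illustrative sketch of the continuous logic operators only, not PSL's actual implementation; the function names are our own:

```python
# Łukasiewicz relaxations of Boolean connectives over soft truth
# values in [0, 1], as used by PSL to build hinge-loss potentials.

def l_and(x: float, y: float) -> float:
    """Conjunction X ∧ Y via the Łukasiewicz t-norm: max{X + Y − 1, 0}."""
    return max(x + y - 1.0, 0.0)

def l_or(x: float, y: float) -> float:
    """Disjunction X ∨ Y relaxed to min{X + Y, 1}."""
    return min(x + y, 1.0)

def l_not(x: float) -> float:
    """Negation ¬X relaxed to 1 − X."""
    return 1.0 - x

# On Boolean inputs (0 or 1) these agree with classical logic:
assert l_and(1.0, 1.0) == 1.0 and l_and(1.0, 0.0) == 0.0
assert l_or(0.0, 1.0) == 1.0 and l_not(0.0) == 1.0

# On soft truth values they interpolate continuously, e.g.
# l_and(0.7, 0.6) ≈ 0.3 and l_or(0.7, 0.6) == 1.0.
```

Because each operator is piecewise linear in its arguments, a ground rule's distance to satisfaction is a hinge function of the variables, which is what makes MAP inference in HL-MRFs a convex problem.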