Results 1-6 of 6
Scalable Recommendation with Poisson Factorization
, 1311
Cited by 17 (4 self)
Abstract
We develop a Bayesian Poisson matrix factorization model for forming recommendations from sparse user behavior data. These data are large user/item matrices where each user has provided feedback on only a small subset of items, either explicitly (e.g., through star ratings) or implicitly (e.g., through views or purchases). In contrast to traditional matrix factorization approaches, Poisson factorization implicitly models each user’s limited attention to consume items. Moreover, because of the mathematical form of the Poisson likelihood, the model needs only to explicitly consider the observed entries in the matrix, leading to both scalable computation and good predictive performance. We develop a variational inference algorithm for approximate posterior inference that scales up to massive data sets. This is an efficient algorithm that iterates over the observed entries and adjusts an approximate posterior over the user/item representations. We apply our method to large real-world user data containing users rating movies, users listening to songs, and users reading scientific papers. In all these settings, Bayesian Poisson factorization outperforms state-of-the-art matrix factorization methods.
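The computational point in the abstract above, that the Poisson likelihood only requires iterating over the observed entries, can be sketched in a few lines: the rate term over all user/item pairs factorizes into a product of column sums. This is an illustrative sketch under those assumptions, not the paper's implementation; the names `poisson_mf_loglik`, `theta`, and `beta` are chosen for the example.

```python
import numpy as np
from scipy.special import gammaln

def poisson_mf_loglik(rows, cols, vals, theta, beta):
    """Poisson matrix-factorization log-likelihood from nonzero entries only.

    rows, cols, vals: COO representation of the sparse count matrix Y.
    theta: (n_users, K) nonnegative user factors.
    beta:  (n_items, K) nonnegative item factors.
    """
    # Rate for each observed entry: lambda_ui = theta_u . beta_i
    rates = np.einsum("nk,nk->n", theta[rows], beta[cols])
    # Terms that vanish at y_ui = 0, summed over observed entries only.
    obs = np.sum(vals * np.log(rates) - gammaln(vals + 1.0))
    # The -lambda_ui term over ALL pairs factorizes:
    # sum_{u,i} theta_u . beta_i = (sum_u theta_u) . (sum_i beta_i)
    total_rate = theta.sum(axis=0) @ beta.sum(axis=0)
    return obs - total_rate
```

Because `total_rate` is computed from two K-vectors, the cost is linear in the number of nonzeros plus the factor sizes, never in the full matrix size.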
Dynamic rank factor model for text streams
In Proc. of NIPS, 2014
Cited by 2 (0 self)
Abstract
We propose a semi-parametric and dynamic rank factor model for topic modeling, capable of (i) discovering topic prevalence over time, and (ii) learning contemporary multi-scale dependence structures, providing topic and word correlations as a byproduct. The high-dimensional and time-evolving ordinal/rank observations (such as word counts), after an arbitrary monotone transformation, are well accommodated through an underlying dynamic sparse factor model. The framework naturally admits heavy-tailed innovations, capable of inferring abrupt temporal jumps in the importance of topics. Posterior inference is performed through straightforward Gibbs sampling, based on the forward-filtering backward-sampling algorithm. Moreover, an efficient data subsampling scheme is leveraged to speed up inference on massive datasets. The modeling framework is illustrated
Scalable Recommendation with Hierarchical Poisson Factorization
Cited by 1 (1 self)
Abstract
We develop hierarchical Poisson matrix factorization (HPF), a novel method for providing users with high quality recommendations based on implicit feedback, such as views, clicks, or purchases. In contrast to existing recommendation models, HPF has a number of desirable properties. First, we show that HPF more accurately captures the long-tailed user activity found in most consumption data by explicitly considering the fact that users have finite attention budgets. This leads to better estimates of users’ latent preferences, and therefore superior recommendations, compared to competing methods. Second, HPF learns these latent factors by only explicitly considering positive examples, eliminating the often costly step of generating artificial negative examples when fitting to implicit data. Third, HPF is more than just one method: it is the simplest in a class of probabilistic models with these properties, and can easily be extended to include more complex structure and assumptions. We develop a variational algorithm for approximate posterior inference for HPF that scales up to large data sets, and we demonstrate its performance on a wide variety of real-world recommendation problems: users rating movies, listening to songs, reading scientific papers, and reading news articles.
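The "finite attention budget" idea in the abstract above corresponds to a Gamma-Poisson hierarchy: a per-user activity scale and a per-item popularity scale shrink the latent factors, producing heavy-tailed total activity. Below is a hedged sketch of drawing one dataset from such a generative model; the hyperparameter names and default values are illustrative conventions for Gamma shape/rate priors, not necessarily the paper's exact notation.

```python
import numpy as np

def sample_hpf(n_users, n_items, K, a=0.3, a_p=0.3, b_p=1.0,
               c=0.3, c_p=0.3, d_p=1.0, seed=0):
    """Draw one dataset from a hierarchical Gamma-Poisson model (sketch)."""
    rng = np.random.default_rng(seed)
    # Per-user activity xi_u ~ Gamma(a', a'/b') and per-item popularity
    # eta_i ~ Gamma(c', c'/d'); numpy's gamma takes (shape, scale=1/rate).
    activity = rng.gamma(a_p, b_p / a_p, size=n_users)
    popularity = rng.gamma(c_p, d_p / c_p, size=n_items)
    # Latent preferences and attributes, shrunk by the scales above:
    # theta_uk ~ Gamma(a, xi_u), beta_ik ~ Gamma(c, eta_i).
    theta = rng.gamma(a, 1.0 / activity[:, None], size=(n_users, K))
    beta = rng.gamma(c, 1.0 / popularity[:, None], size=(n_items, K))
    # Implicit-feedback counts: y_ui ~ Poisson(theta_u . beta_i).
    y = rng.poisson(theta @ beta.T)
    return y, theta, beta
```

With small shape parameters (e.g. 0.3), most sampled counts are zero while a few users and items generate many interactions, mimicking long-tailed consumption data.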
Capturing Semantically Meaningful Word Dependencies with an Admixture of Poisson MRFs
Cited by 1 (1 self)
Abstract
We develop a fast algorithm for the Admixture of Poisson MRFs (APM) topic model [1] and propose a novel metric to directly evaluate this model. The APM topic model recently introduced by Inouye et al. [1] is the first topic model that allows for word dependencies within each topic, unlike previous topic models such as LDA that assume independence between words within a topic. Research in both the semantic coherence of topic models [2, 3, 4, 5] and measures of model fitness [6] provides strong support that explicitly modeling word dependencies, as in APM, could be both semantically meaningful and essential for appropriately modeling real text data. Though APM shows significant promise for providing a better topic model, APM has a high computational complexity because O(p^2) parameters must be estimated, where p is the number of words ([1] could only provide results for datasets with p = 200). In light of this, we develop a parallel alternating Newton-like algorithm for training the APM model that can handle p = 10^4 as an important step towards scaling to large datasets. In addition, Inouye et al. [1] only provided tentative and inconclusive results on the utility of APM. Thus, motivated by simple intuitions and previous evaluations of topic models, we propose a novel evaluation metric based on human evocation scores between word pairs (i.e. how much one word “brings to mind” another word [7]). We provide compelling quantitative and qualitative results on the BNC corpus that demonstrate the superiority of APM over previous topic models for identifying semantically meaningful word dependencies. (MATLAB code available at:
Square Root Graphical Models: Multivariate Generalizations of Univariate Exponential Families that Permit Positive Dependencies
Abstract
We develop Square Root Graphical Models (SQR), a novel class of parametric graphical models that provides multivariate generalizations of univariate exponential family distributions. Previous multivariate graphical models
Fixed-Length Poisson MRF: Adding Dependencies to the Multinomial
Abstract
We propose a novel distribution that generalizes the Multinomial distribution to enable dependencies between dimensions. Our novel distribution is based on the parametric form of the Poisson MRF model