CiteSeerX
Results 1 - 10 of 12

Online Learning for Latent Dirichlet Allocation

by Matthew D. Hoffman, David M. Blei, Francis Bach
"... We develop an online variational Bayes (VB) algorithm for Latent Dirichlet Allocation (LDA). Online LDA is based on online stochastic optimization with a natural gradient step, which we show converges to a local optimum of the VB objective function. It can handily analyze massive document collections, including those arriving in a stream. We study the performance of online LDA in several ways, including by fitting a 100-topic topic model to 3.3M articles from Wikipedia in a single pass. We demonstrate that online LDA finds topic models as good or better than those found with batch VB, and in a ..."
Abstract - Cited by 209 (21 self)
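The natural-gradient step this abstract describes can be sketched as follows. This is a minimal illustration of the update's shape, not the authors' code; the parameter names `tau0`, `kappa`, `eta` and the batch-statistics input follow common presentations of online LDA and are assumptions here:

```python
import numpy as np

def online_vb_step(lam, batch_stats, D, batch_size, t,
                   tau0=1.0, kappa=0.7, eta=0.01):
    """One stochastic natural-gradient step on the topic-word
    variational parameters lam (K x V).

    batch_stats : expected topic-word counts from the mini-batch's
                  local variational E-step (K x V).
    D           : total number of documents in the corpus.
    """
    rho = (tau0 + t) ** (-kappa)                 # decaying step size
    # Mini-batch estimate of lam, rescaled as if the whole corpus
    # looked like this mini-batch.
    lam_hat = eta + (D / batch_size) * batch_stats
    # The natural-gradient step is a convex blend of old and new.
    return (1.0 - rho) * lam + rho * lam_hat
```

With `rho` decaying as `(tau0 + t)^(-kappa)` for `kappa` in (0.5, 1], the step sizes satisfy the Robbins-Monro conditions, which is what underlies the convergence claim in the abstract.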

Larger Residuals, Less Work: Active Document Scheduling for Latent Dirichlet Allocation

by Mirwaes Wahabzada, Kristian Kersting
"... Abstract. Recently, there have been considerable advances in fast inference for latent Dirichlet allocation (LDA). In particular, stochastic optimization of the variational Bayes (VB) objective function with a natural gradient step was proved to converge and to be able to process massive document collections. On several real-world datasets, including 3M articles from Wikipedia, we demonstrate that residual LDA can handily analyze massive document collections and find topic models as good or better than those found with batch VB and randomly scheduled VB, and significantly faster. ..."
Abstract - Cited by 12 (1 self)
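The scheduling idea behind residual LDA, revisiting documents in proportion to how much their parameters are still changing, can be sketched as follows. This is a hypothetical illustration; the paper's actual residual definition and sampling scheme may differ:

```python
import numpy as np

def schedule_batch(residuals, batch_size, rng):
    """Sample the next mini-batch of document indices with probability
    proportional to each document's residual, so documents whose
    variational parameters changed most recently are revisited first."""
    p = residuals / residuals.sum()
    return rng.choice(len(residuals), size=batch_size, replace=False, p=p)
```

Documents with residuals near zero are rarely drawn, which is the source of the "less work" claim: converged documents stop consuming inference passes.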

Streaming Variational Bayes

by Tamara Broderick, Nicholas Boyd, Andre Wibisono, Ashia C. Wilson, Michael I. Jordan - In Neural Information Processing Systems (NIPS), 2013
"... We present SDA-Bayes, a framework for (S)treaming, (D)istributed, (A)synchronous computation of a Bayesian posterior. The framework makes streaming updates to the estimated posterior according to a user-specified approximation batch primitive. We demonstrate the usefulness of our framework, with va ..."
Abstract - Cited by 31 (0 self)
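The framework's recursion, in which each arriving batch is absorbed by running the approximation primitive with the current posterior as its prior, can be sketched with an exactly conjugate Gaussian primitive. The primitive below is an assumed stand-in for the user-specified one, in natural-parameter form:

```python
import numpy as np

def batch_primitive(batch, prior_nat, noise_var=1.0):
    """Conjugate update for the mean of a Gaussian with known noise
    variance, in natural-parameter form (precision-weighted mean,
    precision). Plays the role of the user-specified batch primitive."""
    pwm, prec = prior_nat
    return (pwm + batch.sum() / noise_var,
            prec + len(batch) / noise_var)

def stream_posterior(batches, prior_nat):
    """Streaming recursion: fold the primitive over arriving batches,
    treating the current posterior as the prior for the next batch."""
    post = prior_nat
    for b in batches:
        post = batch_primitive(b, post)
    return post
```

Because this primitive is exact, the streamed posterior coincides with the batch posterior on the concatenated data; with an approximate primitive (e.g. VB for a topic model) the two differ, which is the regime the paper studies.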

Preliminary Trials on the Effect of Lighting for the Population Growth of the Rotifer, Brachionus plicatilis

by Takao Yoshimatsu, Takahiro Higuchi, Yuji Hamasaki, Kenji Tanaka
"... The effect of lighting on rotifer culture water was investigated preliminarily, using freshwater Chlorella as food, in terms of vitamin B12 (VB12) production. The marine rotifer Brachionus plicatilis was batch-cultured for five days at 25°C in 25 psu seawater with or without lighting (L:D = 13:11 and ..."
Abstract

Online Bayesian Dictionary Learning for Large Datasets

by Lingbo Li, Jorge Silva, Mingyuan Zhou, Lawrence Carin
"... The problem of learning a data-adaptive dictionary for a very large collection of signals is addressed. This paper proposes a statistical model and associated variational Bayesian (VB) inference for simultaneously learning the dictionary and performing sparse coding of the signals. The model builds ..."
Abstract - Cited by 4 (2 self)

A Filtering Approach to Stochastic Variational Inference

by Neil M. T. Houlsby, David M. Blei
"... Stochastic variational inference (SVI) uses stochastic optimization to scale up Bayesian computation to massive data. We present an alternative perspective on SVI as approximate parallel coordinate ascent. SVI trades off bias and variance to step close to the unknown true coordinate optimum given by batch variational Bayes (VB). We define a model to automate this process. The model infers the location of the next VB optimum from a sequence of noisy realizations. As a consequence of this construction, we update the variational parameters using Bayes rule, rather than a hand-crafted optimization ..."
Abstract

Scalable Bayesian Non-Negative Tensor Factorization for Massive Count Data

by unknown authors
"... Abstract. We present a Bayesian non-negative tensor factorization model for count-valued tensor data, and develop scalable inference algorithms (both batch and online) for dealing with massive tensors. Our generative model can handle overdispersed counts as well as infer the rank of the decomposition ..."
Abstract


Regularized Variational Bayesian Learning of Echo State Networks with Delay&Sum Readout (Letter, communicated by Benjamin Schrauwen)

by Dmitriy Shutin, Christoph Zechner, Sanjeev R. Kulkarni, H. Vincent Poor
"... In this work, a variational Bayesian framework for efficient training of echo state networks (ESNs) with automatic regularization and delay&sum (D&S) readout adaptation is proposed. The algorithm uses classical batch learning of ESNs. By treating the network echo states as fixed basis functions ..."
Abstract

An Integrated Emissions Calculation and Data Management Tool for Nonroad Sources in Texas

by Rick Baker, Diane Preusse
"... An integrated emissions calculation and data management tool was developed for nonroad mobile sources in Texas. The Nonroad Analysis and Emissions Estimation System (NAEES) utilizes an enhanced GUI written in VB.NET, a modified version of NONROAD’s Access-based reporting utility, and a MySQL 5.0 data ..."
Abstract
Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University