Asymptotic properties of the maximum likelihood estimator in autoregressive models with Markov regime
Ann. Statist.
, 2004
Cited by 34 (6 self)
Abstract:
An autoregressive process with Markov regime is an autoregressive process for which the regression function at each time point is given by a nonobservable Markov chain. In this paper we consider the asymptotic properties of the maximum likelihood estimator in a possibly nonstationary process of this kind for which the hidden state space is compact but not necessarily finite. Consistency and asymptotic normality are shown to follow from uniform exponential forgetting of the initial distribution for the hidden Markov chain conditional on the observations.
Geoadditive Models
, 2000
Cited by 33 (1 self)
Abstract:
this paper is a recent article on model-based geostatistics by Diggle, Tawn and Moyeed (1998) where pure kriging (i.e. no covariates) is the focus. Our paper inherits some of its aspects: model-based and with mixed model connections. In particular the comment by Bowman (1998) in the ensuing discussion suggested that additive modelling would be a worthwhile extension. This paper essentially follows this suggestion. However, this paper is not the first to combine the notions of geostatistics and additive modelling. References known to us are Kelsall and Diggle (1998), Durban Reguera (1998) and Durban, Hackett, Currie and Newton (2000). Nevertheless, we believe that our approach has a number of attractive features (see (1)-(4) above), not all shared by these references. Section 2 describes the motivating application and data in detail. Section 3 shows how one can express additive models as a mixed model, while Section 4 does the same for kriging and merges the two into the geoadditive model. Issues concerning the amount of smoothing are discussed in Section 5 and inferential aspects are treated in Section 6. Our analysis of the Upper Cape Cod reproductive data is presented in Section 7. Section 8 discusses extension to the generalised context. We close the paper with some discussion in Section 9.
A survey of Monte Carlo algorithms for maximizing the likelihood of a two-stage hierarchical model
, 2001
Cited by 10 (4 self)
Abstract:
Likelihood inference with hierarchical models is often complicated by the fact that the likelihood function involves intractable integrals. Numerical integration (e.g. quadrature) is an option if the dimension of the integral is low but quickly becomes unreliable as the dimension grows. An alternative approach is to approximate the intractable integrals using Monte Carlo averages. Several different algorithms based on this idea have been proposed. In this paper we discuss the relative merits of simulated maximum likelihood, Monte Carlo EM, Monte Carlo Newton-Raphson and stochastic approximation. Key words and phrases: Efficiency, Monte Carlo EM, Monte Carlo Newton-Raphson, Rate of convergence, Simulated maximum likelihood, Stochastic approximation. All three authors partially supported by NSF Grant DMS-0072827.
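The Monte Carlo E-step idea surveyed above can be sketched on a deliberately simple two-stage model (the model, sample sizes, and variable names are illustrative assumptions, not the paper's examples): with y_i | z_i ~ N(z_i, 1) and z_i ~ N(theta, 1), the conditional z_i | y_i is Gaussian, so the intractable-integral step can be mimicked by averaging simulated draws.

```python
import numpy as np

# Toy two-stage hierarchical model (an illustrative assumption):
#   y_i | z_i ~ N(z_i, 1),   z_i ~ N(theta, 1)
# The complete-data log-likelihood is quadratic in theta, so the M-step is a
# sample mean; the E-step expectation is estimated by Monte Carlo averages.

rng = np.random.default_rng(0)
n = 200
true_theta = 1.5
z = rng.normal(true_theta, 1.0, n)
y = rng.normal(z, 1.0)

theta = 0.0          # starting value
m = 1000             # Monte Carlo sample size per observation
for _ in range(50):
    # E-step: z_i | y_i, theta ~ N((y_i + theta)/2, 1/2); draw m samples each
    post_mean = (y + theta) / 2.0
    draws = rng.normal(post_mean, np.sqrt(0.5), size=(m, n))
    # M-step: theta maximizes the MC estimate of Q, i.e. the overall mean
    theta = draws.mean()
```

Because the exact EM map here has fixed point theta = y.mean(), the Monte Carlo iterates should settle near that value, with jitter of order 1/sqrt(n*m); the surveyed algorithms differ mainly in how this simulation budget is spent.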
Hierarchical Models: A Current Computational Perspective
Journal of the American Statistical Association
, 2000
Cited by 9 (1 self)
Abstract:
Hierarchical models (HMs) provide a flexible framework for modeling data. The ongoing development of techniques like the EM algorithm and Markov chain Monte Carlo has enabled statisticians to make use of increasingly complicated HMs over the last few decades. In this article, we consider Bayesian and frequentist versions of a general, two-stage HM, and describe several examples from the literature that illustrate its versatility. Some key aspects of the computational techniques that are currently used in conjunction with this HM are then examined in the context of McCullagh and Nelder's (1989) salamander data. Several areas that are ripe for new research are identified.
Analysis of spatial data using generalized linear mixed models and Langevin-type Markov chain Monte Carlo
, 2000
Cited by 8 (3 self)
Abstract:
Markov chain Monte Carlo methods are useful in connection with inference and prediction for spatial generalized linear mixed models, where the unobserved random effects constitute a spatially correlated Gaussian random field. We point out that so-called Langevin-type updates are useful for Metropolis-Hastings simulation of the posterior distribution of the random effects given the data. Furthermore, we discuss the use of improper priors in Bayesian analysis of spatial generalized linear mixed models with particular emphasis on the so-called Poisson-log normal model. For this and certain other models nonparametric estimation of the covariance function of the Gaussian field is also studied. The methods are applied to various data sets including counts of weed plants on a field.
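A Langevin-type (MALA) Metropolis-Hastings update can be sketched for a single random effect in a Poisson-log normal model; the data value, step size, and run length below are illustrative assumptions, not the paper's implementation. The proposal drifts along the gradient of the log-posterior, and the asymmetric proposal density enters the acceptance ratio.

```python
import numpy as np

# Illustrative target: posterior of one random effect u in a Poisson-log
# normal model, y | u ~ Poisson(exp(u)), u ~ N(0, 1):
#   log pi(u) = y*u - exp(u) - u**2/2 + const
y = 3.0

def log_pi(u):
    return y * u - np.exp(u) - 0.5 * u ** 2

def grad_log_pi(u):
    return y - np.exp(u) - u

h = 0.5              # Langevin step size (a tuning assumption)

def log_q(b, a):
    # log density (up to a constant) of proposing b from a
    return -(b - a - 0.5 * h * grad_log_pi(a)) ** 2 / (2.0 * h)

rng = np.random.default_rng(1)
u, samples, accepted = 0.0, [], 0
for _ in range(5000):
    # Langevin proposal: gradient drift plus Gaussian noise
    prop = u + 0.5 * h * grad_log_pi(u) + np.sqrt(h) * rng.normal()
    # Metropolis-Hastings correction for the asymmetric proposal
    log_alpha = log_pi(prop) - log_pi(u) + log_q(u, prop) - log_q(prop, u)
    if np.log(rng.random()) < log_alpha:
        u, accepted = prop, accepted + 1
    samples.append(u)

acc_rate = accepted / 5000
post_mean = np.mean(samples[1000:])   # discard a burn-in portion
```

Compared with a random-walk proposal, the gradient drift pushes proposals toward high-posterior regions, which is the practical advantage of Langevin-type updates highlighted in the abstract.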
Quasi-Monte Carlo sampling to improve the efficiency of Monte Carlo EM
 Computational Statistics and Data Analysis
, 2005
Cited by 7 (3 self)
Abstract:
In this paper we investigate an efficient implementation of the Monte Carlo EM algorithm based on Quasi-Monte Carlo sampling. The Monte Carlo EM algorithm is a stochastic version of the deterministic EM (Expectation-Maximization) algorithm in which an intractable E-step is replaced by a Monte Carlo approximation. Quasi-Monte Carlo methods produce deterministic sequences of points that can significantly improve the accuracy of Monte Carlo approximations over purely random sampling. One drawback to deterministic Quasi-Monte Carlo methods is that it is generally difficult to determine the magnitude of the approximation error. However, in order to implement the Monte Carlo EM algorithm in an automated way, the ability to measure this error is fundamental. Recent developments of randomized Quasi-Monte Carlo methods can overcome this drawback. We investigate the implementation of an automated, data-driven Monte Carlo EM algorithm based on randomized Quasi-Monte Carlo methods. We apply this algorithm to a geostatistical model of online purchases and find that it can significantly decrease the total simulation effort, thus showing great potential for improving upon the efficiency of the classical Monte Carlo EM algorithm. Key words and phrases: Monte Carlo error; low-discrepancy sequence; Halton sequence; EM algorithm; geostatistical model.
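The randomized Quasi-Monte Carlo idea can be sketched in one dimension with a van der Corput sequence (the 1-D case of the Halton sequence mentioned above) plus a Cranley-Patterson random shift: each shift yields an unbiased estimate, and the spread across shifts gives the error measure that an automated MCEM needs. The integrand E[exp(Z)] for Z ~ N(0,1), with known value exp(1/2), is an illustrative stand-in for an intractable E-step integral.

```python
import math
import random
from statistics import NormalDist

def van_der_corput(i, base=2):
    """i-th point of the van der Corput low-discrepancy sequence."""
    x, f = 0.0, 1.0 / base
    while i > 0:
        x += (i % base) * f
        i //= base
        f /= base
    return x

def rqmc_estimate(g, n, shift):
    """Randomly shifted QMC estimate of E[g(Z)], Z ~ N(0, 1)."""
    inv = NormalDist().inv_cdf
    total = 0.0
    for i in range(1, n + 1):
        u = (van_der_corput(i) + shift) % 1.0   # shifted point in (0, 1)
        total += g(inv(u))
    return total / n

random.seed(42)
g = math.exp                      # E[exp(Z)] = exp(1/2), known in closed form
estimates = [rqmc_estimate(g, 1024, random.random()) for _ in range(10)]
est = sum(estimates) / len(estimates)
# The spread across independent random shifts is a data-driven estimate of
# the Monte Carlo error, unavailable for a purely deterministic QMC rule
err = (sum((e - est) ** 2 for e in estimates) / (10 * 9)) ** 0.5
```

With 1024 points per shift the estimate sits very close to exp(1/2) ≈ 1.6487, while a plain Monte Carlo average of the same size would typically be an order of magnitude less accurate.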
Simple Incorporation of Interactions Into Additive Models
, 2000
Cited by 7 (1 self)
Abstract:
This article presents penalized spline models that incorporate factor-by-curve interactions into additive models. A mixed model formulation for penalized splines allows for straightforward model fitting, smoothing parameter selection, and hypothesis testing. We illustrate the proposed model by applying it to ragweed pollen data in which seasonal trends vary by year.
Ascent-based Monte Carlo EM
, 2004
Cited by 6 (2 self)
Abstract:
The EM algorithm is a popular tool for maximizing likelihood functions in the presence of missing data. Unfortunately, EM often requires the evaluation of analytically intractable and high-dimensional integrals. The Monte Carlo EM (MCEM) algorithm is the natural extension of EM that employs Monte Carlo methods to estimate the relevant integrals. Typically, a very large Monte Carlo sample size is required to estimate these integrals within an acceptable tolerance when the algorithm is near convergence. Even if this sample size were known at the onset of implementation of MCEM, its use throughout all iterations is wasteful, especially when accurate starting values are not available. We propose a data-driven strategy for controlling Monte Carlo resources in MCEM. The proposed algorithm improves on similar existing methods by: (i) recovering EM’s ascent (i.e., likelihood-increasing) property with high probability, (ii) being more robust to the impact of user-defined inputs, and (iii) handling classical Monte Carlo and Markov chain Monte Carlo within a common framework. Because of (i) we refer to the algorithm as “Ascent-based MCEM”. We apply Ascent-based MCEM to a variety of examples, including one where it is used to dramatically accelerate the convergence of deterministic EM.
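The ascent-checking idea can be caricatured in a few lines: after each M-step, estimate the complete-data log-likelihood increase from the same Monte Carlo draws and enlarge the sample whenever a lower confidence bound fails to certify ascent. The model and the constants (95% bound, sample doubling, cap) are illustrative assumptions rather than the authors' exact rule.

```python
import numpy as np

# Toy testbed (an illustrative assumption, not the paper's example):
#   y_i | z_i ~ N(z_i, 1),  z_i ~ N(theta, 1),
# so the E-step conditional is z_i | y_i, theta ~ N((y_i + theta)/2, 1/2).
rng = np.random.default_rng(7)
n = 100
y = rng.normal(1.0, np.sqrt(2.0), n)     # marginally y_i ~ N(theta, 2)

theta, m = 0.0, 50                        # deliberately small starting MC size
for _ in range(15):
    post_mean = (y + theta) / 2.0
    draws = rng.normal(post_mean, np.sqrt(0.5), size=(m, n))  # E-step sample
    theta_new = draws.mean()                                  # M-step
    # Per-draw estimate of the Q-function increase Q(theta_new) - Q(theta);
    # certify ascent via a lower 95% confidence bound on its mean
    diff = 0.5 * ((draws - theta) ** 2 - (draws - theta_new) ** 2).sum(axis=1)
    lower = diff.mean() - 1.645 * diff.std(ddof=1) / np.sqrt(m)
    if lower < 0:
        m = min(2 * m, 20000)   # ascent not certified: enlarge the MC sample
    theta = theta_new
```

Early on, large parameter moves make ascent easy to certify with a small sample; near convergence the certified increase shrinks into the Monte Carlo noise, which is exactly when the rule escalates the simulation effort.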
Personalized Recommendation of User Comments via Factor Models
Cited by 5 (0 self)
Abstract:
In recent years, the amount of user-generated opinionated texts (e.g., reviews, user comments) continues to grow at a rapid speed: featured news stories on a major event easily attract thousands of user comments on a popular online news service. How to consume subjective information of this volume becomes an interesting and important research question. In contrast to previous work on review analysis that tried to filter or summarize information for a generic average user, we explore a different direction of enabling personalized recommendation of such information. For each user, our task is to rank the comments associated with a given article according to personalized user preference (i.e., whether the user is likely to like or dislike the comment). To this end, we propose a factor model that incorporates rater-comment and rater-author interactions simultaneously in a principled way. Our full model significantly outperforms strong baselines as well as related models that have been considered in previous work.