Results 1 – 5 of 5
A survey of Monte Carlo algorithms for maximizing the likelihood of a two-stage hierarchical model
, 2001
Abstract

Cited by 10 (4 self)
Likelihood inference with hierarchical models is often complicated by the fact that the likelihood function involves intractable integrals. Numerical integration (e.g. quadrature) is an option if the dimension of the integral is low but quickly becomes unreliable as the dimension grows. An alternative approach is to approximate the intractable integrals using Monte Carlo averages. Several different algorithms based on this idea have been proposed. In this paper we discuss the relative merits of simulated maximum likelihood, Monte Carlo EM, Monte Carlo Newton-Raphson and stochastic approximation. Key words and phrases: Efficiency, Monte Carlo EM, Monte Carlo Newton-Raphson, Rate of convergence, Simulated maximum likelihood, Stochastic approximation. All three authors partially supported by NSF Grant DMS-0072827.
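The Monte Carlo averaging idea this abstract describes can be sketched for a hypothetical two-stage Gaussian model; the model, function name, and parameters below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_marginal_likelihood(y, theta, m=20_000):
    """Monte Carlo average for an intractable marginal likelihood.

    Hypothetical two-stage model (illustrative only):
        latent  b ~ N(0, theta),  observed  y | b ~ N(b, 1),
    so L(theta) = E_b[ f(y | b) ], approximated by averaging over draws of b.
    """
    b = rng.normal(0.0, np.sqrt(theta), size=m)   # draws from f(b | theta)
    f_y_given_b = np.exp(-0.5 * (y - b) ** 2) / np.sqrt(2 * np.pi)
    return f_y_given_b.mean()
```

For this toy model the exact marginal is Gaussian, y ~ N(0, 1 + theta), which makes the quality of the Monte Carlo approximation easy to check; for realistic hierarchical models no such closed form exists, which is the situation the surveyed algorithms address.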
On the Convergence of the Monte Carlo Maximum Likelihood Method for Latent Variable Models
Abstract

Cited by 2 (0 self)
This paper studies the asymptotic performance of the MCML method (in the number of observations n) against the choice of θ and of the number of simulations s_n used in the importance sampling approximation. We provide sufficient conditions for the MCML estimator to converge to the true value of the parameter with n. Our results imply in particular that the initialization parameter θ must be a
unknown title
Abstract
While much used in practice, latent variable models raise challenging estimation problems related to the intractability of their likelihoods. Monte Carlo Maximum Likelihood (MCML) is a simulation-based approach to likelihood approximation that has been proposed for complex latent variable models for which deterministic optimization procedures such as the Expectation-Maximization approach are not applicable. It is based on an importance sampling identity for the likelihood ratio, where the importance function is the complete model density at a given parameter value θ. This paper studies the asymptotic performance of the MCML method (in the number of observations n) against the choice of θ and of the number of simulations s_n used
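The importance-sampling identity behind MCML, L(θ)/L(θ') = E_{θ'}[ f(y, b; θ) / f(y, b; θ') | y ], can be sketched for a toy Gaussian latent-variable model. Everything below (model, names, parameter values) is an illustrative assumption, not the paper's construction:

```python
import numpy as np

rng = np.random.default_rng(1)

def mcml_log_lik_ratio(y, theta, theta0, s=20_000):
    """Importance-sampling estimate of log L(theta) - log L(theta0).

    Toy latent model (illustrative only):
        b ~ N(0, 1),  y | b ~ N(b, theta).
    Draws b_i come from f(b | y; theta0), which is Gaussian here, and
    L(theta)/L(theta0) is the average of f(y, b_i; theta) / f(y, b_i; theta0).
    """
    post_mean = y / (1.0 + theta0)            # conditional of b given y at theta0
    post_var = theta0 / (1.0 + theta0)
    b = rng.normal(post_mean, np.sqrt(post_var), size=s)

    def log_joint(th):                        # log f(y, b; th), up to constants
        return -0.5 * b**2 - 0.5 * (y - b) ** 2 / th - 0.5 * np.log(th)

    w = log_joint(theta) - log_joint(theta0)  # log importance ratios
    return np.log(np.mean(np.exp(w)))
```

Maximizing this estimate over θ with θ' fixed is the MCML step; the point of the abstracts above is that the estimator's behavior hinges on how θ' and s_n are chosen.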
Higher Order Improvements for Approximate
, May 2010
Abstract
Many modern estimation methods in econometrics approximate an objective function, through simulation or discretization for instance. The resulting “approximate” estimator is often biased, and it always incurs an efficiency loss. Here we propose three methods to improve the properties of such approximate estimators at a low computational cost. The first two methods correct the objective function so as to remove the leading term of the bias due to the approximation. One variant provides an analytical bias adjustment, but it only works for estimators based on stochastic approximators, such as simulation-based estimators. Our second bias correction is based on ideas from the resampling literature; it eliminates the leading bias term for non-stochastic as well as stochastic approximators. Finally, we propose an iterative procedure where we use Newton-Raphson (NR) iterations based on a much finer degree of approximation. The NR step removes some or all of the additional bias and variance of the initial approximate estimator. A Monte Carlo simulation on the mixed logit model shows that noticeable improvements can be obtained rather cheaply.
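The refinement step this abstract describes, one Newton-Raphson update of a crude approximate estimate using derivatives from a finer approximation, can be illustrated as follows. The interface and the example objective are hypothetical, chosen only because a quadratic objective makes the effect of a single NR step transparent:

```python
import numpy as np

def nr_refine(theta0, score, hessian):
    """One Newton-Raphson update of a crude estimate theta0.

    score and hessian stand for the first and second derivatives of an
    objective computed under a finer approximation (hypothetical interface).
    """
    return theta0 - score(theta0) / hessian(theta0)

# Illustration: Gaussian log-likelihood with known unit variance.
# The objective is quadratic in theta, so a single NR step from any
# crude starting value lands exactly on the maximizer (the sample mean).
x = np.array([0.8, 1.3, 0.5, 1.1])
score = lambda t: np.sum(x - t)      # first derivative of the log-likelihood
hessian = lambda t: -float(len(x))   # second derivative (constant here)
theta_refined = nr_refine(0.0, score, hessian)
```

In the paper's setting the objective is not exactly quadratic and the derivatives are themselves simulated, so the NR step removes only part of the initial estimator's bias and variance rather than all of it.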
A Panel Data Stochastic Frontier Model with Autocorrelated Inefficiency
Abstract
Evolution of the economic efficiency of a firm over time takes place through a process where the level of efficiency at a particular point of time depends upon its past level of efficiency. Be it the firm’s ability to adopt the technology or to take the “right” decision under uncertainty regarding the scale of operation and expansion, the effect of the “past experience” on the