Results 1-10 of 303
Efficient Simulation from the Multivariate Normal and Student-t Distributions Subject to Linear Constraints and the Evaluation of Constraint Probabilities
, 1991
Abstract

Cited by 133 (8 self)
The construction and implementation of a Gibbs sampler for efficient simulation from the truncated multivariate normal and Student-t distributions is described. It is shown how assessments of the accuracy and convergence of integrals based on the Gibbs sample may be constructed, and how an estimate of the probability of the constraint set under the unrestricted distribution may be produced. Keywords: Bayesian inference; Gibbs sampler; Monte Carlo; multiple integration; truncated normal. This paper was prepared for presentation at the meeting Computing Science and Statistics: the Twenty-Third Symposium on the Interface, Seattle, April 22-24, 1991. Research assistance from Zhenyu Wang and financial support from National Science Foundation Grant SES-8908365 are gratefully acknowledged. The software for the examples may be requested by electronic mail, and will be returned by that medium. 1. Introduction. The generation of random samples from a truncated multivariate normal distribution, that is, a ...
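To illustrate the core idea the abstract describes, here is a minimal sketch (not the paper's software) of a Gibbs sampler for a standard bivariate normal with correlation `rho`, truncated to the set {x >= a, y >= b}. All names and parameter values are illustrative assumptions; the paper's method handles general linear constraints and the Student-t case. The key fact used below is that each full conditional is a univariate truncated normal, which can be drawn by the inverse-CDF method.

```python
import random
from statistics import NormalDist

def truncnorm_draw(mu, sigma, lower, rng):
    """Draw from N(mu, sigma^2) truncated to [lower, inf) by inverse-CDF."""
    d = NormalDist(mu, sigma)
    u = rng.uniform(d.cdf(lower), 1.0)
    return d.inv_cdf(min(u, 1.0 - 1e-12))  # guard the u == 1.0 edge case

def gibbs_truncated_bvn(rho, a, b, n_draws, burn_in=500, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho,
    truncated to the constraint set {x >= a, y >= b}. Illustrative sketch."""
    rng = random.Random(seed)
    s = (1.0 - rho * rho) ** 0.5       # conditional standard deviation
    x, y = max(a, 0.0), max(b, 0.0)    # any feasible starting point
    draws = []
    for i in range(burn_in + n_draws):
        x = truncnorm_draw(rho * y, s, a, rng)  # x | y is univariate truncated normal
        y = truncnorm_draw(rho * x, s, b, rng)  # y | x likewise
        if i >= burn_in:
            draws.append((x, y))
    return draws

draws = gibbs_truncated_bvn(rho=0.5, a=1.0, b=0.0, n_draws=2000)
```

Every retained draw respects the constraints by construction; estimating the probability of the constraint set under the unrestricted distribution, as the abstract mentions, would be a separate calculation layered on top of such a sampler.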
Markov Chain Monte Carlo Estimation of Exponential Random Graph Models
 Journal of Social Structure
, 2002
Abstract

Cited by 109 (16 self)
This paper is about estimating the parameters of the exponential random graph model, also known as the p* model, using frequentist Markov chain Monte Carlo (MCMC) methods. The exponential random graph model is simulated using Gibbs or Metropolis-Hastings sampling. The estimation procedures considered are based on the Robbins-Monro algorithm for approximating a solution to the likelihood equation.
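To make the simulation step concrete, here is a hedged sketch (not the paper's code) of Metropolis sampling for the simplest possible exponential random graph model, whose only sufficient statistic is the edge count, so P(g) is proportional to exp(theta * edges(g)). Each step proposes toggling one uniformly chosen dyad. Real ERGM samplers use richer statistics (triangles, stars), but the toggle-and-accept structure is the same.

```python
import math
import random

def simulate_ergm_edges(theta, n_nodes, n_steps, seed=0):
    """Metropolis sampler for a toy ERGM with edge count as the only statistic.
    Function name and parameterization are illustrative, not from the paper."""
    rng = random.Random(seed)
    dyads = [(i, j) for i in range(n_nodes) for j in range(i + 1, n_nodes)]
    edges = set()
    for _ in range(n_steps):
        d = rng.choice(dyads)
        delta = -1 if d in edges else 1   # change in the edge-count statistic
        # Metropolis acceptance probability min(1, exp(theta * delta))
        if rng.random() < math.exp(min(0.0, theta * delta)):
            if delta > 0:
                edges.add(d)
            else:
                edges.remove(d)
    return edges

g = simulate_ergm_edges(theta=0.0, n_nodes=10, n_steps=5000)
```

At theta = 0 each dyad is an independent fair coin flip, and a strongly negative theta drives the graph toward emptiness, which is the kind of behavior the estimation procedures in the paper exploit when matching simulated to observed statistics.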
An empirical framework for testing theories about complementarity in organizational design, NBER working paper 6600; download: http://www.nber.org/papers/w6600.pdf
, 1998
Abstract

Cited by 103 (6 self)
ABSTRACT: This paper studies alternative empirical strategies for estimating the effects of organizational design practices on performance, as well as the factors which determine organizational design, in a cross-section of firms. In particular, we propose an approach for estimating the parameters of an “organizational design production function.” Further, we identify consistent tests for two classes of hypotheses: first, that some sets of organizational design practices are mutually complementary; and second, that adoption patterns are consistent with static optimization of the organization’s profit. We develop an economic model where multiple organizational design practices are endogenously determined. The model includes exogenous variation in the costs and returns to each of the individual practices, which is the source of the heterogeneity among organizations. In many empirical applications, some of these variables will be unobserved to the econometrician. The model is used to evaluate how different econometric strategies can be interpreted under alternative assumptions about the economic and statistical environment. Of particular interest is a set of results which demonstrate that, under plausible hypotheses about the joint distribution of the unobservables, different reduced-form approaches used in the existing literature to test ...
An exact likelihood analysis of the multinomial probit model
, 1994
Abstract

Cited by 96 (4 self)
We develop new methods for conducting a finite sample, likelihood-based analysis of the multinomial probit model. Using a variant of the Gibbs sampler, an algorithm is developed to draw from the exact posterior of the multinomial probit model with correlated errors. This approach avoids direct evaluation of the likelihood and, thus, avoids the problems associated with calculating choice probabilities which affect both the standard likelihood and method of simulated moments approaches. Both simulated and actual consumer panel data are used to fit six-dimensional choice models. We also develop methods for analyzing random coefficient and multi-period probit models.
Mobility and the return to education: Testing a Roy Model with multiple markets
 ECONOMETRICA
, 2002
Abstract

Cited by 70 (0 self)
Self-selected migration presents one potential explanation for why observed returns to a college education in local labor markets vary widely even though U.S. workers are highly mobile. To assess the impact of self-selection on estimated returns, this paper first develops a Roy model of mobility and earnings where workers choose in which of the 50 states (plus the District of Columbia) to live and work. Available estimation methods are either infeasible for a selection model with so many alternatives or place potentially severe restrictions on earnings and the selection process. This paper develops an alternative econometric methodology which combines Lee's (1983) parametric maximum order statistic approach to reduce the dimensionality of the error terms with more recent work on semiparametric estimation of selection models (e.g., Ahn and Powell, 1993). The resulting semiparametric correction is easy to implement and can be adapted to a variety of other polychotomous choice problems. The empirical work, which uses 1990 U.S. Census data, confirms the role of comparative advantage in mobility decisions. The results suggest that self-selection of higher-educated individuals to states with higher returns to education generally leads to upward biases in OLS estimates of the returns to education in state-specific labor markets. While the estimated returns to a college education are significantly biased, correcting for the bias does not narrow the range of returns across states. Consistent with the finding that the corrected return to a college education differs across the U.S., the relative state-to-state migration flows of college- versus high school-educated individuals respond strongly to differences in the return to education and amenities across states.
Estimating macroeconomic models: a likelihood approach
, 2006
Abstract

Cited by 61 (21 self)
This paper shows how particle filtering facilitates likelihood-based inference in dynamic macroeconomic models. The economies can be nonlinear and/or non-normal. We describe how to use the output from the particle filter to estimate the structural parameters of the model, those characterizing preferences and technology, and to compare different economies. Both tasks can be implemented from either a classical or a Bayesian perspective. We illustrate the technique by estimating a business cycle model with investment-specific technological change, preference shocks, and stochastic volatility.
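The mechanics of evaluating a likelihood with a particle filter can be sketched on a toy state-space model. The bootstrap filter below, for an assumed linear-Gaussian model with state x_t = phi * x_{t-1} + sigma_x * e_t and observation y_t = x_t + sigma_y * v_t, is purely illustrative; the point of the paper is that the same propagate-weight-resample loop works when the model is nonlinear or non-normal.

```python
import math
import random

def particle_filter_loglik(ys, phi, sigma_x, sigma_y, n_particles=500, seed=0):
    """Bootstrap particle filter log-likelihood estimate for a toy model
    x_t = phi*x_{t-1} + sigma_x*e_t, y_t = x_t + sigma_y*v_t (standard
    normal shocks). Names and model are illustrative, not the paper's."""
    rng = random.Random(seed)
    # initialize particles from the state innovation distribution (illustrative)
    particles = [rng.gauss(0.0, sigma_x) for _ in range(n_particles)]
    loglik = 0.0
    for y in ys:
        # propagate each particle through the state equation
        particles = [phi * x + rng.gauss(0.0, sigma_x) for x in particles]
        # importance weights from the measurement density p(y | x)
        ws = [math.exp(-0.5 * ((y - x) / sigma_y) ** 2) / (sigma_y * math.sqrt(2 * math.pi))
              for x in particles]
        # the mean weight estimates p(y_t | y_{1:t-1})
        loglik += math.log(sum(ws) / n_particles)
        # multinomial resampling proportional to the weights
        particles = rng.choices(particles, weights=ws, k=n_particles)
    return loglik

ll = particle_filter_loglik([0.1, -0.2, 0.3], phi=0.9, sigma_x=0.5, sigma_y=0.5)
```

Maximizing this estimated log-likelihood over the structural parameters, or embedding it in an MCMC scheme, gives the classical and Bayesian uses the abstract mentions.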
Economic Choices
 American Economic Review
, 2001
Abstract

Cited by 59 (2 self)
...some detail more recent developments in the economic theory of choice, and modifications to this theory that are being forced by experimental evidence from cognitive psychology. I will close with a survey of statistical methods that have developed as part of the research program on economic choice behavior. Science is a cooperative enterprise, and my work on choice behavior reflects not only my own ideas, but the results of exchange and collaboration with many other scholars. First, of course, is my co-laureate James Heckman, who among his many contributions pioneered the important area of dynamic discrete choice analysis. Nine other individuals who played a major role in channeling microeconometrics and choice theory toward their modern forms, and had a particularly important influence on my own work, are Zvi Griliches, L.L. Thurstone, Jacob Marschak, Duncan Luce, Danny Kahneman, Amos Tversky, Moshe Ben-Akiva, Charles Manski, and Kenneth Train. A gallery of their p...
Testing When a Parameter Is on the Boundary of the Maintained Hypothesis
 Econometrica
, 2001
"... COWLES FOUNDATION DISCUSSION PAPER NO. 1229 ..."
Valuing New Goods in a Model with Complementarities: Online Newspapers,” working paper
, 2004
Abstract

Cited by 46 (4 self)
Many important economic questions hinge on the extent to which new goods either crowd out or complement consumption of existing products. Recent methods for studying new goods are based on demand models that rule out complementarity by assumption, so their applicability to these questions has been limited. I develop a new model that relaxes this restriction, and use it to study the specific case of competition between print and online newspapers. Using new micro data from the Washington DC market, I show that the major print and online papers appear to be strong complements in the raw data, but that this is an artifact of unobserved consumer heterogeneity. I estimate that the online paper reduced print readership by 27,000 per day, at a cost of $5.5 million per year in lost print profits. I find that online news has provided substantial welfare benefits to consumers and that charging positive online prices is unlikely to substantially increase firm profits. JEL classification: C25, L82