Results 1–10 of 26
Assessment and Propagation of Model Uncertainty
, 1995
Abstract

Cited by 120 (0 self)
this paper I discuss a Bayesian approach to solving this problem that has long been available in principle but is only now becoming routinely feasible, by virtue of recent computational advances, and examine its implementation in examples that involve forecasting the price of oil and estimating the chance of catastrophic failure of the U.S. Space Shuttle.
Forecast Combinations
 Handbook of Economic Forecasting
, 2006
Abstract

Cited by 53 (2 self)
Forecast combinations have frequently been found in empirical studies to produce better forecasts on average than methods based on the ex-ante best individual forecasting model. Moreover, simple combinations that ignore correlations between forecast errors often dominate more refined combination schemes aimed at estimating the theoretically optimal combination weights. In this chapter we analyze theoretically the factors that determine the advantages from combining forecasts (for example, the degree of correlation between forecast errors and the relative size of the individual models' forecast error variances). Although the reasons for the success of simple combination schemes are poorly understood, we discuss several possibilities related to model misspecification, instability (nonstationarities) and estimation error in situations where the number of models is large relative to the available sample size. We discuss the role of combinations under asymmetric loss and consider combinations of point, interval and probability forecasts. Key words: forecast combinations; pooling and trimming; shrinkage methods; model misspecification; diversification gains
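As a concrete illustration of the trade-off this chapter analyzes, the following sketch (a hypothetical two-model example, not taken from the chapter) estimates the variance-minimizing weights w* = Σ⁻¹ι / (ι'Σ⁻¹ι) from historical forecast errors and compares them to the simple equal-weight rule:

```python
import numpy as np

def optimal_weights(errors):
    """Estimate the variance-minimizing combination weights
    w* = Sigma^{-1} 1 / (1' Sigma^{-1} 1) from a matrix of
    historical forecast errors (rows = periods, cols = models)."""
    sigma = np.cov(errors, rowvar=False)   # estimated error covariance
    ones = np.ones(sigma.shape[0])
    w = np.linalg.solve(sigma, ones)
    return w / w.sum()

def combine(forecasts, weights):
    """Weighted combination of individual point forecasts."""
    return forecasts @ weights

# Two hypothetical forecasters with correlated errors
rng = np.random.default_rng(0)
errors = rng.multivariate_normal([0, 0], [[1.0, 0.3], [0.3, 2.0]], size=200)
w_opt = optimal_weights(errors)
w_eq = np.array([0.5, 0.5])
# With true error variances 1 and 2, the estimated optimal rule
# tilts toward the first model; the equal-weight rule ignores this.
```

Estimating Σ introduces the estimation error the abstract mentions: with few observations or many models, the estimated w* can be worse than the equal weights it is meant to beat.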
Predictive Ability with Cointegrated Variables
 Journal of Econometrics
, 2001
Abstract

Cited by 19 (6 self)
In this paper we outline conditions under which the Diebold and Mariano (DM: 1995) test for predictive ability can be extended to the case of two forecasting models, each of which may include cointegrating relations, when allowing for parameter estimation error. We show that in the cases where either the loss function is quadratic or the length of the prediction period, P, grows at a slower rate than the length of the regression period, R, the standard DM test can be used. On the other hand, in the case of a generic loss function, if P/R → π as T → ∞, with 0 < π < 1, then the asymptotic normality result of West (1996) no longer holds. We also extend the "data snooping" technique of White (2000) for comparing the predictive ability of multiple forecasting models to the case of cointegrated variables. In a series of Monte Carlo experiments, we examine the impact of both short-run and cointegrating vector parameter estimation error on DM, data snooping, and related tests. Our results sugge...
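The DM statistic itself is straightforward: the mean loss differential divided by an estimate of its long-run variance, compared against a standard normal. A minimal sketch under squared-error loss (illustrative code for the baseline test, not the paper's cointegration extension):

```python
import math
import numpy as np

def diebold_mariano(e1, e2, h=1):
    """DM statistic for equal predictive accuracy under squared-error
    loss. e1, e2: out-of-sample forecast errors of the two models.
    h: forecast horizon; the long-run variance sums h-1 autocovariances."""
    d = e1**2 - e2**2                      # loss differential series
    P = len(d)
    dbar = d.mean()
    gamma0 = np.mean((d - dbar) ** 2)
    lrv = gamma0
    for k in range(1, h):                  # rectangular HAC window
        lrv += 2 * np.mean((d[k:] - dbar) * (d[:-k] - dbar))
    dm = dbar / math.sqrt(lrv / P)
    p = math.erfc(abs(dm) / math.sqrt(2))  # two-sided normal p-value
    return dm, p

# Hypothetical errors: the first model is genuinely more accurate
rng = np.random.default_rng(1)
e_good = rng.normal(0, 1.0, 500)
e_bad = rng.normal(0, 1.5, 500)
dm, p = diebold_mariano(e_good, e_bad)
# dm is negative: the first model has the smaller squared-error loss
```

The paper's point is precisely when this normal comparison remains valid once estimated (possibly cointegrated) parameters generate the errors.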
Further Results on Bayesian Method of Moments Analysis of the Multiple Regression Model
 Internat. Econom. Rev
, 1998
Abstract

Cited by 12 (7 self)
The Bayesian Method of Moments (BMOM) was introduced in 1994 to permit investigators to make inverse probability statements regarding parameters' possible values given the data when the form of the likelihood function is unknown. BMOM has been applied in analyses of several statistical and econometric models including location, multiple and multivariate regression, and simultaneous equation models. In Zellner (1996, 1997a) and Zellner and Sacks (1996) some previous BMOM analyses of the multiple regression model have appeared that permit derivation of post-data densities for parameters and future observations to be calculated without use of a likelihood function, prior density, or Bayes' Theorem. In the present paper, we extend previous analyses by showing how information about a variance parameter and its relation to regression coefficients affects post-data densities. We also discuss estimation of functions of parameters and model selection techniques using BMOM and traditional Bayesi...
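As a rough illustration of what a BMOM analysis delivers, the sketch below computes the first two post-data moments of the regression coefficient vector: under Zellner-style moment assumptions the post-data mean is the least-squares value and the covariance is proportional to (X'X)⁻¹, with maximum entropy supplying a density having those moments. This is a simplified reading for intuition, not the paper's derivation:

```python
import numpy as np

def bmom_regression_moments(X, y):
    """First two post-data moments for beta in y = X beta + u under
    BMOM-style assumptions (post-data expectation of u orthogonal to X):
    the post-data mean equals the least-squares value and the covariance
    is proportional to (X'X)^{-1}. A maximum-entropy density then has
    exactly these moments. Sketch only; see Zellner (1996)."""
    XtX_inv = np.linalg.inv(X.T @ X)
    beta_mean = XtX_inv @ X.T @ y          # post-data mean of beta
    resid = y - X @ beta_mean
    n, k = X.shape
    s2 = resid @ resid / (n - k)           # residual variance estimate
    beta_cov = s2 * XtX_inv                # post-data covariance of beta
    return beta_mean, beta_cov

# Hypothetical data: intercept 1.0, slope 2.0
rng = np.random.default_rng(2)
X = np.column_stack([np.ones(100), rng.normal(size=100)])
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5, size=100)
mean, cov = bmom_regression_moments(X, y)
```

No likelihood, prior, or Bayes' Theorem appears above; the inverse-probability statements come from the moment conditions alone, which is the point of the approach.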
A Note on Aggregation, Disaggregation and Forecasting Performance
, 1999
Abstract

Cited by 9 (0 self)
this paper are first converted to real quantities by dividing each variable by a country-specific price index. The variables are then logged, first-differenced, and multiplied by 100 to convert to growth rates. Estimation results for the AR(3)LI model in equation (1) are presented in Table 3, and coefficient posterior means and standard deviations for the models in (2) and (3) with coefficients restricted to be equal across countries are presented in Table 4. On computing the roots of the AR(3) process for the countries' output growth rates from equation (2),
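The roots mentioned in the last sentence are those of the AR lag polynomial; a short sketch (with hypothetical coefficients, not the paper's estimates) shows the standard stationarity check:

```python
import numpy as np

def ar_roots(phi):
    """Roots of the AR lag polynomial 1 - phi_1 z - ... - phi_p z^p.
    The process is stationary when all roots lie outside the unit
    circle (equivalently, all inverse roots lie inside it)."""
    phi = np.asarray(phi, dtype=float)
    # numpy.roots expects coefficients from highest power to constant:
    # [-phi_p, ..., -phi_1, 1]
    coeffs = np.concatenate((-phi[::-1], [1.0]))
    return np.roots(coeffs)

# Hypothetical AR(3) coefficients for an output-growth series
phi = [0.5, 0.2, 0.1]
roots = ar_roots(phi)
stationary = np.all(np.abs(roots) > 1)
```

For growth-rate data the interest is whether the roots imply stationary fluctuations or near-unit-root persistence, which in turn shapes the model's turning-point behavior.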
Bayesian Method of Moments (BMOM) Analysis of Parametric and Semiparametric Regression Models
 South African Statistical Journal
, 1997
Abstract

Cited by 3 (0 self)
The Bayesian Method of Moments is applied to semiparametric regression models using alternative series expansions of an unknown regression function. We describe estimation loss functions, predictive loss functions and posterior odds as techniques to determine how many terms in a particular expansion to keep and how to choose among different types of expansions. The developed theory is then applied in a Monte Carlo experiment to data generated from a CES production function.
1 Introduction
In this paper, we take up the Bayesian Method of Moments (BMOM) analysis of parametric and semiparametric models. In previous work, Zellner (1994, 1995, 1996, 1997), Zellner and Sacks (1996), Tobias and Zellner (1997), Green and Strawderman (1996) and Currie (1996), the BMOM approach has been described and applied to parametric models. University of Chicago, University of Chicago, and Chung Ang University, respectively. Research financed in part by the National Science Foundation and by income ...
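The term-selection idea can be illustrated with a predictive (hold-out) loss criterion; the polynomial basis and data below are hypothetical stand-ins, not the paper's CES experiment or its posterior-odds machinery:

```python
import numpy as np

def fit_series(x, y, n_terms):
    """Least-squares fit of a polynomial series expansion
    f(x) ~ sum_j beta_j x^j with n_terms basis terms."""
    X = np.vander(x, n_terms, increasing=True)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

def predictive_loss(x_tr, y_tr, x_ho, y_ho, n_terms):
    """Hold-out squared-error loss for a given expansion length."""
    beta = fit_series(x_tr, y_tr, n_terms)
    pred = np.vander(x_ho, n_terms, increasing=True) @ beta
    return np.mean((y_ho - pred) ** 2)

# Hypothetical smooth regression function plus noise
rng = np.random.default_rng(3)
x = rng.uniform(-1, 1, 200)
y = np.sin(2 * x) + rng.normal(scale=0.1, size=200)
losses = {k: predictive_loss(x[:150], y[:150], x[150:], y[150:], k)
          for k in range(1, 8)}
best = min(losses, key=losses.get)  # expansion length with lowest loss
```

Too few terms underfit the unknown function, too many chase noise; the predictive loss criterion picks the length where hold-out error bottoms out, which is the role the estimation and predictive loss functions play in the paper.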
Cahill, “Adaptive combination of linear predictors for lossless image compression”
 IEE Proc. Sci. Meas. Technol
, 2000
Abstract

Cited by 2 (1 self)
Lossless image coding is an essential requirement for medical imaging applications. Lossless image compression techniques usually have two major components: adaptive prediction and adaptive entropy coding. This paper is concerned with adaptive prediction. Recently, several researchers have studied prediction schemes in which the final prediction is formed by a combination of a group of sub-predictors. In this paper, we present an overview of this new type of prediction technique. We show that the basic principle of adaptive predictor combination has been extensively studied and applied to many science and engineering problems. We then describe our combination scheme, which is based on the estimation of the local prediction error variance. Experimental results show that the compression performance of the algorithms that employ this new type of predictor is consistently better than that of state-of-the-art algorithms.
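The combination scheme described, weighting each sub-predictor by the inverse of its local prediction error variance, can be sketched as follows (hypothetical sub-predictors and error histories, not the paper's actual predictor set):

```python
import numpy as np

def combine_predictors(preds, recent_errors, eps=1e-6):
    """Combine sub-predictor outputs with weights inversely
    proportional to each predictor's local error variance,
    estimated from its recent squared prediction errors."""
    var = np.mean(np.square(recent_errors), axis=1) + eps
    w = (1.0 / var) / np.sum(1.0 / var)   # inverse-variance weights
    return float(np.dot(w, preds)), w

# Three hypothetical sub-predictors (e.g., neighboring-pixel rules)
preds = np.array([100.0, 104.0, 90.0])
recent = np.array([
    [1.0, -2.0, 1.0],    # predictor 0: small recent errors
    [2.0, 3.0, -2.0],    # predictor 1: moderate errors
    [10.0, -8.0, 9.0],   # predictor 2: large errors -> low weight
])
value, weights = combine_predictors(preds, recent)
```

Because the error histories are local, the weights adapt as the image content changes, which is what lets the combined predictor track edges and textures that defeat any single fixed linear predictor.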
Methodology for Bayesian Model Averaging: An Update
Abstract

Cited by 1 (0 self)
The standard practice of selecting a single model from some class of models and then making inferences based on this model ignores model uncertainty. Ignoring model uncertainty can impair predictive performance and lead to overstatement of the strength of evidence via p-values that are too small. Bayesian model averaging provides a coherent approach for accounting for model uncertainty. A variety of methods for implementing Bayesian model averaging have been developed. A brief overview of Bayesian model averaging is provided and recently developed methodology to perform Bayesian model averaging in specific model classes is described. Literature references as well as software descriptions and relevant webpage addresses are provided.
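One widely used implementation shortcut, approximating posterior model probabilities with BIC, gives the flavor of BMA in a few lines (an approximation under equal prior model odds, not the full methodology surveyed here):

```python
import math

def bic_weights(bics):
    """Approximate posterior model probabilities from BIC values:
    p(M_k | data) proportional to exp(-BIC_k / 2), a common
    large-sample approximation under equal prior model odds."""
    m = min(bics)                          # subtract min for stability
    raw = [math.exp(-(b - m) / 2) for b in bics]
    s = sum(raw)
    return [r / s for r in raw]

def bma_prediction(preds, bics):
    """Model-averaged point prediction: posterior-weighted mean
    of the individual models' predictions."""
    w = bic_weights(bics)
    return sum(wi * pi for wi, pi in zip(w, preds))

# Three hypothetical models' BICs and point predictions
bics = [210.0, 212.5, 230.0]
preds = [1.0, 1.4, 3.0]
weights = bic_weights(bics)
averaged = bma_prediction(preds, bics)
```

The averaged prediction stays close to the best models but never commits fully to one of them, which is exactly the hedge against model uncertainty the abstract advocates.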
Bayesian and NonBayesian Approaches to Scientific Modeling and Inference in Economics and Econometrics
Abstract

Cited by 1 (0 self)
After brief remarks on the history of modeling and inference techniques in economics and econometrics, attention is focused on the emergence of economic science in the 20th century. First, the broad objectives of science and the Pearson-Jeffreys' "unity of science" principle will be reviewed. Second, key Bayesian and non-Bayesian practical scientific inference and decision methods will be compared using applied examples from economics, econometrics and business. Third, issues and controversies on how to model the behavior of economic units and systems will be reviewed and the structural econometric modeling, time series analysis (SEMTSA) approach will be described and illustrated using a macroeconomic modeling and forecasting problem involving analyses of data for 18 industrialized countries over the years since the 1950s. Point and turning point forecasting results and their implications for macroeconomic modeling of economies will be summarized. Last, a few remarks will be made ...
Bayesian Method of Moments Analysis of Time Series Models with an Application to Forecasting Turning Points in Output Growth Rates
, 1998
Abstract

Cited by 1 (1 self)
Bayesian method of moments (BMOM) analyses of central time series models are presented. These include derivations of post-data densities for parameters, predictive densities for future observations and relative expected losses associated with alternative model specifications, e.g. a unit-root versus a non-unit-root AR(1) process, or an AR(1) versus higher-order AR processes. BMOM results are compared with those provided by traditional Bayesian and non-Bayesian approaches. An application to forecasting turning points in 18 countries' annual output growth rates, 1980–1995, is provided using several variants of an autoregressive leading indicator model. Optimal forecasts include not only forecasts of dichotomous outcomes, e.g. downturn or no downturn, as in previous work, but also trichotomous outcomes, e.g., minor downturn, major downturn or no downturn, or minor upturn, major upturn or no upturn. Empirical results indicate that about 70 percent of dichotomous outcomes are forecasted...
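A dichotomous or trichotomous turning-point forecast reduces to integrating a predictive density over outcome regions. The sketch below uses a normal predictive density for next period's growth rate and illustrative cutoffs (these are stand-in definitions, not Zellner's):

```python
import math

def turning_point_probs(pred_mean, pred_sd, current, band=0.5):
    """Trichotomous turning-point probabilities from a normal predictive
    density for next period's growth rate: 'downturn' (below the current
    rate minus a band), 'no_turn' (within the band), 'upturn' (above the
    current rate plus the band). Band and cutoffs are illustrative."""
    def cdf(z):  # standard normal CDF via the complementary error function
        return 0.5 * math.erfc(-z / math.sqrt(2))
    lo = cdf((current - band - pred_mean) / pred_sd)
    hi = 1 - cdf((current + band - pred_mean) / pred_sd)
    return {"downturn": lo, "no_turn": 1 - lo - hi, "upturn": hi}

# Hypothetical predictive density: mean growth 1.2%, sd 1.0, current 2.0%
probs = turning_point_probs(pred_mean=1.2, pred_sd=1.0, current=2.0)
forecast = max(probs, key=probs.get)  # report the most probable outcome
```

Reporting the highest-probability category corresponds to a symmetric loss structure; with asymmetric losses over the outcomes, the optimal forecast can differ from the modal one.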