Results 1–10 of 28
Forecast Combinations
 HANDBOOK OF ECONOMIC FORECASTING
, 2006
"... Forecast combinations have frequently been found in empirical studies to produce better forecasts on average than methods based on the exante best individual forecasting model. Moreover, simple combinations that ignore correlations between forecast errors often dominate more refined combination sch ..."
Abstract

Cited by 90 (2 self)
Forecast combinations have frequently been found in empirical studies to produce better forecasts on average than methods based on the ex-ante best individual forecasting model. Moreover, simple combinations that ignore correlations between forecast errors often dominate more refined combination schemes aimed at estimating the theoretically optimal combination weights. In this chapter we analyze theoretically the factors that determine the advantages from combining forecasts (for example, the degree of correlation between forecast errors and the relative size of the individual models’ forecast error variances). Although the reasons for the success of simple combination schemes are poorly understood, we discuss several possibilities related to model misspecification, instability (nonstationarities) and estimation error in situations where the number of models is large relative to the available sample size. We discuss the role of combinations under asymmetric loss and consider combinations of point, interval and probability forecasts.
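The variance argument behind this abstract can be sketched numerically. The sketch below is illustrative and not from the chapter: two hypothetical forecasts with correlated errors (all variances and the correlation are assumed values) are averaged with equal weights, which requires no weight estimation at all.

```python
import numpy as np

# Illustrative sketch (not from the chapter): average two hypothetical
# forecasts whose errors are correlated, using equal weights that
# require no estimation. All variances and the correlation are assumed.
rng = np.random.default_rng(0)
T = 5000
target = rng.normal(size=T)

# Error covariance: variances 1.0 and 1.44, covariance 0.36 (rho = 0.3)
errs = rng.multivariate_normal([0.0, 0.0],
                               [[1.00, 0.36], [0.36, 1.44]], size=T)
f1 = target + errs[:, 0]          # ex-ante better individual model
f2 = target + errs[:, 1]          # noisier individual model

combo = 0.5 * (f1 + f2)           # simple equal-weight combination

def mse(f):
    return float(np.mean((f - target) ** 2))

mse_f1, mse_f2, mse_combo = mse(f1), mse(f2), mse(combo)
```

With these assumed values the combined error variance is 0.25 (1.00 + 1.44 + 2 × 0.36) ≈ 0.79, below both individual variances, which is the kind of gain the chapter analyzes.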
Sequential procedures for aggregating arbitrary estimators of a conditional mean
, 2005
"... In this paper we describe and analyze a sequential procedure for aggregating linear combinations of a finite family of regression estimates, with particular attention to linear combinations having coefficients in the generalized simplex. The procedure is based on exponential weighting, and has a com ..."
Abstract

Cited by 34 (1 self)
In this paper we describe and analyze a sequential procedure for aggregating linear combinations of a finite family of regression estimates, with particular attention to linear combinations having coefficients in the generalized simplex. The procedure is based on exponential weighting, and has a computationally tractable approximation. Analysis of the procedure is based in part on techniques from the sequential prediction of nonrandom sequences. Here these techniques are applied in a stochastic setting to obtain cumulative loss bounds for the aggregation procedure. From the cumulative loss bounds we derive an oracle inequality for the aggregate estimator for an unbounded response having a suitable moment generating function. The inequality shows that the risk of the aggregate estimator is less than the risk of the best candidate linear combination in the generalized simplex, plus a complexity term that depends on the size of the coefficient set. The inequality readily yields convergence rates for aggregation over the unit simplex that are within logarithmic factors of known minimax bounds. Some preliminary results on model selection are also presented.
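A minimal sketch of the exponential-weighting idea follows; the learning rate eta, the squared-error loss, and the three fixed candidate predictors are assumptions for illustration, and the paper's actual coefficient set and loss bounds are not reproduced here.

```python
import numpy as np

# Hedged sketch of exponential weighting over a finite family of
# candidate predictors; eta and the candidates are assumptions.
rng = np.random.default_rng(1)
T, eta = 300, 0.5

x = rng.uniform(-1.0, 1.0, size=T)
y = 2.0 * x + rng.normal(scale=0.2, size=T)   # true conditional mean: 2x

candidates = [lambda u: 0.0 * u,              # constant-zero predictor
              lambda u: u,                    # identity predictor
              lambda u: 2.0 * u]              # well-specified predictor
cum_loss = np.zeros(3)
agg_loss = 0.0

for t in range(T):
    w = np.exp(-eta * cum_loss)               # exponential weights from
    w /= w.sum()                              # cumulative past losses
    preds = np.array([c(x[t]) for c in candidates])
    agg = float(w @ preds)                    # aggregated prediction
    agg_loss += (agg - y[t]) ** 2
    cum_loss += (preds - y[t]) ** 2
```

The aggregate's cumulative loss tracks the best candidate's cumulative loss plus a complexity term, which is the shape of the oracle inequality the abstract describes.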
Regression with Multiple Candidate Models: Selecting or Mixing?
 STATISTICA SINICA
, 1999
"... Model averaging provides an alternative to model selection. An algorithm ARM rooted in information theory is proposed to combine different regression models/methods. A simulation is conducted in the context of linear regression to compare its performance with familiar model selection criteria AIC ..."
Abstract

Cited by 33 (9 self)
Model averaging provides an alternative to model selection. An algorithm, ARM, rooted in information theory is proposed to combine different regression models/methods. A simulation is conducted in the context of linear regression to compare its performance with the familiar model selection criteria AIC and BIC, and also with some Bayesian model averaging (BMA) methods. The simulation suggests …
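The selection-versus-averaging contrast can be illustrated generically with ordinary AIC selection versus Akaike-weight averaging; this is a stand-in for exposition, not the ARM algorithm itself, and the data-generating model and design point are assumptions.

```python
import numpy as np

# Generic illustration, not ARM: AIC selection keeps one linear model,
# while Akaike-weight averaging mixes the candidates' predictions.
rng = np.random.default_rng(4)
n = 80
x = rng.normal(size=n)
y = 1.0 + 0.5 * x + rng.normal(size=n)        # assumed data-generating model

def fit(X):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    aic = n * np.log(rss / n) + 2 * X.shape[1]
    return beta, aic

X_small = np.ones((n, 1))                     # intercept only
X_big = np.column_stack([np.ones(n), x])      # intercept + slope
(b_small, aic_small), (b_big, aic_big) = fit(X_small), fit(X_big)

# Selection: commit to the single lowest-AIC model
selected = "big" if aic_big < aic_small else "small"

# Averaging: mix both models' predictions with Akaike weights
aics = np.array([aic_small, aic_big])
w = np.exp(-0.5 * (aics - aics.min()))
w /= w.sum()
x_new = np.array([1.0, 0.7])                  # hypothetical new point
pred_avg = float(w[0] * (x_new[:1] @ b_small) + w[1] * (x_new @ b_big))
```

When the evidence strongly favors one model, the averaged prediction collapses toward the selected one; when the criterion values are close, averaging hedges across both, which is where selection instability hurts.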
Bagging Binary and Quantile Predictors for Time Series
 mimeo
, 2004
"... Bootstrap aggregating or Bagging, introduced by Breiman (1996a), has been proved to be effective to improve on unstable forecast. Theoretical and empirical works using classification, regression trees, variable selection in linear and nonlinear regression have shown that bagging can generate substa ..."
Abstract

Cited by 17 (5 self)
Bootstrap aggregating, or bagging, introduced by Breiman (1996a), has proven effective at improving unstable forecasts. Theoretical and empirical work using classification and regression trees, and variable selection in linear and nonlinear regression, has shown that bagging can generate substantial prediction gains. However, most of the existing literature on bagging has been limited to cross-sectional settings with symmetric cost functions. In this paper, we extend the application of bagging to time series settings with asymmetric cost functions, particularly for predicting signs and quantiles. We use quantile predictions to construct a binary predictor and the majority-voted bagging binary prediction. We show that bagging may improve the binary prediction in small samples, but it does not improve in large samples. Various bagging forecast combination weights are used, such as equal-weighted and Bayesian model averaging (BMA) weighted combinations. For demonstration, we present results from Monte Carlo experiments and from empirical applications using monthly S&P500 and NASDAQ stock index returns.
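A toy sketch of the bagging-by-majority-vote idea for a binary (sign) prediction follows; the crude threshold base learner and the data-generating process are assumptions for demonstration, not the authors' quantile-based construction.

```python
import numpy as np

# Toy sketch: bag a crude threshold classifier by majority vote.
# The base learner and the data are illustrative assumptions.
rng = np.random.default_rng(2)
n, B = 200, 51                                # B bootstrap replicates (odd)

x = rng.normal(size=n)
y = (x + rng.normal(size=n) > 0).astype(int)  # noisy sign to predict

def fit_threshold(xs, ys):
    # Unstable base learner: split at the midpoint of the class means
    m1 = xs[ys == 1].mean() if (ys == 1).any() else 0.0
    m0 = xs[ys == 0].mean() if (ys == 0).any() else 0.0
    return 0.5 * (m0 + m1)

votes = np.zeros(n)
for _ in range(B):
    idx = rng.integers(0, n, size=n)          # bootstrap resample
    thr = fit_threshold(x[idx], y[idx])
    votes += (x > thr).astype(int)

bagged = (votes > B / 2).astype(int)          # majority-voted prediction
accuracy = float((bagged == y).mean())
```

Each bootstrap replicate perturbs the unstable threshold; the majority vote smooths over that instability, which is the mechanism behind the prediction gains the abstract discusses.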
Combining Time Series Models for Forecasting
, 2002
"... Statistical models (e.g., ARIMA models) have been commonly used in time series data analysis and forecasting. Typically one model is selected based on a selection criterion (e.g., AIC), hypothesis testing, and/or graphical inspections. The selected model is then used to forecast future values. Howev ..."
Abstract

Cited by 14 (0 self)
Statistical models (e.g., ARIMA models) have been commonly used in time series data analysis and forecasting. Typically one model is selected based on a selection criterion (e.g., AIC), hypothesis testing, and/or graphical inspection. The selected model is then used to forecast future values. However, model selection is often unstable and may cause unnecessarily high variability in the final estimation/prediction. In this work, we propose using the algorithm AFTER to convexly combine the models for better predictive performance. The weights are sequentially updated after each additional observation. Simulations and real data examples are used to compare the performance of our approach with model selection methods. The results show the advantage of combining with AFTER over selection in terms of forecasting accuracy in several settings.
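An AFTER-style sequential update can be sketched as follows; the two candidate forecasters, the AR(1) data, and the variance constant in the weight update are illustrative assumptions, not the algorithm exactly as specified in the paper.

```python
import numpy as np

# Sketch of sequentially updated convex combination weights in the
# spirit of AFTER; candidates and constants are assumptions.
rng = np.random.default_rng(3)
T = 400
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.8 * y[t - 1] + rng.normal(scale=0.5)   # AR(1) series

w = np.array([0.5, 0.5])                      # start from equal weights
sq_loss = np.zeros(2)
combined_loss = 0.0
for t in range(1, T):
    preds = np.array([0.8 * y[t - 1],         # well-specified AR(1) model
                      y[t - 1]])              # random-walk forecast
    yhat = float(w @ preds)                   # convex combination
    combined_loss += (yhat - y[t]) ** 2
    sq_loss += (preds - y[t]) ** 2
    # Update weights after each observation from cumulative performance
    w = np.exp(-(sq_loss - sq_loss.min()) / (2 * 0.25))
    w /= w.sum()
```

Because the weights stay convex and react to each new observation, the combination avoids the all-or-nothing commitment of selection, which is the instability argument made above.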
Time series forecasting for dynamic environments: the DyFor genetic program model
 Proceedings of the International Seminar on Soft Computing and Intelligent Systems (WISIS’04)
"... Several studies have applied genetic programming (GP) to the task of forecasting with favorable results. However, these studies, like those applying other techniques, have assumed a static environment, making them unsuitable for many realworld time series which are generated by varying processes. T ..."
Abstract

Cited by 11 (2 self)
Several studies have applied genetic programming (GP) to the task of forecasting with favorable results. However, these studies, like those applying other techniques, have assumed a static environment, making them unsuitable for many real-world time series which are generated by varying processes. This study investigates the development of a new “dynamic” GP model that is specifically tailored for forecasting in non-static environments. This Dynamic Forecasting Genetic Program (DyFor GP) model incorporates features that allow it to adapt to changing environments automatically as well as retain knowledge learned from previously encountered environments. The DyFor GP model is tested for forecasting efficacy on both simulated and real time series, including the U.S. Gross Domestic Product and Consumer Price Index Inflation. Results show that the performance of the DyFor GP model improves upon that of benchmark models for all experiments. These findings highlight the DyFor GP’s potential as an adaptive, nonlinear model for real-world forecasting applications and suggest further investigation.
INFLATION FORECASTS, MONETARY POLICY AND UNEMPLOYMENT DYNAMICS: EVIDENCE FROM THE US AND THE EURO AREA
, 2007
"... In 2007 all ECB publications feature a motif taken from the €20 banknote. ..."
Abstract

Cited by 11 (3 self)
In 2007 all ECB publications feature a motif taken from the €20 banknote.
VAR Model Averaging for Multi-Step Forecasting
, 2007
"... An electronic version of the paper may be downloaded from the Ifo website www.ifo.de. ..."
Abstract

Cited by 10 (0 self)
An electronic version of the paper may be downloaded from the Ifo website www.ifo.de.
The distribution of model averaging estimators and an impossibility result regarding its estimation
 IMS Lecture Notes–Monograph Series 52
, 2006
"... Abstract: The
nitesample as well as the asymptotic distribution of Leung and Barrons (2006) model averaging estimator are derived in the context of a linear regression model. An impossibility result regarding the estimation of the nitesample distribution of the model averaging estimator is obtain ..."
Abstract

Cited by 7 (3 self)
The finite-sample as well as the asymptotic distribution of Leung and Barron’s (2006) model averaging estimator are derived in the context of a linear regression model. An impossibility result regarding the estimation of the finite-sample distribution of the model averaging estimator is obtained.
Online Forecast Combination for Dependent Heterogeneous Data
, 2007
"... This paper studies a procedure to combine individual forecasts that achieve theoretical optimal performance. The results apply to a wide variety of loss functions and no conditions are imposed on the data sequences and the individual forecasts apart from a tail condition. The theoretical results sho ..."
Abstract

Cited by 6 (1 self)
This paper studies a procedure to combine individual forecasts that achieves theoretically optimal performance. The results apply to a wide variety of loss functions, and no conditions are imposed on the data sequences and the individual forecasts apart from a tail condition. The theoretical results show that the bounds are also valid in the case of time-varying combination weights, under specific conditions on the nature of the time variation. Some experimental evidence confirming the results is provided.
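One concrete way to realize time-varying combination weights is a fixed-share style update; the two constant experts, the learning rate, and the share parameter below are illustrative assumptions, not the paper's procedure or its conditions.

```python
import numpy as np

# Illustrative fixed-share style combination: exponential updates plus a
# small uniform share so weights can track a regime change. All
# parameters and both constant experts are assumptions.
rng = np.random.default_rng(5)
T, eta, alpha = 600, 1.0, 0.05

# The better expert switches halfway through the sample
y = np.concatenate([np.full(300, 1.0), np.full(300, -1.0)])
y = y + rng.normal(scale=0.1, size=T)
experts = np.tile([1.0, -1.0], (T, 1))        # two constant forecasters

w = np.array([0.5, 0.5])
loss = 0.0
for t in range(T):
    yhat = float(w @ experts[t])
    loss += (yhat - y[t]) ** 2
    step_loss = (experts[t] - y[t]) ** 2
    w = w * np.exp(-eta * step_loss)          # exponential update
    w /= w.sum()
    w = (1 - alpha) * w + alpha / 2           # fixed share across experts
avg_loss = loss / T
```

The uniform share keeps some weight on every expert, so the combination re-adapts quickly after the regime change instead of staying locked onto the formerly best forecaster.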