Results 1–10 of 38
Strictly Proper Scoring Rules, Prediction, and Estimation
, 2007
Abstract
Cited by 143 (17 self)
Scoring rules assess the quality of probabilistic forecasts, by assigning a numerical score based on the predictive distribution and on the event or value that materializes. A scoring rule is proper if the forecaster maximizes the expected score for an observation drawn from the distribution F if he or she issues the probabilistic forecast F, rather than G ≠ F. It is strictly proper if the maximum is unique. In prediction problems, proper scoring rules encourage the forecaster to make careful assessments and to be honest. In estimation problems, strictly proper scoring rules provide attractive loss and utility functions that can be tailored to the problem at hand. This article reviews and develops the theory of proper scoring rules on general probability spaces, and proposes and discusses examples thereof. Proper scoring rules derive from convex functions and relate to information measures, entropy functions, and Bregman divergences. In the case of categorical variables, we prove a rigorous version of the Savage representation. Examples of scoring rules for probabilistic forecasts in the form of predictive densities include the logarithmic, spherical, pseudospherical, and quadratic scores. The continuous ranked probability score applies to probabilistic forecasts that take the form of predictive cumulative distribution functions. It generalizes the absolute error and forms a special case of a new and very general type of score, the energy score. Like many other scoring rules, the energy score admits a kernel representation in terms of negative definite functions, with links to inequalities of Hoeffding type, in both univariate and multivariate settings. Proper scoring rules for quantile and interval forecasts are also discussed. We relate proper scoring rules to Bayes factors and to cross-validation, and propose a novel form of cross-validation known as random-fold cross-validation.
A case study on probabilistic weather forecasts in the North American Pacific Northwest illustrates the importance of propriety. We note optimum score approaches to point and quantile
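Two of the scores named in this abstract have simple closed forms for a Gaussian predictive distribution. The sketch below is a reader's illustration, not code from the paper: the logarithmic score is written as the negative log predictive density (so smaller is better under this sign convention), and the CRPS for N(mu, sigma^2) uses the standard closed-form expression.

```python
import math

def log_score(mu, sigma, y):
    """Logarithmic score, oriented here as negative log predictive
    density of N(mu, sigma^2) at y, so smaller is better."""
    z = (y - mu) / sigma
    return 0.5 * math.log(2 * math.pi * sigma ** 2) + 0.5 * z * z

def crps_normal(mu, sigma, y):
    """Closed-form continuous ranked probability score for a Gaussian
    predictive distribution N(mu, sigma^2); generalizes absolute error."""
    z = (y - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return sigma * (z * (2 * cdf - 1) + 2 * pdf - 1 / math.sqrt(math.pi))
```

As sigma shrinks toward zero, crps_normal approaches the absolute error |y - mu|, which is the generalization the abstract refers to.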
Forecasting Time Series Subject to Multiple Structural Breaks
, 2004
Abstract
Cited by 50 (8 self)
This paper provides a novel approach to forecasting time series subject to discrete structural breaks. We propose a Bayesian estimation and prediction procedure that allows for the possibility of new breaks over the forecast horizon, taking account of the size and duration of past breaks (if any) by means of a hierarchical hidden Markov chain model. Predictions are formed by integrating over the hyperparameters from the meta-distributions that characterize the stochastic break point process. In an application to US Treasury bill rates, we find that the method leads to better out-of-sample forecasts than alternative methods that ignore breaks, particularly at long horizons.
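The idea of integrating over a meta-distribution for the break process can be caricatured in a few lines. In the following toy simulator every distribution and parameter is an illustrative assumption, not the paper's hierarchical hidden Markov chain model: a per-path break probability is drawn from a Beta hyper-distribution, and the regime mean jumps whenever a break occurs over the forecast horizon.

```python
import random

def simulate_paths(mu0, horizon, n_paths, a=1.0, b=20.0,
                   mu_scale=1.0, sigma=1.0, seed=0):
    """Toy forecast under possible future breaks: for each path, draw a
    break probability from a Beta(a, b) meta-distribution; at a break,
    draw a new regime mean. Illustrative parameters only."""
    rng = random.Random(seed)
    paths = []
    for _ in range(n_paths):
        p_break = rng.betavariate(a, b)      # hyperparameter draw
        mu, path = mu0, []
        for _ in range(horizon):
            if rng.random() < p_break:       # a new break occurs
                mu = rng.gauss(mu0, mu_scale)
            path.append(rng.gauss(mu, sigma))
        paths.append(path)
    return paths
```

Averaging a quantity of interest over the simulated paths corresponds, in this caricature, to integrating the predictive distribution over the break-process hyperparameters.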
Probabilistic forecasts, calibration and sharpness
 Journal of the Royal Statistical Society Series B
, 2007
Abstract
Cited by 38 (15 self)
Summary. Probabilistic forecasts of continuous variables take the form of predictive densities or predictive cumulative distribution functions. We propose a diagnostic approach to the evaluation of predictive performance that is based on the paradigm of maximizing the sharpness of the predictive distributions subject to calibration. Calibration refers to the statistical consistency between the distributional forecasts and the observations and is a joint property of the predictions and the events that materialize. Sharpness refers to the concentration of the predictive distributions and is a property of the forecasts only. A simple theoretical framework allows us to distinguish between probabilistic calibration, exceedance calibration and marginal calibration. We propose and study tools for checking calibration and sharpness, among them the probability integral transform histogram, marginal calibration plots, the sharpness diagram and proper scoring rules. The diagnostic approach is illustrated by an assessment and ranking of probabilistic forecasts of wind speed at the Stateline wind energy centre in the US Pacific Northwest. In combination with cross-validation or in the time series context, our proposal provides very general, nonparametric alternatives to the use of information criteria for model diagnostics and model selection.
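The probability integral transform histogram mentioned in the abstract is straightforward to sketch. The toy implementation below is a reader's illustration, not the authors' code: it bins the PIT values p_t = F_t(y_t), which should be approximately uniform when the forecasts are probabilistically calibrated.

```python
import math

def normal_cdf(mu, sigma):
    """CDF of N(mu, sigma^2) as a callable, via the error function."""
    return lambda y: 0.5 * (1 + math.erf((y - mu) / (sigma * math.sqrt(2))))

def pit_values(forecast_cdfs, observations):
    """Probability integral transform p_t = F_t(y_t); approximately
    uniform on [0, 1] under probabilistic calibration."""
    return [F(y) for F, y in zip(forecast_cdfs, observations)]

def pit_histogram(pits, bins=10):
    """Bin counts for the PIT histogram: U shapes suggest underdispersed
    forecasts, hump shapes overdispersed ones."""
    counts = [0] * bins
    for p in pits:
        counts[min(int(p * bins), bins - 1)] += 1
    return counts
```

In practice one would compare the histogram against the flat shape expected under uniformity, alongside sharpness diagrams and proper scores.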
Macroeconomic dynamics and credit risk: A global perspective
 Journal of Money Credit and Banking
, 2006
Abstract
Cited by 37 (8 self)
We develop a framework for modeling conditional loss distributions through the introduction of risk factor dynamics. Asset value changes of a credit portfolio are linked to a dynamic global macroeconometric model, allowing macro effects to be isolated from idiosyncratic shocks from the perspective of default (and hence loss). Default probabilities are driven primarily by how firms are tied to business cycles, both domestic and foreign, and how business cycles are linked across countries. The model is able to control for firm-specific heterogeneity as well as generate multi-period forecasts of the entire loss distribution, conditional on specific macroeconomic scenarios. The approach can be used, for example, to compute the effects of a hypothetical negative equity price shock in South East Asia on the loss distribution of a credit portfolio with global exposures over one or more quarters. The approach has several other features of particular relevance for risk managers, such as the exploration of scale and symmetry of shocks, and the effect of non-normality on credit risk. We show that the effects of such shocks on losses are asymmetric and non-proportional, reflecting the highly nonlinear nature of the credit risk model. Non-normal innovations such as Student t generate expected and unexpected losses which increase the fatter the tails of the innovations.
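A heavily simplified stand-in for a scenario-conditional loss distribution is the one-factor Vasicek-type mixture, in which a macro scenario enters by shifting the systematic factor. This toy is only an illustration of the general idea of conditioning losses on a macro shock; it is not the paper's GVAR-linked model, and all parameters are assumptions.

```python
import math
import random
from statistics import NormalDist

def loss_distribution(n_firms, pd, rho, macro_shift, n_sims, seed=0):
    """Toy one-factor credit loss simulator: firm i defaults when
    sqrt(rho)*Z + sqrt(1-rho)*eps_i falls below the threshold implied by
    the unconditional default probability pd; the macro scenario shifts
    the systematic factor Z. Returns simulated portfolio loss rates."""
    rng = random.Random(seed)
    threshold = NormalDist().inv_cdf(pd)
    losses = []
    for _ in range(n_sims):
        z = rng.gauss(0.0, 1.0) + macro_shift   # shifted systematic factor
        defaults = sum(
            1 for _ in range(n_firms)
            if math.sqrt(rho) * z + math.sqrt(1 - rho) * rng.gauss(0.0, 1.0)
            < threshold
        )
        losses.append(defaults / n_firms)
    return losses
```

A negative macro_shift (an adverse scenario) moves the whole loss distribution to the right, giving the kind of scenario-conditional tail analysis the abstract describes, albeit without the dynamics.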
Evaluating, comparing and combining density forecasts using the KLIC with an application to the Bank of England and NIESR “fan” charts of inflation
, 2005
Optimal combination of density forecasts
 NATIONAL INSTITUTE OF ECONOMIC AND SOCIAL RESEARCH DISCUSSION PAPER NO
, 2005
Abstract
Cited by 23 (9 self)
This paper brings together two important but hitherto largely unrelated areas of the forecasting literature, density forecasting and forecast combination. It proposes a simple data-driven approach to direct combination of density forecasts using optimal weights. These optimal weights are those weights that minimize the ‘distance’, as measured by the Kullback-Leibler information criterion, between the forecasted and true but unknown density. We explain how this minimization both can and should be achieved. Comparisons with the optimal combination of point forecasts are made. An application to simple time-series density forecasts and two widely used published density forecasts for U.K. inflation, namely the Bank of England and NIESR “fan” charts, illustrates that combination can but need not always help.
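Minimizing the Kullback-Leibler distance of the combined density to the unknown true density is equivalent to maximizing the average log score of the mixture evaluated at the realized outcomes. A toy two-model grid search along these lines (an illustration of the idea, not the paper's estimator) might look like:

```python
import math

def optimal_weight(dens_a, dens_b, grid=101):
    """Grid-search the weight w of the mixture w*f_a + (1-w)*f_b that
    maximizes the total log score at the realized outcomes; dens_a[t]
    and dens_b[t] are the two models' predictive densities evaluated
    at outcome t. Equivalent to minimizing the KLIC 'distance'."""
    best_w, best_s = 0.0, float("-inf")
    for i in range(grid):
        w = i / (grid - 1)
        s = sum(math.log(w * a + (1 - w) * b)
                for a, b in zip(dens_a, dens_b))
        if s > best_s:
            best_w, best_s = w, s
    return best_w
```

When one model's predictive densities dominate at every outcome, the search puts all weight on that model; in general it interpolates, which is how combination "can but need not always help".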
Calibrated probabilistic forecasting at the Stateline wind energy center: The regime-switching space-time (RST) method
 Journal of the American Statistical Association
, 2004
Abstract
Cited by 19 (10 self)
With the global proliferation of wind power, accurate short-term forecasts of wind resources at wind energy sites are becoming paramount. Regime-switching space-time (RST) models merge meteorological and statistical expertise to obtain accurate and calibrated, fully probabilistic forecasts of wind speed and wind power. The model formulation is parsimonious, yet takes account of all the salient features of wind speed: alternating atmospheric regimes, temporal and spatial correlation, diurnal and seasonal non-stationarity, conditional heteroscedasticity, and non-Gaussianity. The RST method identifies forecast regimes at the wind energy site and fits a conditional predictive model for each regime. Geographically dispersed meteorological observations in the vicinity of the wind farm are used as off-site predictors. The RST technique was applied to 2-hour-ahead forecasts of hourly average wind speed at the Stateline wind farm in the US Pacific Northwest. In July 2003, for instance, the RST forecasts had root-mean-square error (RMSE) 28.6% less than the persistence forecasts. For each month in the test period, the RST forecasts had lower RMSE than forecasts using state-of-the-art vector time series techniques. The RST method provides probabilistic forecasts in the form of
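The persistence benchmark behind the RMSE comparison is simply "forecast y_{t+k} by y_t". A minimal sketch of that comparison (a reader's illustration, not the RST code):

```python
import math

def rmse(pred, obs):
    """Root-mean-square error of point forecasts against observations."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

def persistence(series, lead=2):
    """lead-step persistence benchmark: forecast y_{t+lead} by y_t.
    Returns (forecasts, matching observations)."""
    return series[:-lead], series[lead:]
```

Reporting a model's RMSE as a percentage reduction relative to rmse(*persistence(series)) is the convention the 28.6% figure uses.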
Macroeconometric modelling with a global perspective
 The Manchester School, Supplement
, 2006
Abstract
Cited by 16 (5 self)
This paper provides a synthesis and further development of a global modelling approach introduced in Pesaran, Schuermann and Weiner (2004), where country-specific models in the form of VARX* structures are estimated relating a vector of domestic variables, x_it, to their foreign counterparts, x*_it, and then consistently combined to form a Global VAR (GVAR). It is shown that the VARX* models can be derived as the solution to a dynamic stochastic general equilibrium (DSGE) model where over-identifying long-run theoretical relations can be tested and imposed if acceptable. This gives the system a transparent long-run theoretical structure. Similarly, short-run over-identifying theoretical restrictions can be tested and imposed if accepted. Alternatively, if one has less confidence in the short-run theory the dynamics can be left unrestricted. The assumption of the weak exogeneity of the foreign variables for the long-run parameters can be tested, where the x*_it variables can be interpreted as proxies for global factors. Rather than using deviations from ad hoc statistical trends, the equilibrium values of the variables reflecting the long-run theory embodied in the model can be calculated. This approach has been used in a wide variety of contexts and for a wide variety of purposes. The paper also provides some new results. Keywords: Global VAR (GVAR), DSGE models, VARX*.
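The GVAR link can be illustrated by how the foreign "starred" variables are constructed: each country's x*_it is a weighted (typically trade-weighted) average of the other countries' domestic variables. The sketch below uses hypothetical countries and weights purely for illustration:

```python
def foreign_star(x, weights):
    """Build the foreign ('starred') variables x*_i of a GVAR as weighted
    averages of the other countries' variables. x maps country -> value;
    weights maps country -> {other country: weight}, weights summing to 1."""
    return {i: sum(w * x[j] for j, w in weights[i].items()) for i in x}
```

In the full GVAR, each country's VARX* regresses x_it on lags of itself and of this constructed x*_it; the weak exogeneity of x*_it for the long-run parameters is what the abstract says can be tested.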
Model averaging and value-at-risk based evaluation of large multi-asset volatility models for risk management
, 2005
Abstract
Cited by 13 (2 self)
This paper considers the problem of model uncertainty in the case of multi-asset volatility models and discusses the use of model averaging techniques as a way of dealing with the risk of inadvertently using false models in portfolio management. Evaluation of volatility models is then considered and a simple Value-at-Risk (VaR) diagnostic test is proposed for individual as well as ‘average’ models. The asymptotic as well as the exact finite-sample distributions of the test statistic, dealing with the possibility of parameter uncertainty, are established. The model averaging idea and the VaR diagnostic tests are illustrated by an application to portfolios of daily returns based on twenty-two of Standard & Poor’s 500 industry group indices over the period 1995–2003. We find strong evidence in support of ‘thick’ modelling proposed in the forecasting literature by Granger and Jeon (2004).
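A standard unconditional-coverage check in this spirit is the Kupiec likelihood-ratio test on VaR violations; the paper's own diagnostic differs in detail (it handles parameter uncertainty and exact finite-sample distributions), so the sketch below is only a generic back-test illustration.

```python
import math

def kupiec_lr(n_obs, n_viol, alpha):
    """Kupiec likelihood-ratio statistic for unconditional VaR coverage:
    under H0 (true violation probability equals alpha) it is asymptotically
    chi-squared with 1 degree of freedom, so values above 3.84 reject at
    the 5% level."""
    ll0 = (n_viol * math.log(alpha)
           + (n_obs - n_viol) * math.log(1 - alpha))
    p_hat = n_viol / n_obs
    ll1 = 0.0
    if n_viol > 0:                 # guard log(0) at the boundaries
        ll1 += n_viol * math.log(p_hat)
    if n_viol < n_obs:
        ll1 += (n_obs - n_viol) * math.log(1 - p_hat)
    return -2.0 * (ll0 - ll1)
```

For a 1% VaR over 250 trading days, roughly 2-3 violations are expected; 20 violations produce a statistic far above the 3.84 critical value.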
A Parallel Cutting-Plane Algorithm for the Vehicle Routing Problem With Time Windows
, 1999
Abstract
Cited by 11 (1 self)
In the vehicle routing problem with time windows a number of identical vehicles must be routed to and from a depot to cover a given set of customers, each of whom has a specified time interval indicating when they are available for service. Each customer also has a known demand, and a vehicle may only serve the customers on a route if the total demand does not exceed the capacity of the vehicle. The most effective solution method proposed to date for this problem is due to Kohl, Desrosiers, Madsen, Solomon, and Soumis. Their algorithm uses a cutting-plane approach followed by a branch-and-bound search with column generation, where the columns of the LP relaxation represent routes of individual vehicles. We describe a new implementation of their method, using Karger's randomized minimum-cut algorithm to generate cutting planes. The standard benchmark in this area is a set of 87 problem instances generated in 1984 by M. Solomon; making use of parallel processing in both the cutting-pla...
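In such a column-generation scheme each column is a single vehicle's route, which must respect the capacity and time-window constraints described above. A minimal feasibility check for one candidate route (with a hypothetical instance format; this is not the parallel cutting-plane algorithm itself) could be:

```python
def route_feasible(route, demand, ready, due, service, travel, capacity):
    """Check one vehicle route (customer indices in visit order, with the
    depot as index 0 at both ends) against vehicle capacity and the time
    windows [ready[i], due[i]]; travel is a matrix of travel times."""
    if sum(demand[c] for c in route) > capacity:
        return False                 # total demand exceeds capacity
    t, prev = 0.0, 0
    for c in route:
        t = max(t + travel[prev][c], ready[c])   # wait if arriving early
        if t > due[c]:
            return False             # time window missed
        t += service[c]
        prev = c
    return t + travel[prev][0] <= due[0]         # return to depot in time
```

Only routes passing such a check may enter the LP relaxation as columns; the cutting planes then tighten that relaxation before the branch-and-bound search.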