Results 1–10 of 136
Large Sample Sieve Estimation of Semi-Nonparametric Models
 Handbook of Econometrics
, 2007
Abstract

Cited by 93 (13 self)
Often researchers find parametric models restrictive and sensitive to deviations from the parametric specifications; semi-nonparametric models are more flexible and robust, but lead to other complications such as introducing infinite dimensional parameter spaces that may not be compact. The method of sieves provides one way to tackle such complexities by optimizing an empirical criterion function over a sequence of approximating parameter spaces, called sieves, which are significantly less complex than the original parameter space. With different choices of criteria and sieves, the method of sieves is very flexible in estimating complicated econometric models. For example, it can simultaneously estimate the parametric and nonparametric components in semi-nonparametric models with or without constraints. It can easily incorporate prior information, often derived from economic theory, such as monotonicity, convexity, additivity, multiplicity, exclusion and nonnegativity. This chapter describes estimation of semi-nonparametric econometric models via the method of sieves. We present some general results on the large sample properties of the sieve estimates, including consistency of the sieve extremum estimates, convergence rates of the sieve M-estimates, pointwise normality of series estimates of regression functions, root-n asymptotic normality and efficiency of sieve estimates of smooth functionals of infinite dimensional parameters. Examples are used to illustrate the general results.
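The sieve idea in this abstract can be made concrete with a toy series estimator. The sketch below (the function name, data-generating process, and growth rate of the sieve dimension are all invented for illustration, not taken from the chapter) estimates an unknown regression function by least squares over a polynomial sieve whose dimension grows slowly with the sample size:

```python
import numpy as np

def sieve_ls(x, y, k):
    """Least-squares estimate over a polynomial sieve of dimension k + 1."""
    X = np.vander(x, k + 1, increasing=True)          # basis 1, x, ..., x^k
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return lambda t: np.vander(np.atleast_1d(t), k + 1, increasing=True) @ coef

rng = np.random.default_rng(0)
n = 500
x = rng.uniform(-1, 1, n)
y = np.exp(x) + 0.1 * rng.standard_normal(n)          # true regression function e^x

k = int(n ** 0.2)      # sieve dimension grows slowly with the sample size
fhat = sieve_ls(x, y, k)
print(fhat(0.5)[0], np.exp(0.5))                      # estimate vs. truth at x = 0.5
```

Richer sieves (splines, wavelets, neural networks) and other criteria fit the same template; only the basis inside `sieve_ls` and the optimized criterion change.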
Network tomography: recent developments
 Statistical Science
, 2004
Abstract

Cited by 87 (4 self)
Today's Internet is a massive, distributed network which continues to explode in size as e-commerce and related activities grow. The heterogeneous and largely unregulated structure of the Internet renders tasks such as dynamic routing, optimized service provision, service level verification and detection of anomalous/malicious behavior extremely challenging. The problem is compounded by the fact that one cannot rely on the cooperation of individual servers and routers to aid in the collection of network traffic measurements vital for these tasks. In many ways, network monitoring and inference problems bear a strong resemblance to other "inverse problems" in which key aspects of a system are not directly observable. Familiar signal processing or statistical problems such as tomographic image reconstruction and phylogenetic tree identification have interesting connections to those arising in networking. This article introduces network tomography, a new field which we believe will benefit greatly from the wealth of statistical theory and algorithms. It focuses especially on recent developments in the field including the application of pseudo-likelihood methods and tree estimation formulations. Keywords: network tomography, pseudo-likelihood, topology identification, tree estimation. 1 Introduction. No network is an island, entire of itself; every network is a piece of an internetwork, a part of the main. Although administrators of small-scale networks can monitor local traffic conditions and identify congestion points and performance bottlenecks, very few networks are completely ... Rui Castro and Robert Nowak are with the Department of Electrical and Computer Engineering, Rice University, Houston TX; Mark Coates is with the Department of Electrical and Computer Engineering, McGill University, Montreal, Quebec, Canada; Gang Liang and Bin Yu are with the Department of Statistics,...
Pseudo Likelihood Estimation in Network Tomography
, 2003
Abstract

Cited by 65 (4 self)
Network monitoring and diagnosis are key to improving network performance. The difficulties of performance monitoring lie in today's fast growing Internet, accompanied by increasingly heterogeneous and unregulated structures. Moreover, these tasks become even harder since one cannot rely on the collaboration of individual routers and servers to directly measure network traffic. Even though the aggregatory nature of possible network measurements gives rise to inverse problems, existing methods for solving inverse problems are usually computationally intractable or statistically inefficient.
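The "inverse problem" structure this abstract refers to can be sketched in a few lines: end-to-end path measurements are aggregates y = Ax of unobserved per-link quantities through a routing matrix A. The matrix, delay values, and noise scale below are invented toy quantities, not from the paper; they only illustrate why aggregation makes the problem inverse and under-determined:

```python
import numpy as np

# Hypothetical 4-link tree: three paths from one root to three receivers.
# Row i of A marks which links path i traverses (the routing matrix).
A = np.array([[1, 1, 0, 0],
              [1, 0, 1, 0],
              [1, 0, 0, 1]])

true_delay = np.array([2.0, 1.0, 3.0, 0.5])      # per-link mean delays
rng = np.random.default_rng(1)
# End-to-end measurements y = A @ x + noise: 3 equations, 4 unknowns,
# which is exactly the aggregation that makes this an inverse problem.
y = A @ true_delay + 0.01 * rng.standard_normal((200, 3))

# Minimum-norm least-squares solution from the averaged path delays
xhat = np.linalg.pinv(A) @ y.mean(axis=0)
print(A @ xhat, A @ true_delay)   # path-level fit is recovered; x itself is not
```

Because A has more columns than rows, the link-level delays are not identified from path averages alone; the statistical methods surveyed in these papers add modeling structure (e.g., likelihoods over traffic counts) to resolve exactly this indeterminacy.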
Estimating macroeconomic models: a likelihood approach
, 2006
Abstract

Cited by 61 (21 self)
This paper shows how particle filtering facilitates likelihood-based inference in dynamic macroeconomic models. The economies can be nonlinear and/or non-normal. We describe how to use the output from the particle filter to estimate the structural parameters of the model, those characterizing preferences and technology, and to compare different economies. Both tasks can be implemented from either a classical or a Bayesian perspective. We illustrate the technique by estimating a business cycle model with investment-specific technological change, preference shocks, and stochastic volatility.
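As a hedged illustration of the particle-filter likelihood evaluation described here, the sketch below runs a bootstrap particle filter on an invented linear Gaussian AR(1) state-space model, where the exact likelihood from a Kalman filter is available for comparison. The paper's models are nonlinear; linearity here serves only to check the estimate against a closed form:

```python
import numpy as np

rng = np.random.default_rng(0)
phi, q, r, T = 0.9, 0.5, 1.0, 100          # AR(1) state with Gaussian noises

# Simulate from x_t = phi * x_{t-1} + w_t,  y_t = x_t + v_t
y = np.zeros(T)
x_prev = np.sqrt(q / (1 - phi ** 2)) * rng.standard_normal()   # stationary start
for t in range(T):
    x_prev = phi * x_prev + np.sqrt(q) * rng.standard_normal()
    y[t] = x_prev + np.sqrt(r) * rng.standard_normal()

def particle_loglik(y, phi, q, r, N=5000, seed=1):
    """Bootstrap particle filter estimate of the log-likelihood."""
    g = np.random.default_rng(seed)
    p = np.sqrt(q / (1 - phi ** 2)) * g.standard_normal(N)     # stationary prior
    ll = 0.0
    for obs in y:
        p = phi * p + np.sqrt(q) * g.standard_normal(N)        # propagate particles
        w = np.exp(-0.5 * (obs - p) ** 2 / r) / np.sqrt(2 * np.pi * r)
        ll += np.log(w.mean())                                 # likelihood increment
        p = g.choice(p, size=N, p=w / w.sum())                 # multinomial resampling
    return ll

def kalman_loglik(y, phi, q, r):
    """Exact log-likelihood of the same (linear Gaussian) model, for comparison."""
    m, P, ll = 0.0, q / (1 - phi ** 2), 0.0
    for obs in y:
        m, P = phi * m, phi ** 2 * P + q                       # predict
        S = P + r                                              # innovation variance
        ll += -0.5 * (np.log(2 * np.pi * S) + (obs - m) ** 2 / S)
        K = P / S                                              # Kalman gain
        m, P = m + K * (obs - m), (1 - K) * P                  # update
    return ll

print(particle_loglik(y, phi, q, r), kalman_loglik(y, phi, q, r))
```

The filtered log-likelihood can then be plugged into a classical optimizer or a Bayesian sampler over the structural parameters, which is the use the abstract describes.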
Strong Optimality of the Normalized ML Models as Universal Codes
 IEEE Transactions on Information Theory
, 2000
Abstract

Cited by 59 (7 self)
We show that the normalized maximum likelihood (NML) distribution as a universal code for a parametric class of models is closest to the negative logarithm of the maximized likelihood in the mean code length distance, where the mean is taken with respect to the worst case model inside or outside the parametric class. We strengthen this result by showing that the same min-max bound results even when the data generating models are restricted to be most 'benevolent' in minimizing the mean of the negative logarithm of the maximized likelihood. Further, we show for the class of exponential models that the bound cannot be beaten in essence by any code except when the mean is taken with respect to the most benevolent data generating models in a set of vanishing size. These results allow us to decompose the data into two parts, the first having all the useful information that can be extracted with the parametric models and the rest which has none. We also show that, if we change Ak...
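The NML distribution itself is easy to compute in a small case. The sketch below uses an invented Bernoulli toy example (not from the paper): it normalizes the maximized likelihood over all binary sequences of length n and checks that the NML code length exceeds the negative log maximized likelihood by the same constant log C for every sequence, the flat min-max regret the abstract discusses:

```python
import numpy as np
from itertools import product

def max_lik(seq):
    """Maximized Bernoulli likelihood of a binary sequence."""
    n, k = len(seq), sum(seq)
    th = k / n
    return th ** k * (1 - th) ** (n - k)   # Python's 0**0 == 1 handles k in {0, n}

n = 8
seqs = list(product([0, 1], repeat=n))
C = sum(max_lik(s) for s in seqs)          # Shtarkov sum (normalizer)
nml = {s: max_lik(s) / C for s in seqs}    # NML distribution over sequences

# Regret of the NML code against the maximized likelihood is the same
# constant log C for every sequence: the worst case is minimized and flat.
regrets = [-np.log(nml[s]) + np.log(max_lik(s)) for s in seqs]
print(min(regrets), max(regrets), np.log(C))
```

For larger models the Shtarkov sum is intractable to enumerate, which is why asymptotic approximations to log C play a central role in this literature.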
Consistent Specification Testing With Nuisance Parameters Present Only Under The Alternative
, 1995
Abstract

Cited by 57 (10 self)
The nonparametric and the nuisance parameter approaches to consistently testing statistical models are both attempts to estimate topological measures of distance between a parametric and a nonparametric fit, and neither dominates in experiments. This topological unification allows us to greatly extend the nuisance parameter approach. How and why the nuisance parameter approach works and how it can be extended bears closely on recent developments in artificial neural networks. Statistical content is provided by viewing specification tests with nuisance parameters as tests of hypotheses about Banach-valued random elements and applying the Banach Central Limit Theorem and Law of Iterated Logarithm, leading to simple procedures that can be used as a guide to when computationally more elaborate procedures may be warranted.
1. Introduction
In testing whether or not a parametric statistical model is correctly specified, there are a number of apparently distinct approaches one might take. T...
Tests of conditional predictive ability
 Econometrica
, 2006
Abstract

Cited by 51 (1 self)
We argue that the current framework for predictive ability testing (e.g., West, 1996) is not necessarily useful for real-time forecast selection, i.e., for assessing which of two competing forecasting methods will perform better in the future. We propose an alternative framework for out-of-sample comparison of predictive ability which delivers more practically relevant conclusions. Our approach is based on inference about conditional expectations of forecasts and forecast errors rather than the unconditional expectations that are the focus of the existing literature. We capture important determinants of forecast performance that are neglected in the existing literature by evaluating what we call the forecasting method (the model and the parameter estimation procedure), rather than just the forecasting model. Compared to previous approaches, our tests are valid under more general data assumptions (heterogeneity rather than stationarity) and estimation methods, and they can handle comparison of both nested and non-nested models, which is not currently possible. To illustrate the usefulness of the proposed tests, we compare the forecast performance of three leading parameter-reduction methods for macroeconomic forecasting using a large number of predictors: a sequential model selection approach, ...
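A stylized sketch of the conditional idea (the instrument set, data-generating processes, and function name are invented; the actual test also uses HAC variance estimators for multi-step forecasts): test whether the loss differential between two forecasts has mean zero conditional on instruments such as its own lag, via a Wald statistic on the instrumented moments:

```python
import numpy as np

def cond_pred_ability_stat(dL):
    """Wald statistic for H0: E[dL_t | h_t] = 0 with instruments h_t = (1, dL_{t-1}).
    Asymptotically chi-squared with 2 degrees of freedom under the null."""
    Z = np.column_stack([dL[1:], dL[:-1] * dL[1:]])    # h_t * dL_t, t = 2..T
    n = len(Z)
    Zbar = Z.mean(axis=0)
    Omega = (Z - Zbar).T @ (Z - Zbar) / n              # variance of the moments
    return n * Zbar @ np.linalg.solve(Omega, Zbar)

rng = np.random.default_rng(0)
dL_null = rng.standard_normal(1000)    # equal conditional predictive ability
eps = rng.standard_normal(1000)        # vs. a predictable loss differential:
dL_alt = np.empty(1000)
dL_alt[0] = eps[0]
for t in range(1, 1000):
    dL_alt[t] = 0.6 * dL_alt[t - 1] + eps[t]

crit = 5.991                           # chi-squared(2) 5% critical value
print(cond_pred_ability_stat(dL_null), cond_pred_ability_stat(dL_alt), crit)
```

Note that `dL_alt` has unconditional mean zero, so an unconditional (Diebold-Mariano-style) comparison would not distinguish the methods, while the conditional statistic rejects because past losses predict which method will do better next period; that is precisely the distinction the abstract draws.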
Semiparametric Estimation of First-Price Auctions with Risk Averse Bidders
, 2000
Abstract

Cited by 31 (1 self)
This paper proposes a semiparametric estimation procedure of the first-price auction model with risk averse bidders within the independent private values paradigm. We show that the model is nonidentified in general from observed bids. Moreover, any distribution of bids can be rationalized by an auction model with either constant relative risk aversion or constant absolute risk aversion. Thus identification of the model must be achieved through additional restrictions. We then establish semiparametric identification under a common but unknown support condition and parameterization of the bidders' utility function. Next we propose a semiparametric method for estimating the corresponding auction model using local polynomial estimators. This method involves several steps and allows one to recover the parameters of the utility function as well as the bidders' private values and their distribution. An attractive computational advantage of our method is that it does not require solving the differential equation characterizing the equilibrium strategy. An illustration of the method on U.S. Forest Service timber sales is proposed. In particular, a test of bidders' risk neutrality is performed.
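For intuition on the "recover private values from bids" step, the sketch below applies the standard risk-neutral first-price inversion v = b + G(b)/((I-1) g(b)) with nonparametric estimates of the bid distribution. This is a deliberate simplification: the paper's point is precisely that risk aversion breaks this one-to-one mapping, and the uniform-value setup and bandwidth rule are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
I = 3                                  # bidders per auction
# Risk-neutral Bayes-Nash bids for values Uniform(0,1): b = v * (I-1)/I
v = rng.uniform(0, 1, 5000)
b = v * (I - 1) / I

# Nonparametric pieces: empirical CDF G and a Gaussian-kernel density g of bids
G = lambda t: np.mean(b[:, None] <= t, axis=0)
h = 1.06 * b.std() * len(b) ** (-0.2)  # Silverman rule-of-thumb bandwidth
g = lambda t: np.mean(
    np.exp(-0.5 * ((t - b[:, None]) / h) ** 2), axis=0) / (h * np.sqrt(2 * np.pi))

# First-order-condition inversion: v = b + G(b) / ((I - 1) * g(b))
t = np.array([0.2, 0.4, 0.6])          # interior bid levels
v_hat = t + G(t) / ((I - 1) * g(t))
print(v_hat)                           # close to 1.5 * t, the true inverse bid
```

Under risk aversion the first-order condition involves the unknown utility function as well, which is why the paper needs the additional support and parameterization restrictions before a multi-step estimator of this kind can be applied.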
Estimation of multivariate models for time series of possibly different lengths
 Journal of Applied Econometrics
, 2006
Abstract

Cited by 30 (4 self)
We consider the problem of estimating parametric multivariate density models when unequal amounts of data are available on each variable. We focus in particular on the case that the unknown parameter vector may be partitioned into elements relating only to a marginal distribution and elements relating to the copula. In such a case we propose using a multi-stage maximum likelihood estimator (MSMLE) based on all available data rather than the usual one-stage maximum likelihood estimator (1SMLE) based only on the overlapping data. We provide conditions under which the MSMLE is not less asymptotically efficient than the 1SMLE, and we examine the small sample efficiency of the estimators via simulations. The analysis in this paper is motivated by a model of the joint distribution of daily Japanese yen–US dollar and euro–US dollar exchange rates. We find significant evidence of time variation in the conditional copula of these exchange rates, and evidence of greater dependence during extreme events than under the normal distribution.
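The multi-stage idea can be sketched with invented normal margins and a Gaussian copula (a toy stand-in for the paper's exchange-rate model): estimate each marginal from all of its own observations, then estimate the copula parameter by maximum likelihood on the overlapping sample only:

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.7
n_long, n_short = 2000, 500            # unequal sample lengths
# Overlapping bivariate normal data, plus extra observations on series 1 only
cov = [[1.0, rho], [rho, 1.0]]
joint = rng.multivariate_normal([0, 0], cov, size=n_short)
x1 = np.concatenate([joint[:, 0], rng.standard_normal(n_long - n_short)])
x2 = joint[:, 1]

# Stage 1: marginal MLEs from ALL available data on each variable
mu1, s1 = x1.mean(), x1.std()
mu2, s2 = x2.mean(), x2.std()

# Stage 2: copula MLE on the overlap, given the stage-1 marginals.
# With normal margins the probability-integral transforms reduce to z-scores.
z1 = (x1[:n_short] - mu1) / s1
z2 = (x2 - mu2) / s2

def gauss_copula_loglik(r):
    """Gaussian copula log-likelihood in the normal scores (z1, z2)."""
    return np.sum(-0.5 * np.log(1 - r ** 2)
                  - (r ** 2 * (z1 ** 2 + z2 ** 2) - 2 * r * z1 * z2)
                  / (2 * (1 - r ** 2)))

grid = np.linspace(-0.99, 0.99, 199)   # simple grid search over the parameter
rho_hat = grid[np.argmax([gauss_copula_loglik(r) for r in grid])]
print(rho_hat)                         # close to the true rho = 0.7
```

The efficiency question the abstract studies is whether this estimator, which lets the long sample sharpen the stage-1 marginal estimates, beats a one-stage MLE restricted to the 500 overlapping pairs.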