Results 1–10 of 28
Asymptotics for Lasso-type estimators
, 2000
Cited by 138 (3 self)
Abstract: In this paper, we consider the asymptotic behaviour of regression estimators that minimize the residual sum of squares plus a penalty proportional to ...
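The penalized criterion this abstract describes (residual sum of squares plus a penalty proportional to the size of the coefficients) can be written down directly. A minimal sketch, assuming an l1 penalty as in the Lasso; the function name and toy data are invented for illustration:

```python
import numpy as np

def lasso_objective(beta, X, y, lam):
    """Residual sum of squares plus a penalty proportional to sum(|beta_j|)."""
    resid = y - X @ beta
    return resid @ resid + lam * np.abs(beta).sum()

# Toy check: with an identity design and a perfect fit, the residual term
# vanishes and the objective reduces to the penalty alone.
X = np.eye(2)
y = np.array([1.0, 0.0])
val = lasso_objective(np.array([1.0, 0.0]), X, y, lam=0.5)
```

Here `val` is 0 + 0.5 * (|1| + |0|) = 0.5, isolating the penalty's contribution to the criterion.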
Testing When a Parameter Is on the Boundary of the Maintained Hypothesis
Econometrica, 2001
"... COWLES FOUNDATION DISCUSSION PAPER NO. 1229 ..."
Hiroshi Imai and Masao Iri. Polygonal approximations of a curve – formulations and algorithms
Computational Morphology, 1988
Cited by 43 (7 self)
Abstract: Regularization by the sum of singular values, also referred to as the trace norm, is a popular technique for estimating low-rank rectangular matrices. In this paper, we extend some of the consistency results of the Lasso to provide necessary and sufficient conditions for rank consistency of trace norm minimization with the square loss. We also provide an adaptive version that is rank consistent even when the necessary condition for the non-adaptive version is not fulfilled.
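The trace norm the abstract refers to is simply the sum of a matrix's singular values (also called the nuclear norm). A minimal sketch of computing it; the function name is invented for this example:

```python
import numpy as np

def trace_norm(B):
    # trace (nuclear) norm: the sum of the singular values of B
    return np.linalg.svd(B, compute_uv=False).sum()

# For a diagonal matrix with nonnegative entries, the singular values
# are just the diagonal entries, so the trace norm is their sum.
tn = trace_norm(np.diag([3.0, 4.0]))
```

This gives 3 + 4 = 7; penalizing this quantity in a regression objective is what encourages low-rank solutions, analogously to how the l1 norm encourages sparse vectors.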
Estimation When a Parameter Is on a Boundary
Econometrica, 1999
Cited by 28 (4 self)
Abstract: This paper establishes the asymptotic distribution of an extremum estimator when the true parameter lies on the boundary of the parameter space. The boundary may be linear, curved, and/or kinked. Typically the asymptotic distribution is a function of a multivariate normal distribution in models without stochastic trends and a function of a multivariate Brownian motion in models with stochastic trends. The results apply to a wide variety of estimators and models. Examples treated in the paper are: (i) quasi-ML estimation of a random coefficients regression model with some coefficient variances equal to zero and (ii) LS estimation of an augmented Dickey-Fuller regression with unit root and time trend parameters on the boundary of the parameter space.
GMM with many moment conditions
Econometrica, 2006
Cited by 14 (1 self)
Abstract: This paper provides a first-order asymptotic theory for generalized method of moments (GMM) estimators when the number of moment conditions is allowed to increase with the sample size and the moment conditions may be weak. Examples in which these asymptotics are relevant include instrumental variable (IV) estimation with many (possibly weak or uninformative) instruments and some panel data models that cover moderate time spans and have correspondingly large numbers of instruments. Under certain regularity conditions, the GMM estimators are shown to converge in probability but not necessarily to the true parameter, and conditions for consistent GMM estimation are given. A general framework for the GMM limit distribution theory is developed based on epi-convergence methods. Some illustrations are provided, including consistent GMM estimation of a panel model with time-varying individual effects, consistent limited information maximum likelihood estimation as a continuously updated GMM estimator, and consistent IV structural estimation using large numbers of weak or irrelevant instruments. Some simulations are reported.
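The GMM criterion the abstract builds its theory around is a quadratic form in the sample-averaged moment conditions. A minimal sketch, with the function name, toy data, and single moment condition all invented for this example:

```python
import numpy as np

def gmm_criterion(theta, g, W):
    """Quadratic form gbar' W gbar in the sample-averaged moment conditions."""
    gbar = g(theta).mean(axis=0)  # average the n-by-m moment matrix over observations
    return gbar @ W @ gbar

# One moment condition E[x - theta] = 0: the criterion is zero (its minimum)
# exactly at the sample mean.
x = np.array([1.0, 2.0, 3.0])
g = lambda theta: (x - theta)[:, None]
q = gmm_criterion(2.0, g, np.eye(1))
```

At theta = 2.0 (the sample mean of x) the averaged moment is zero, so `q` is 0; the paper's setting is the regime where the number of columns of the moment matrix grows with the sample size.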
Epi-Convergence in Distribution and Stochastic Equi-Semicontinuity
, 1997
Cited by 12 (2 self)
Abstract: Epi-convergence in distribution is a useful tool in establishing limiting distributions of "argmin" estimators; however, it is not always easy to find the epi-limit of a given sequence of objective functions. In this paper, we define the notion of stochastic equi-lower-semicontinuity of a sequence of random objective functions. It is shown that epi-convergence in distribution and finite-dimensional convergence in distribution (to a given limit) of a sequence of random objective functions are equivalent under this condition.
Key words and phrases: argmin estimators, convergence in distribution, epi-convergence, equi-semicontinuity.
AMS 1991 subject classifications: Primary 62F12, 60F05; Secondary 62E20, 60F17.
1 Introduction: Many statistical estimators are defined as the minimizer (or maximizer) of some objective function; common examples include maximum likelihood estimation and M-estimation. Since any maximization problem can be re-exp...
Likelihood ratio tests and singularities
Ann. Statist., 2008
Cited by 9 (3 self)
Abstract: Many statistical hypotheses can be formulated in terms of polynomial equalities and inequalities in the unknown parameters and thus correspond to semialgebraic subsets of the parameter space. We consider large-sample asymptotics for the likelihood ratio test of such hypotheses in models that satisfy standard probabilistic regularity conditions. We show that the assumptions of Chernoff's theorem hold for semialgebraic sets, such that the asymptotics are determined by the tangent cone at the true parameter point. At boundary points or singularities, the tangent cone need not be a linear space, and limiting distributions other than chi-square distributions may arise. While boundary points often lead to mixtures of chi-square distributions, singularities give rise to nonstandard limits. We demonstrate that minima of chi-square random variables are important for locally identifiable models, and in a study of the factor analysis model with one factor, we reveal connections to eigenvalues of Wishart matrices.
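The chi-square mixtures that arise at boundary points can be seen in the textbook one-sided testing example (this classic example is chosen for illustration; it is not taken from the paper itself): testing theta = 0 against theta >= 0 from a single N(theta, 1) draw. Because the MLE is pinned at the boundary whenever the draw is negative, the likelihood ratio statistic follows a 0.5 * chi²₀ + 0.5 * chi²₁ mixture rather than a plain chi-square limit. A short simulation sketch:

```python
import numpy as np

# One N(theta, 1) observation z; under H0 (theta = 0 on the boundary of [0, inf))
# the MLE is max(z, 0), so the LR statistic is z^2 when z > 0 and 0 otherwise.
rng = np.random.default_rng(0)
z = rng.standard_normal(200_000)
lr = np.where(z > 0.0, z**2, 0.0)

mass_at_zero = (lr == 0.0).mean()  # should be close to 1/2, the chi^2_0 component
```

Roughly half of the simulated statistics are exactly zero, which is the point mass (chi²₀ component) contributed by draws whose estimate sits on the boundary.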
REGRESSION ON MANIFOLDS: ESTIMATION OF THE EXTERIOR DERIVATIVE
Submitted to the Annals of Statistics, 2010
Cited by 7 (2 self)
Abstract: Collinearity and near-collinearity of predictors cause difficulties when doing regression. In these cases, variable selection becomes untenable because of mathematical issues concerning the existence and numerical stability of the regression coefficients, and interpretation of the coefficients is ambiguous because gradients are not defined. Using a differential geometric interpretation, in which the regression coefficients are interpreted as estimates of the exterior derivative of a function, we develop a new method to do regression in the presence of collinearities. Our regularization scheme can improve estimation error, and it can be easily modified to include lasso-type regularization. These estimators also have simple extensions to the "large p, small n" context.
On the Asymptotics of Constrained Local M-estimators
Ann. Statist.
Cited by 6 (1 self)
Abstract: We discuss in this paper asymptotics of locally optimal solutions of maximum likelihood and, more generally, M-estimation procedures in cases where the true value of the parameter vector lies on the boundary of the parameter set S. We give a counterexample showing that regularity of S in the sense of Clarke is not sufficient for asymptotic equivalence of √n-consistent locally optimal M-estimators. We argue further that stronger properties, such as so-called "near convexity" or "prox-regularity" of S, are required in order to ensure that any two √n-consistent locally optimal M-estimators have the same asymptotics. (This work was supported, in part, by grant DMI-9713878 from the National Science Foundation.)
Key words: maximum likelihood, constrained M-estimation, asymptotic distribution, tangent cones, Clarke regularity, prox-regularity, metric projection.
Estimating Density Functions: A Constrained Maximum Likelihood Approach
, 1998
Cited by 5 (0 self)
Abstract: We propose estimating density functions by means of a constrained optimization problem whose criterion function is the maximum likelihood function and whose constraints model any (prior) information that might be available. The asymptotic justification for such an approach relies on the theory of epi-convergence. A simple numerical example is used to signal the potential of such an approach.
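The general recipe the abstract describes (maximize a likelihood subject to constraints encoding prior information) can be sketched with a generic solver. Everything here is invented for illustration: the binned counts, the choice of monotonicity as the "prior information", and the use of SciPy's SLSQP solver, which is not the paper's method:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical binned data (25 observations in 3 cells).
counts = np.array([5.0, 8.0, 12.0])

def neg_loglik(p):
    # negative multinomial log-likelihood of the cell probabilities
    return -(counts * np.log(p)).sum()

cons = [{"type": "eq",   "fun": lambda p: p.sum() - 1.0},  # probabilities sum to one
        {"type": "ineq", "fun": lambda p: np.diff(p)}]     # nondecreasing cells (prior info)
res = minimize(neg_loglik, x0=np.full(3, 1.0 / 3.0),
               bounds=[(1e-9, 1.0)] * 3, constraints=cons, method="SLSQP")
p_hat = res.x
```

In this toy case the unconstrained MLE counts/25 = (0.2, 0.32, 0.48) already satisfies the monotonicity constraint, so the constrained and unconstrained solutions coincide; the constraints bind only when the data violate the prior information.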