Results 1 – 10 of 30
Asymptotic Equivalence of Density Estimation and Gaussian White Noise
Ann. Statist., 1996
Cited by 81 (3 self)
Signal recovery in Gaussian white noise with variance tending to zero has served for some time as a representative model for nonparametric curve estimation, having all the essential traits in a pure form. The equivalence has mostly been stated informally, but an approximation in the sense of Le Cam's deficiency distance Δ would make it precise. The models are then asymptotically equivalent for all purposes of statistical decision with bounded loss. In nonparametrics, a first result of this kind has recently been established for Gaussian regression (Brown and Low, 1993). We consider the analogous problem for the experiment given by n i.i.d. observations having density f on the unit interval. Our basic result concerns the parameter space of densities which are in a Hölder ball with exponent α > 1/2 and which are uniformly bounded away from zero. We show that an i.i.d. sample of size n with density f is globally asymptotically equivalent to a white noise experiment with dri...
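The limiting experiment in this equivalence is usually written explicitly as observation of a Gaussian process with a variance-stabilized drift; as a sketch of the standard form:

```latex
dY(t) = \sqrt{f(t)}\, dt + \frac{1}{2\sqrt{n}}\, dW(t), \qquad t \in [0,1],
```

where W is a standard Brownian motion. The square root of the density appears because it stabilizes the variance, turning the density model into a homoscedastic white noise model.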
General empirical Bayes wavelet methods and exactly adaptive minimax estimation, 2005
Cited by 18 (1 self)
In many statistical problems, stochastic signals can be represented as a sequence of noisy wavelet coefficients. In this paper, we develop general empirical Bayes methods for the estimation of the true signal. Our estimators approximate certain oracle separable rules and achieve adaptation to ideal risks and exact minimax risks in broad collections of classes of signals. In particular, our estimators are uniformly adaptive to the minimum risk of separable estimators and to the exact minimax risks simultaneously in Besov balls of all smoothness and shape indices, and they are uniformly superefficient in convergence rates on all compact sets in Besov spaces with a finite secondary shape parameter. Furthermore, in classes nested between Besov balls of the same smoothness index, our estimators dominate threshold and James–Stein estimators within an infinitesimal fraction of the minimax risks. More general block empirical Bayes estimators are developed. Both white noise with drift and nonparametric regression are considered.
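As a hedged illustration of a separable, data-driven rule in this sequence-model setting (a plain SURE-tuned soft-thresholding sketch, not the paper's empirical Bayes estimator; all names and parameter choices below are ours):

```python
import numpy as np

def soft_threshold(y, lam):
    """Separable soft-thresholding rule applied coordinatewise."""
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

def sure_threshold(y):
    """Pick the threshold minimizing Stein's unbiased risk estimate
    for soft thresholding (unit noise variance assumed)."""
    n = len(y)
    cand = np.sort(np.abs(y))
    risks = [n - 2 * np.sum(np.abs(y) <= lam) + np.sum(np.minimum(y**2, lam**2))
             for lam in cand]
    return cand[int(np.argmin(risks))]

rng = np.random.default_rng(0)
theta = np.zeros(1000); theta[:20] = 5.0       # sparse "signal" coefficients
y = theta + rng.standard_normal(1000)          # noisy wavelet coefficients
lam = sure_threshold(y)
est = soft_threshold(y, lam)
```

SURE picks the threshold by unbiasedly estimating the squared-error risk of each candidate, which is the same "let the data choose the separable rule" idea the abstract describes, in its simplest form.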
Asymptotic equivalence of spectral density and regression estimation. Technical report, Weierstrass Institute for Applied Analysis and Stochastics, 1998
Cited by 13 (7 self)
We consider the statistical experiment given by a sample y(1),...,y(n) of a stationary Gaussian process with an unknown smooth spectral density f. Asymptotic equivalence, in the sense of Le Cam's deficiency Δ-distance, to two Gaussian experiments with simpler structure is established. The first one is given by independent zero-mean Gaussians with variance approximately f(ωi), where ωi is a uniform grid of points in (−π, π) (nonparametric Gaussian scale regression). This approximation is closely related to well-known asymptotic independence results for the periodogram and corresponding inference methods. The second asymptotic equivalence is to a Gaussian white noise model where the drift function is the log-spectral density. This represents the step from a Gaussian scale model to a location model, and also has a counterpart in established inference methods, i.e. log-periodogram regression. The problem of simple explicit equivalence maps (Markov kernels), which would allow one to carry over inference directly, appears in this context but is not solved here.
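Schematically, the two limiting experiments described above can be sketched as follows (the noise level in the second display is written with an unspecified constant c, which we do not pin down here):

```latex
\text{(i)}\quad y_i \approx \sqrt{f(\omega_i)}\,\varepsilon_i,
\quad \varepsilon_i \overset{\text{i.i.d.}}{\sim} N(0,1),
\qquad
\text{(ii)}\quad dZ(\omega) = \log f(\omega)\, d\omega + c\, n^{-1/2}\, dW(\omega).
```

Display (i) is the Gaussian scale regression; taking logarithms of squared observations in (i) is what connects it to the location model (ii), mirroring log-periodogram regression.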
Drift estimation for nonparametric diffusion
The Annals of Statistics, 2000
Cited by 10 (0 self)
We consider a nonparametric diffusion process whose drift and diffusion coefficients are unknown functions of the state variable. The goal is to estimate the unknown drift coefficient. We apply a local linear smoother with a data-driven bandwidth choice. The procedure is fully adaptive and nearly optimal up to a log log factor. The results about the quality of estimation are non-asymptotic and do not require any ergodic or mixing properties of the observed process.
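The procedure itself is not spelled out in the abstract; as a minimal sketch of a local linear smoother with a data-driven bandwidth (leave-one-out cross-validation here stands in for the authors' adaptive rule, and all parameter values are illustrative):

```python
import numpy as np

def local_linear(x0, x, y, h):
    """Local linear estimate at x0 with a Gaussian kernel of bandwidth h."""
    sw = np.exp(-0.25 * ((x - x0) / h) ** 2)     # sqrt of Gaussian weights
    X = np.column_stack([np.ones_like(x), x - x0])
    beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return beta[0]                                # intercept = fit at x0

def cv_bandwidth(x, y, grid):
    """Leave-one-out CV: a simple data-driven stand-in for an adaptive
    bandwidth rule (NOT the paper's procedure)."""
    idx = np.arange(len(x))
    def loo(h):
        return np.mean([(y[i] - local_linear(x[i], x[idx != i], y[idx != i], h)) ** 2
                        for i in idx])
    return min(grid, key=loo)

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 1, 100))
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(100)
h = cv_bandwidth(x, y, [0.02, 0.05, 0.1, 0.2])
fit = local_linear(0.25, x, y, h)                 # true value is sin(pi/2) = 1
```

Weighting the design matrix and response by the square root of the kernel weights turns the weighted least-squares fit into an ordinary `lstsq` call.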
Asymptotic statistical equivalence for scalar ergodic diffusions
Probab. Theory Rel. Fields, 2006
Cited by 7 (2 self)
For scalar diffusion models with an unknown drift function, asymptotic equivalence in the sense of Le Cam's deficiency between statistical experiments is considered under long-time asymptotics. A local asymptotic equivalence result is established with an accompanying sequence of simple Gaussian shift experiments. Corresponding globally asymptotically equivalent experiments are obtained as compound experiments. The results are extended in several directions, including time discretisation. An explicit transformation of decision functions from the Gaussian to the diffusion experiment is constructed.
ROBUST NONPARAMETRIC ESTIMATION VIA WAVELET MEDIAN REGRESSION
, 810
Cited by 7 (5 self)
In this paper we develop a nonparametric regression method that is simultaneously adaptive over a wide range of function classes for the regression function and robust over a large collection of error distributions, including those that are heavy-tailed and may not even possess variances or means. Our approach is to first use local medians to turn the problem of nonparametric regression with unknown noise distribution into a standard Gaussian regression problem, and then apply a wavelet block thresholding procedure to construct an estimator of the regression function. It is shown that the estimator simultaneously attains the optimal rate of convergence over a wide range of the Besov classes, without prior knowledge of the smoothness of the underlying functions or of the error distribution. The estimator also automatically adapts to the local smoothness of the underlying function, and attains the local adaptive minimax rate for estimating functions at a point. A key technical result in our development is a quantile coupling theorem which gives a tight bound for the quantile coupling between the sample medians and a normal variable. This median coupling inequality may be of independent interest.
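A minimal end-to-end sketch of the two-step idea, with stated substitutions: bin medians stand in for local medians, a hand-rolled Haar transform stands in for the paper's wavelet basis, and simple universal soft thresholding stands in for block thresholding:

```python
import numpy as np

def haar(v):
    """Orthonormal Haar transform; len(v) must be a power of two.
    Returns detail arrays (fine to coarse) plus the final scaling coef."""
    c, out = v.astype(float), []
    while len(c) > 1:
        out.append((c[0::2] - c[1::2]) / np.sqrt(2))
        c = (c[0::2] + c[1::2]) / np.sqrt(2)
    out.append(c)
    return out

def inv_haar(coeffs):
    """Inverse of haar()."""
    c = coeffs[-1]
    for d in reversed(coeffs[:-1]):
        s = np.empty(2 * len(c))
        s[0::2], s[1::2] = (c + d) / np.sqrt(2), (c - d) / np.sqrt(2)
        c = s
    return c

rng = np.random.default_rng(2)
m, k = 256, 16                                  # bins, observations per bin
t = (np.arange(m * k) + 0.5) / (m * k)
f = np.sin(2 * np.pi * t)
data = f + rng.standard_cauchy(m * k)           # heavy-tailed noise, no mean
med = np.median(data.reshape(m, k), axis=1)     # bin medians: ~ Gaussian

coeffs = haar(med)
sigma = np.median(np.abs(coeffs[0])) / 0.6745   # MAD noise estimate
lam = sigma * np.sqrt(2 * np.log(m))            # universal threshold
den = [np.sign(d) * np.maximum(np.abs(d) - lam, 0) for d in coeffs[:-1]]
est = inv_haar(den + [coeffs[-1]])
```

Even with Cauchy errors (no mean or variance), the bin medians behave approximately like Gaussian observations, which is what lets the second-stage wavelet thresholding proceed as in a standard regression problem.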
Statistical Properties of the Method of Regularization with Periodic Gaussian Reproducing Kernel
Annals of Statistics, 2004
Cited by 6 (2 self)
The method of regularization with the Gaussian reproducing kernel is popular in the machine learning literature and successful in many practical applications. In this paper we consider the periodic version of Gaussian kernel regularization. We show, in the white noise model setting, that in function spaces of very smooth functions, such as the infinite-order Sobolev space and the space of analytic functions, the method under consideration is asymptotically minimax; in finite-order Sobolev spaces, the method is rate optimal, and its efficiency in terms of the constant, when compared with the minimax estimator, is reasonably high. The smoothing parameters in periodic Gaussian regularization can be chosen adaptively without loss of asymptotic efficiency. The results derived in this paper give a partial explanation of the success of the Gaussian reproducing kernel in practice. Simulations are carried out to study the finite-sample properties of periodic Gaussian regularization.
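A sketch of how periodic Gaussian kernel regularization acts in the frequency domain, under the assumption that the kernel's eigenvalue at integer frequency k decays like exp(−σ²k²/2); the function name, σ, and λ below are illustrative choices, not the paper's tuned values:

```python
import numpy as np

def periodic_gauss_reg(y, sigma=0.5, lam=1e-3):
    """Regularization with a periodic Gaussian kernel as Fourier shrinkage:
    eigenvalue mu_k ~ exp(-sigma^2 k^2 / 2), so the k-th empirical Fourier
    coefficient is damped by 1 / (1 + lam / mu_k)."""
    n = len(y)
    k = np.fft.fftfreq(n, d=1.0 / n)              # integer frequencies
    mu = np.maximum(np.exp(-0.5 * sigma**2 * k**2), 1e-300)  # avoid underflow
    return np.real(np.fft.ifft(np.fft.fft(y) / (1.0 + lam / mu)))

rng = np.random.default_rng(3)
n = 256
t = np.arange(n) / n
f = np.sin(2 * np.pi * t) + 0.5 * np.cos(6 * np.pi * t)
y = f + 0.3 * rng.standard_normal(n)
est = periodic_gauss_reg(y)
```

Because the eigenvalues decay super-polynomially, low frequencies pass almost untouched while high frequencies are suppressed essentially to zero, which is the mechanism behind the strong performance on very smooth (analytic-type) functions described above.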
On Estimating A Dynamic Function Of A Stochastic System With Averaging, 1997
Cited by 5 (3 self)
We consider a two-scale diffusion system in which the drift and diffusion parameters of the "slow" component are contaminated by the "fast" unobserved component. The goal is to estimate the dynamic function, defined by averaging the drift coefficient of the "slow" component with respect to the stationary distribution of the "fast" one. We apply a local linear smoother with a data-driven bandwidth choice. The procedure is fully adaptive and nearly optimal up to a log log factor.
Asymptotic equivalence for nonparametric regression with multivariate and random design, 2008
Cited by 5 (0 self)
We show that nonparametric regression is asymptotically equivalent, in Le Cam's sense, to a sequence of Gaussian white noise experiments as the number of observations tends to infinity. We propose a general constructive framework based on approximation spaces, which makes it possible to achieve asymptotic equivalence even in the cases of multivariate and random design.
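In its simplest univariate, fixed-design form (the Brown–Low setting), the equivalence pairs the two experiments

```latex
y_i = f(i/n) + \sigma\,\xi_i,\ \ i = 1,\dots,n
\qquad\Longleftrightarrow\qquad
dY(t) = f(t)\, dt + \sigma\, n^{-1/2}\, dW(t),\ \ t \in [0,1],
```

with ξ_i i.i.d. standard normal and W a standard Brownian motion; the paper above extends this correspondence to multivariate and random design.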
Minimax hypothesis testing for curve registration
 Electron. J. Statist
Cited by 4 (2 self)
This paper is concerned with the problem of goodness-of-fit for curve registration, and more precisely for the shifted curve model, whose field of application ranges from computer vision and road traffic prediction to medicine. We give bounds for the asymptotic minimax separation rate when the functions in the alternative lie in Sobolev balls and the separation from the null hypothesis is measured by the ℓ2-norm. We use the generalized likelihood ratio to build a non-adaptive procedure depending on a tuning parameter, which we choose in an optimal way according to the smoothness of the ambient space. Then a Bonferroni procedure is applied to give an adaptive test over a range of Sobolev balls. Both achieve the asymptotic minimax separation rates, up to possible logarithmic factors.
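As a simplified analogue of the adaptive construction (not the shifted-curve test itself): a Bonferroni combination of chi-square-type statistics over a grid of truncation levels in a Gaussian sequence model; the critical value z = 3.0 is a hard-coded conservative choice, not the calibrated quantile:

```python
import numpy as np

def adaptive_test(y, n, grid, z=3.0):
    """Reject H0: theta = 0 if, for some truncation level K in the grid,
    the centred energy of the first K coefficients is large.
    Model: y_k = theta_k + eps_k / sqrt(n), eps_k iid N(0, 1)."""
    for K in grid:
        T = np.sum(y[:K] ** 2) - K / n          # mean zero under H0
        if T > z * np.sqrt(2.0 * K) / n:        # sd of T under H0: sqrt(2K)/n
            return True
    return False

rng = np.random.default_rng(4)
n = 1000
null_y = rng.standard_normal(200) / np.sqrt(n)  # pure noise coefficients
alt_y = null_y.copy(); alt_y[:5] += 0.3         # signal in low frequencies
grid = [4, 16, 64]
```

Scanning a dyadic grid of truncation levels and paying a Bonferroni penalty is the standard route to adaptivity over a range of Sobolev balls at the cost of at most a logarithmic factor, as the abstract states.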