Results 1–7 of 7
Structural adaptation via Lp–norm oracle inequalities
, 2008
Abstract

Cited by 20 (8 self)
In this paper we study the problem of adaptive estimation of a multivariate function satisfying some structural assumption. We propose a novel estimation procedure that adapts simultaneously to the unknown structure and smoothness of the underlying function. The problem of structural adaptation is stated as the problem of selection from a given collection of estimators. We develop a general selection rule and establish for it global oracle inequalities under arbitrary Lp–losses. These results are applied to adaptive estimation in the additive multi–index model.
Nonparametric estimation of composite functions
 ANN. STAT
, 2009
Abstract

Cited by 13 (1 self)
We study the problem of nonparametric estimation of a multivariate function g: R^d → R that can be represented as a composition of two unknown smooth functions f: R → R and G: R^d → R. We suppose that f and G belong to known smoothness classes of functions, with smoothness γ and β, respectively. We obtain the full description of minimax rates of estimation of g in terms of γ and β, and propose rate-optimal estimators for the sup-norm loss. For the construction of such estimators, we first prove an approximation result for composite functions that may be of independent interest, and then a result on adaptation to the local structure. Interestingly, the construction of rate-optimal estimators for composite functions (with given, fixed smoothness) needs adaptation, but not in the traditional sense: it is now adaptation to the local structure. We prove that composition models generate only two types of local structures: the local single-index model and the local model with roughness isolated to a single dimension (i.e., a model containing elements of both additive and single-index structure). We also find the zones of (γ, β) where no local structure is generated, as well as the zones where the composition modeling leads to faster rates, as compared to the classical nonparametric rates that depend only on the overall smoothness of g.
SHARP ESTIMATION IN SUP NORM WITH RANDOM DESIGN
, 2005
Abstract

Cited by 7 (0 self)
The aim of this paper is to recover the regression function with sup-norm loss. We construct an asymptotically sharp estimator which converges with the spatially dependent rate r_{n,µ}(x) = P (log n/(nµ(x)))^{s/(2s+1)}, where µ is the design density, s the regression smoothness, n the sample size, and P a constant expressed in terms of a solution to a problem of optimal recovery as in Donoho (1994). We prove this result under the assumption that µ is positive and continuous. This estimator combines kernel and local polynomial methods, where the kernel is given by optimal recovery, which allows us to prove the result up to the constants for any s > 0. Moreover, the estimator does not depend on µ. We prove that r_{n,µ}(x) is optimal in a sense which is stronger than the classical minimax lower bound. Then, an inhomogeneous confidence band is proposed. This band has a non-constant length which depends on the local amount of data.
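The spatially dependent rate above can be evaluated pointwise; a minimal sketch, with hypothetical values for n, s, and µ(x), and the optimal-recovery constant P set to 1 for illustration:

```python
import math

def pointwise_rate(n, s, mu_x, P=1.0):
    """Spatially dependent sup-norm rate
    r_{n,mu}(x) = P * (log n / (n * mu(x)))^(s/(2s+1)).

    n    -- sample size
    s    -- regression smoothness
    mu_x -- design density mu evaluated at x (assumed positive)
    P    -- optimal-recovery constant (1 here, purely illustrative)
    """
    return P * (math.log(n) / (n * mu_x)) ** (s / (2 * s + 1))

# The rate deteriorates where the design density is small: fewer
# observations near x mean slower local convergence, which is why
# the confidence band has non-constant length.
sparse = pointwise_rate(n=10_000, s=2.0, mu_x=0.1)
dense = pointwise_rate(n=10_000, s=2.0, mu_x=1.0)
assert sparse > dense
```

The snippet only restates the displayed formula; the actual constant P from the optimal-recovery problem is not computed here.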
Maxiset in sup-norm for kernel estimators
, 2008
Abstract

Cited by 3 (2 self)
In the Gaussian white noise model, we study the estimation of an unknown multidimensional function f in the uniform norm by using kernel methods. The performances of procedures are measured from the maxiset point of view: we determine the set of functions which are well estimated (at a prescribed rate) by each procedure. So, in this paper, we determine the maxisets associated with kernel estimators and with the Lepski procedure for the rate of convergence of the form (log n/n)^{β/(2β+d)}. We characterize the maxisets in terms of Besov and Hölder spaces of regularity β.
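The rate (log n/n)^{β/(2β+d)} quoted above slows down as the dimension d grows, for fixed smoothness β; a minimal sketch with hypothetical parameter values:

```python
import math

def kernel_rate(n, beta, d):
    """Sup-norm rate (log n / n)^(beta/(2*beta + d)) for kernel
    estimation of a beta-regular function on R^d (Gaussian white
    noise model). Parameter values below are illustrative only."""
    return (math.log(n) / n) ** (beta / (2 * beta + d))

# Curse of dimensionality: the exponent beta/(2*beta + d) shrinks
# as d increases, so the rate degrades (moves toward 1) in higher
# dimension for the same smoothness beta and sample size n.
assert kernel_rate(10_000, beta=2.0, d=1) < kernel_rate(10_000, beta=2.0, d=5)
```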
Adaptive estimation on anisotropic Hölder spaces Part II. Partially adaptive case
, 2006
Abstract
In this paper, we consider a particular case of adaptation. Let us recall that, in the first paper, “Fully adaptive case”, a large collection of anisotropic Hölder spaces is fixed and the goal is to construct an adaptive estimator with respect to the completely unknown smoothness parameter. Here the problem is quite different: an additional piece of information is known, namely the effective smoothness of the signal. We prove a minimax result which demonstrates that knowledge of this type is useful, because the rate of convergence is better than the one obtained without knowledge of the effective smoothness. Moreover, we link this problem to maxiset theory.
Hyperbolic wavelet thresholding methods and the curse of dimensionality through the maxiset approach
 APPLIED AND COMPUTATIONAL HARMONIC ANALYSIS
, 2013