Results 1–10 of 38
Nonlinear Wavelet Methods for Recovery of Signals, Densities, and Spectra from Indirect and Noisy Data
 In Proceedings of Symposia in Applied Mathematics
, 1993
"... . We describe wavelet methods for recovery of objects from noisy and incomplete data. The common themes: (a) the new methods utilize nonlinear operations in the wavelet domain; (b) they accomplish tasks which are not possible by traditional linear/Fourier approaches to such problems. We attempt to i ..."
Abstract

Cited by 103 (5 self)
. We describe wavelet methods for recovery of objects from noisy and incomplete data. The common themes: (a) the new methods utilize nonlinear operations in the wavelet domain; (b) they accomplish tasks which are not possible by traditional linear/Fourier approaches to such problems. We attempt to indicate the heuristic principles, theoretical foundations, and possible application areas for these methods. Areas covered: (1) Wavelet DeNoising. (2) Wavelet Approaches to Linear Inverse Problems. (4) Wavelet Packet DeNoising. (5) Segmented MultiResolutions. (6) Nonlinear Multiresolutions. 1. Introduction. With the rapid development of computerized scientific instruments comes a wide variety of interesting problems for data analysis and signal processing. In fields ranging from Extragalactic Astronomy to Molecular Spectroscopy to Medical Imaging to Computer Vision, one must recover a signal, curve, image, spectrum, or density from incomplete, indirect, and noisy data. What can wavelets ...
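The transform, threshold, invert recipe behind wavelet denoising can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes a Haar transform, soft thresholding, and the universal threshold sigma * sqrt(2 log n); all function names are ours.

```python
import math

def haar_fwd(x):
    """Full Haar decomposition of a length-2^J list.
    Returns the coarsest scaling coefficient and the detail
    levels, ordered finest to coarsest."""
    x = list(x)
    coeffs = []
    while len(x) > 1:
        s = [(x[2 * i] + x[2 * i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
        d = [(x[2 * i] - x[2 * i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
        coeffs.append(d)
        x = s
    return x[0], coeffs

def haar_inv(s0, coeffs):
    """Invert haar_fwd."""
    x = [s0]
    for d in reversed(coeffs):
        nxt = []
        for si, di in zip(x, d):
            nxt.append((si + di) / math.sqrt(2))
            nxt.append((si - di) / math.sqrt(2))
        x = nxt
    return x

def soft(w, t):
    """Soft thresholding: shrink w toward zero by t, kill if |w| <= t."""
    return math.copysign(max(abs(w) - t, 0.0), w)

def denoise(y, sigma):
    """Nonlinear wavelet denoising with the universal threshold."""
    t = sigma * math.sqrt(2 * math.log(len(y)))
    s0, coeffs = haar_fwd(y)
    coeffs = [[soft(w, t) for w in d] for d in coeffs]
    return haar_inv(s0, coeffs)
```

The nonlinearity is entirely in `soft`: small coefficients (mostly noise) are set to zero, which no linear/Fourier filter can do coefficient by coefficient.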
Maximal Spaces with given rate of convergence for thresholding algorithms
, 1999
"... this paper is to discuss the existence and the nature of maximal spaces in the context of nonlinear methods based on thresholding (or shrinkage) procedures. Before going further, some remarks should be made: ..."
Abstract

Cited by 36 (7 self)
this paper is to discuss the existence and the nature of maximal spaces in the context of nonlinear methods based on thresholding (or shrinkage) procedures. Before going further, some remarks should be made:
Nonlinear Black-Box Models in System Identification: Mathematical Foundations
, 1995
"... In this paper we discuss several aspects of the mathematical foundations of nonlinear blackbox identification problem. As we shall see that the quality of the identification procedure is always a result of a certain tradeoff between the expressive power of the model we try to identify (the larger ..."
Abstract

Cited by 30 (5 self)
In this paper we discuss several aspects of the mathematical foundations of the nonlinear black-box identification problem. As we shall see, the quality of the identification procedure is always the result of a certain tradeoff between the expressive power of the model we try to identify (the larger the number of parameters used to describe the model, the more flexible the approximation) and the stochastic error (which is proportional to the number of parameters). A consequence of this tradeoff is the simple fact that a good approximation technique can be the basis of a good identification algorithm. From this point of view we consider different approximation methods, and pay special attention to spatially adaptive approximants. We introduce wavelet and "neuron" approximations and show that they are spatially adaptive. Then we apply the acquired approximation experience to estimation problems. Finally, we consider some implications of these theoretic developments for the practically...
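The tradeoff described above can be made concrete with a toy model: a piecewise-constant approximant with k parameters has sup-norm approximation error of order 1/k for a Lipschitz target, while the stochastic error of fitting those k constants from n noisy samples grows roughly like k*sigma^2/n. A sketch of the approximation side (all names are ours, not the paper's):

```python
def pc_approx_error(f, k, grid=1000):
    """Sup-norm error of the best k-bin piecewise-constant
    approximation of f on [0, 1], estimated on a fine grid."""
    err = 0.0
    for j in range(k):
        a, b = j / k, (j + 1) / k
        xs = [a + (b - a) * i / grid for i in range(grid + 1)]
        vals = [f(x) for x in xs]
        mid = (max(vals) + min(vals)) / 2  # best constant in sup norm
        err = max(err, max(abs(v - mid) for v in vals))
    return err
```

Doubling the parameter count roughly halves the approximation error for a Lipschitz f, which is the "expressive power" half of the tradeoff; the other half is that each extra parameter adds its own estimation noise.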
Exact Risk Analysis of Wavelet Regression
, 1995
"... Wavelets have motivated development of a host of new ideas in nonparametric regression smoothing. Here we apply the tool of exact risk analysis, to understand the small sample behavior of wavelet estimators, and thus to check directly the conclusions suggested by asymptotics. Comparisons between som ..."
Abstract

Cited by 24 (2 self)
Wavelets have motivated the development of a host of new ideas in nonparametric regression smoothing. Here we apply the tool of exact risk analysis to understand the small-sample behavior of wavelet estimators, and thus to check directly the conclusions suggested by asymptotics. Comparisons between some wavelet bases, and between hard and soft thresholding, are given from several viewpoints. Our results provide insight as to why the viewpoints and conclusions of Donoho and Johnstone differ from those of Hall and Patil. 1 Introduction In a series of papers, Donoho and Johnstone (1992 [9], 1994a [10], 1995 [13]) and Donoho, Johnstone, Kerkyacharian and Picard (1995) [14] developed nonlinear wavelet shrinkage technology in nonparametric regression. For other work relating wavelets and nonparametric estimation, see Doukhan (1988) [15], Kerkyacharian and Picard (1992) [21], Antoniadis (1994) [1] and Antoniadis, Gregoire and McKeague (1994) [2]. These papers have both introduced a new clas...
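The kind of hard-versus-soft comparison the abstract reports can be reproduced numerically: for a single coefficient observed as theta + Z with Z ~ N(0,1), the exact risk of a thresholding rule is a one-dimensional integral. A rough sketch (names are ours; a Riemann sum stands in for the closed-form risk expressions such a paper derives):

```python
import math

def hard(w, t):
    """Hard thresholding: keep-or-kill."""
    return w if abs(w) > t else 0.0

def soft(w, t):
    """Soft thresholding: shrink toward zero by t."""
    sign = 1.0 if w >= 0 else -1.0
    return sign * max(abs(w) - t, 0.0)

def phi(z):
    """Standard normal density."""
    return math.exp(-z * z / 2.0) / math.sqrt(2.0 * math.pi)

def risk(rule, theta, t, h=0.005, lim=10.0):
    """Riemann-sum approximation of E[(rule(theta + Z, t) - theta)^2]."""
    total = 0.0
    steps = int(2 * lim / h)
    for i in range(steps + 1):
        z = -lim + i * h
        total += (rule(theta + z, t) - theta) ** 2 * phi(z) * h
    return total
```

Evaluating `risk` across theta shows the small-sample picture: near theta = 0 soft thresholding has smaller risk, while for large theta hard thresholding wins because it is unbiased, whereas soft pays a bias of roughly t.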
Estimating The Square Root Of A Density Via Compactly Supported Wavelets
, 1997
"... This paper addresses the problem of univariate density estimation in a novel way. Our approach falls in the class of so called projection estimators, introduced by ..."
Abstract

Cited by 19 (6 self)
This paper addresses the problem of univariate density estimation in a novel way. Our approach falls in the class of so-called projection estimators, introduced by
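For context, a projection estimator expands the unknown density in an orthonormal basis and plugs in empirical coefficients. The paper applies this idea to the square root of the density with compactly supported wavelets; the mechanics are easiest to see with a cosine basis, which is our simplification, not the paper's construction:

```python
import math

def cos_basis(j, x):
    """Orthonormal cosine basis on [0, 1]."""
    return 1.0 if j == 0 else math.sqrt(2) * math.cos(j * math.pi * x)

def projection_density(sample, J):
    """Projection density estimate f_hat = sum_j c_j phi_j with
    empirical coefficients c_j = (1/n) sum_i phi_j(X_i)."""
    n = len(sample)
    c = [sum(cos_basis(j, x) for x in sample) / n for j in range(J + 1)]
    return lambda x: sum(c[j] * cos_basis(j, x) for j in range(J + 1))
```

Note that `c[0]` is always 1, so the estimate integrates to one automatically; estimating sqrt(f) instead (as the paper does) additionally guarantees nonnegativity after squaring.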
Wavelet thresholding for non-necessarily Gaussian noise: idealism
 Annals of Statistics
, 2003
"... For various types of noise (exponential, normal mixture, compactly supported,...) wavelet tresholding methods are studied. Problems linked to the existence of optimal thresholds are tackled, and minimaxity properties of the methods also analyzed. A coefficient dependent method for choosing threshold ..."
Abstract

Cited by 16 (2 self)
For various types of noise (exponential, normal mixture, compactly supported, ...) wavelet thresholding methods are studied. Problems linked to the existence of optimal thresholds are tackled, and minimaxity properties of the methods are also analyzed. A coefficient-dependent method for choosing thresholds is also briefly presented.
Wavelet Estimators, Global Error Measures: Revisited
 Technical Report, IRISA-INRIA. Available at http://www.irisa.fr
, 1993
"... : In the paper minimax rates of convergence for wavelet estimators are studied. For the problems of density estimation and nonparametric regression we establish upper bounds over a large range of functional classes and global error measures. The constructed estimate is simultaneously minimax (up to ..."
Abstract

Cited by 16 (3 self)
In the paper minimax rates of convergence for wavelet estimators are studied. For the problems of density estimation and nonparametric regression we establish upper bounds over a large range of functional classes and global error measures. The constructed estimate is simultaneously minimax (up to a constant) for all L_π error measures, 0 < π ≤ ∞. Keywords: minimax estimation, density estimation, nonparametric regression, Besov spaces, wavelet estimators. Correspondence: iouditsk@irisa.fr.
Minimax nonparametric classification – Part I: Rates of convergence

, 1998
"... This paper studies minimax aspects of nonparametric classification. We first study minimax estimation of the conditional probability of a class label, given the feature variable. This function, say f � is assumed to be in a general nonparametric class. We show the minimax rate of convergence under ..."
Abstract

Cited by 16 (0 self)
This paper studies minimax aspects of nonparametric classification. We first study minimax estimation of the conditional probability of a class label, given the feature variable. This function, say f, is assumed to be in a general nonparametric class. We show the minimax rate of convergence under squared L_2 loss is determined by the massiveness of the class as measured by metric entropy. The second part of the paper studies minimax classification. The loss of interest is the difference between the probability of misclassification of a classifier and that of the Bayes decision. As is well-known, an upper bound on the risk for estimating f gives an upper bound on the risk for classification, but the rate is known to be suboptimal for the class of monotone functions. This suggests that one does not have to estimate f well in order to classify well. However, we show that the two problems are in fact of the same difficulty in terms of rates of convergence under a sufficient condition, which is satisfied by many function classes including Besov (Sobolev), Lipschitz, and bounded variation. This is somewhat surprising in view of a result of Devroye, Györfi, and Lugosi (1996).
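The plug-in route mentioned above, estimate the conditional probability f and threshold at 1/2, can be sketched with a simple histogram (regressogram) estimate. This is an illustration of the general idea only; the binning choice and all names are ours:

```python
def regressogram(xs, ys, k):
    """Partition [0, 1] into k bins and estimate f(x) = P(Y=1 | X=x)
    by the average label in the bin containing x."""
    sums = [0.0] * k
    cnts = [0] * k
    for x, y in zip(xs, ys):
        b = min(int(x * k), k - 1)
        sums[b] += y
        cnts[b] += 1
    means = [sums[b] / cnts[b] if cnts[b] else 0.5 for b in range(k)]
    return lambda x: means[min(int(x * k), k - 1)]

def classify(f_hat, x):
    """Plug-in classifier: mimic the Bayes rule 1{f(x) >= 1/2}.
    It is correct wherever f_hat lands on the same side of 1/2
    as the true conditional probability, even if f_hat is a
    poor estimate of f itself."""
    return 1 if f_hat(x) >= 0.5 else 0
```

This makes the abstract's point tangible: classification only needs the sign of f - 1/2, which is why estimating f well is sufficient but not obviously necessary.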
Estimating copula densities through wavelets
 Insurance Math. Econom
, 2009
"... Laboratoire de probabilités et modèles aléatoires ..."