Results 1–10 of 37
Calibration and Empirical Bayes Variable Selection
Biometrika, 1997
Cited by 128 (19 self)
Abstract
... this paper, is that with F = 2 log p. This choice was proposed by Foster & George (1994), where it was called the Risk Inflation Criterion (RIC) because it asymptotically minimises the maximum predictive risk inflation due to selection when X is orthogonal. This choice and its minimax property were also discovered independently by Donoho & Johnstone (1994) in the wavelet regression context, where they refer to it as the universal hard thresholding rule.
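The F = 2 log p penalty mentioned above coincides with the universal hard thresholding rule of Donoho & Johnstone: with n noisy wavelet coefficients and noise level σ, every coefficient below σ√(2 log n) in magnitude is set to zero. A minimal sketch, assuming i.i.d. Gaussian noise and known σ; the function and variable names are illustrative, not taken from any cited implementation:

```python
import numpy as np

def universal_hard_threshold(coeffs, sigma):
    """Hard-threshold a vector of wavelet coefficients at the
    universal level lambda = sigma * sqrt(2 * log n)."""
    n = coeffs.size
    lam = sigma * np.sqrt(2.0 * np.log(n))
    out = coeffs.copy()
    out[np.abs(out) <= lam] = 0.0  # keep-or-kill: no shrinkage of survivors
    return out
```

For n = 4 and σ = 1 the threshold is √(2 log 4) ≈ 1.67, so small coefficients are removed while large ones pass through unchanged.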
The practical implementation of Bayesian model selection
Institute of Mathematical Statistics, 2001
Cited by 94 (3 self)
Abstract
In principle, the Bayesian approach to model selection is straightforward. Prior probability distributions are used to describe the uncertainty surrounding all unknowns. After observing the data, the posterior distribution provides a coherent post-data summary of the remaining uncertainty that is relevant for model selection. However, the practical implementation of this approach often requires carefully tailored priors and novel posterior calculation methods. In this article, we illustrate some of the fundamental practical issues that arise for two different model selection problems: the variable selection problem for the linear model and the CART model selection problem.
Wavelet estimators in nonparametric regression: a comparative simulation study
Journal of Statistical Software, 2001
Cited by 86 (17 self)
Abstract
OATAO is an open access repository that collects the work of Toulouse researchers and makes it freely available over the web where possible.
Optimal Predictive Model Selection
Annals of Statistics, 2002
Cited by 53 (2 self)
Abstract
Often the goal of model selection is to choose a model for future prediction, and it is natural to measure the accuracy of a future prediction by squared error loss.
The variable selection problem
Journal of the American Statistical Association, 2000
Cited by 44 (3 self)
Abstract
The problem of variable selection is one of the most pervasive model selection problems in statistical applications. Often referred to as the problem of subset selection, it arises when one wants to model the relationship between a variable of interest and a subset of potential explanatory variables or predictors, but there is uncertainty about which subset to use. This vignette reviews some of the key developments which have led to the wide variety of approaches for this problem.
Wavelet thresholding via MDL for natural images
IEEE Transactions on Information Theory, 2000
Cited by 26 (1 self)
Abstract
We study the application of Rissanen's Principle of Minimum Description Length (MDL) to the problem of wavelet denoising and compression for natural images. After making a connection between thresholding and model selection, we derive an MDL criterion based on a Laplacian model for noiseless wavelet coefficients. We find that this approach leads to an adaptive thresholding rule. While achieving mean squared error performance comparable with other popular thresholding schemes, the MDL procedure tends to keep far fewer coefficients. From this property, we demonstrate that our method is an excellent tool for simultaneous denoising and compression. We make this claim precise by analyzing MDL thresholding in two optimality frameworks: one in which we measure rate and distortion based on quantized coefficients, and one in which we do not quantize, but instead record rate simply as the number of nonzero coefficients.
Empirical Bayes approach to block wavelet function estimation
2002
Cited by 19 (1 self)
Abstract
Wavelet methods have demonstrated considerable success in function estimation through term-by-term thresholding of empirical wavelet coefficients. However, it has been shown that grouping the empirical wavelet coefficients into blocks and making simultaneous threshold decisions about all coefficients in each block has a number of advantages over term-by-term thresholding, including asymptotic optimality and better mean squared error performance in finite sample situations. In this paper, we consider an empirical Bayes approach to incorporating information on neighbouring empirical wavelet coefficients into function estimation that results in block shrinkage and block thresholding estimators. Simulated examples are used to illustrate the performance of the resulting estimators, and to compare these estimators with existing non-Bayesian block thresholding estimators. It is observed that the proposed empirical Bayes block shrinkage and block thresholding estimators outperform existing block ...
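The block-versus-term-by-term distinction described above can be sketched in a few lines: instead of testing each coefficient individually, a whole block of coefficients is kept or killed based on its total energy. This is only a minimal illustration of the block idea under an assumed keep-or-kill rule; the constant `c`, the block size, and all names are illustrative, not the exact estimator of the paper:

```python
import numpy as np

def block_hard_threshold(coeffs, sigma, block_size=4, c=4.505):
    """Keep or kill whole blocks of coefficients at once: a block
    survives only if its average energy exceeds c * sigma**2."""
    out = np.zeros_like(coeffs, dtype=float)
    for start in range(0, coeffs.size, block_size):
        block = coeffs[start:start + block_size]
        # simultaneous decision for the whole block, not per coefficient
        if np.sum(block ** 2) > c * sigma ** 2 * block.size:
            out[start:start + block.size] = block
    return out
```

A block of moderately sized coefficients can thus survive collectively even when each member would fall below a term-by-term threshold, which is one intuition behind the improved mean squared error performance.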
Analytical form for a Bayesian wavelet estimator of images using the Bessel K forms densities
IEEE Transactions on Image Processing
Cited by 12 (2 self)
Abstract
A novel Bayesian nonparametric estimator in the wavelet domain is presented. In this approach, a prior model is imposed on the wavelet coefficients, designed to capture the sparseness of the wavelet expansion. Seeking probability models for the marginal densities of the wavelet coefficients, the new family of Bessel K forms (BKF) densities is shown to fit the observed histograms very well. Exploiting this prior, we designed a Bayesian nonlinear denoiser and we derived a closed form for its expression. We then compared it to other priors that have been introduced in the literature, such as the generalized Gaussian density (GGD) or the α-stable models, where no analytical form is available for the corresponding Bayesian denoisers. Specifically, the BKF model turns out to be a good compromise between these two extreme cases (hyperbolic tails for the α-stable and exponential tails for the GGD). Moreover, we demonstrate a high degree of match between observed and estimated prior densities using the BKF model. Finally, a comparative study is carried out to show the effectiveness of our denoiser, which clearly outperforms the classical shrinkage or thresholding wavelet-based techniques. Index Terms—Bayesian denoiser, Bessel K forms (BKF), posterior conditional mean, wavelets.
Γ-minimax wavelet shrinkage: a robust incorporation of information about energy of a signal in denoising applications
Statistica Sinica, 2004
Cited by 7 (1 self)
Abstract
In this paper we propose a method for wavelet filtering of noisy signals when prior information about the L²-energy of the signal of interest is available. Assuming the independence model, according to which the wavelet coefficients are treated individually, we propose a level-dependent shrinkage rule that turns out to be the Γ-minimax rule for a suitable class, say Γ, of realistic priors on the wavelet coefficients. The proposed methodology is particularly well suited for denoising tasks where the signal-to-noise ratio is low, and it is illustrated on a battery of standard test functions. Performance comparisons with some other methods existing in the literature are provided. An example in atomic force microscopy (AFM) is also discussed.