Results 1 - 8 of 8
Nonparametric empirical Bayes and compound decision approaches to estimation of a high-dimensional vector of normal means
, 2007
Abstract

Cited by 27 (10 self)
We consider the classical problem of estimating a vector µ = (µ1, ..., µn) based on independent observations Yi ∼ N(µi, 1), i = 1, ..., n. Suppose µi, i = 1, ..., n, are independent realizations from a completely unknown distribution G. We suggest an easily computed estimator µ̂ such that the ratio of its risk E‖µ̂ − µ‖² to that of the Bayes procedure approaches 1. A related compound decision result is also obtained. Our asymptotics is of a triangular array; that is, we allow the distribution G to depend on n. Thus, our theoretical asymptotic results are also meaningful in situations where the vector µ is sparse and the proportion of zero coordinates approaches 1. We demonstrate the performance of our estimator in simulations, emphasizing sparse setups. In "moderately sparse" situations, our procedure performs very well compared with known procedures tailored for sparse setups. It also adapts well to nonsparse situations.
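The f'/f form behind such estimators can be sketched with a Gaussian kernel estimate of the marginal density. This is a hedged illustration of the Tweedie-formula idea only, not the paper's exact procedure; the bandwidth rule and the name `eb_normal_means` are our own choices:

```python
import numpy as np

def eb_normal_means(y, h=None):
    """Tweedie-type estimate mu_hat_i = y_i + f'(y_i)/f(y_i), where f
    is a Gaussian kernel estimate of the marginal density of the Y_i.
    The rule-of-thumb bandwidth below is an illustrative choice only.
    """
    y = np.asarray(y, dtype=float)
    n = y.size
    if h is None:
        h = 1.06 * y.std() * n ** (-1 / 5)  # Silverman's rule of thumb
    d = (y[:, None] - y[None, :]) / h
    k = np.exp(-0.5 * d ** 2)           # unnormalized Gaussian kernel
    f = k.sum(axis=1)                   # proportional to f(y_i)
    df = (-d * k).sum(axis=1) / h       # proportional to f'(y_i)
    return y + df / f

# Sparse setup: 90% of the means are exactly zero.
rng = np.random.default_rng(0)
mu = np.concatenate([np.zeros(900), rng.normal(5.0, 1.0, 100)])
y = mu + rng.normal(size=mu.size)
mu_hat = eb_normal_means(y)
print(np.mean((mu_hat - mu) ** 2), np.mean((y - mu) ** 2))
```

In this sparse setup the nonlinear correction shrinks the near-zero coordinates strongly toward 0 while leaving the strong signals near their observations, which is why it can beat the naive estimator µ̂ = Y.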
The Poisson Compound Decision Problem Revisited
Abstract

Cited by 3 (2 self)
The compound decision problem for a vector of independent Poisson random variables with possibly different means has a half-century-old solution. However, it appears that the classical solution needs a smoothing adjustment even when there are many observations and relatively small means, such that the empirical distribution is close to its mean. We discuss three such adjustments. We also present another approach that first transforms the problem into the normal compound decision problem.
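For context, the half-century-old solution referred to is Robbins' frequency-ratio rule; a minimal sketch follows (the smoothing adjustments the paper discusses are not implemented here, and the function name is ours):

```python
import numpy as np
from collections import Counter

def robbins_poisson(y):
    """Robbins' classical compound estimator for Poisson means:
    mu_hat(y_i) = (y_i + 1) * #{j : y_j = y_i + 1} / #{j : y_j = y_i}.
    A zero count in the numerator forces the estimate 0, the kind of
    instability that smoothing adjustments are meant to repair.
    """
    counts = Counter(int(v) for v in y)
    return np.array([(yi + 1) * counts.get(yi + 1, 0) / counts[yi]
                     for yi in map(int, y)])

rng = np.random.default_rng(1)
mu = rng.gamma(2.0, 1.0, size=2000)   # unknown, possibly different means
y = rng.poisson(mu)
mu_hat = robbins_poisson(y)
print(np.mean((mu_hat - mu) ** 2), np.mean((y - mu) ** 2))
```

The denominator count is always positive for an observed value, but the ratio of raw empirical frequencies is noisy in the tails, which motivates smoothing even at moderate sample sizes.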
Bayesian Perspectives on Sparse Empirical Bayes Analysis (SEBA)
, 2010
Abstract
We consider the joint processing of n independent, similar sparse regression problems. Each is based on a sample (yi1, xi1), ..., (yim, xim) of m i.i.d. observations from yi1 = xi1′βi + εi1, yi1 ∈ R, xi1 ∈ R^p, and εi1 ∼ N(0, σ²), say. The dimension p is large enough that the empirical risk minimizer is not feasible. We consider, from a Bayesian point of view, three possible extensions of the lasso. Each of the three estimators, the lassoes, the group lasso, and the RING lasso, utilizes different assumptions on the relation between the n vectors β1, ..., βn. "... and only a star or two set sparsedly in the vault of heaven; and you will find a sight as stimulating as the hoariest summit of the Alps." (R. L. Stevenson)
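The contrast between separate and joint penalization can be illustrated with off-the-shelf tools. The sketch below assumes, purely for convenience, that all n problems share one design matrix (the paper allows a different design per problem) and uses scikit-learn's MultiTaskLasso as a stand-in for a group-lasso fit:

```python
import numpy as np
from sklearn.linear_model import Lasso, MultiTaskLasso

# Illustrative setup: n regression problems sharing one design matrix X
# and a common sparsity pattern across the coefficient vectors beta_i.
rng = np.random.default_rng(2)
m, p, n = 50, 200, 5
X = rng.normal(size=(m, p))
B = np.zeros((p, n))
B[:5] = rng.normal(size=(5, n))            # only the first 5 rows are active
Y = X @ B + 0.1 * rng.normal(size=(m, n))

# Separate lassoes: each beta_i is penalized on its own.
sep = np.column_stack([Lasso(alpha=0.05).fit(X, Y[:, i]).coef_
                       for i in range(n)])
# Group (multi-task) lasso: entire rows of B are kept or zeroed jointly.
grp = MultiTaskLasso(alpha=0.05).fit(X, Y).coef_.T
print(sep.shape, grp.shape)
```

The joint fit encodes one possible "relation between the n vectors": a shared support. The lassoes and the RING lasso of the paper encode weaker and different relations, which this sketch does not attempt to reproduce.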
Compound Decision in the Presence of Proxies
Abstract
We study the problem of incorporating covariates in a compound decision setup. It is desired to estimate the means of n response variables, which are independent and normally distributed, and each is accompanied by a vector of covariates. We suggest a method that involves nonparametric empirical Bayes techniques and may be viewed as a generalization of the celebrated Fay-Herriot (1979) method. Some optimality properties of our method are proved. We also compare it numerically with Fay-Herriot and other methods in a real data situation, where the goal is to estimate certain proportions in many small areas (Statistical Areas). We also demonstrate our approach on the baseball data set originally analyzed by Brown (2010).
Key words and phrases: compound decision, empirical Bayes
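For reference, a minimal sketch of the normal-theory Fay-Herriot shrinkage that the paper generalizes. The Prasad-Rao moment estimator of the model variance used below is one standard choice, and the function name `fay_herriot` is ours:

```python
import numpy as np

def fay_herriot(y, X, D):
    """Basic Fay-Herriot (1979)-style small-area estimator.

    Model: y_i = x_i' beta + u_i + e_i, with u_i ~ N(0, A) and
    e_i ~ N(0, D_i), the sampling variances D_i known.  A is estimated
    by a simple Prasad-Rao moment estimator; the paper's nonparametric
    method replaces the normal assumption on the u_i.
    """
    y, D = np.asarray(y, float), np.asarray(D, float)
    X = np.asarray(X, float)
    n, p = X.shape
    # OLS residuals give a moment estimate of the model variance A.
    b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ b_ols
    H = X @ np.linalg.inv(X.T @ X) @ X.T
    A = max(0.0, (r @ r - D @ (1 - np.diag(H))) / (n - p))
    # GLS for beta, then shrink each y_i toward its regression fit.
    w = 1.0 / (A + D)
    b = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    gamma = A / (A + D)
    return X @ b + gamma * (y - X @ b)

rng = np.random.default_rng(3)
n = 400
X = np.column_stack([np.ones(n), rng.normal(size=n)])
D = rng.uniform(0.5, 2.0, size=n)               # known sampling variances
theta = X @ np.array([1.0, 2.0]) + rng.normal(size=n)   # true A = 1
y = theta + rng.normal(size=n) * np.sqrt(D)
theta_hat = fay_herriot(y, X, D)
print(np.mean((theta_hat - theta) ** 2), np.mean((y - theta) ** 2))
```

Areas with large sampling variance D_i are shrunk more heavily toward the regression fit x_i'β̂, which is the behavior the nonparametric generalization retains while dropping the normal prior on the area effects.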
Empirical Bayes improvement of Kalman filter type of estimators
Abstract
We consider the problem of estimating the means µi of n random variables Yi ∼ N(µi, 1), i = 1, ..., n. Assuming some structure on the µ process, e.g., a state-space model, one may use a summary statistic for the contribution of the rest of the observations to the estimation of µi. The most important example is the Kalman filter. We introduce a nonlinear improvement of the standard weighted average of the given summary statistic and Yi itself, using empirical Bayes methods. The improvement is obtained under mild assumptions. It is strict when the process that governs the states µ1, ..., µn is not a linear Gaussian state-space model. We consider both the sequential and the retrospective estimation problems.
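The linear rule being improved is visible in the simplest state-space case, the local-level model, where the Kalman filter update is exactly a weighted average of the past summary and Yi. A minimal sketch follows (the paper's nonlinear empirical Bayes correction is not implemented here):

```python
import numpy as np

def local_level_filter(y, q, r=1.0):
    """Standard Kalman filter for the local-level model
        mu_t = mu_{t-1} + eta_t,  eta_t ~ N(0, q),
        y_t  = mu_t + eps_t,      eps_t ~ N(0, r).
    Each filtered estimate is a weighted average of the prediction from
    the past (the 'summary statistic') and y_t itself; the gain K is
    the weight placed on the current observation.
    """
    y = np.asarray(y, float)
    mu, P = y[0], r          # simple initialization at the first point
    out = []
    for yt in y:
        P_pred = P + q                    # predict step
        K = P_pred / (P_pred + r)         # Kalman gain
        mu = mu + K * (yt - mu)           # weighted-average update
        P = (1 - K) * P_pred
        out.append(mu)
    return np.array(out)

rng = np.random.default_rng(4)
n, q, r = 2000, 0.1, 1.0
mu = np.cumsum(rng.normal(0.0, np.sqrt(q), size=n))   # random-walk states
y = mu + rng.normal(0.0, np.sqrt(r), size=n)
mu_hat = local_level_filter(y, q, r)
print(np.mean((mu_hat - mu) ** 2), np.mean((y - mu) ** 2))
```

When the state process is truly linear Gaussian, as here, this weighted average is already optimal; the paper's point is that a nonlinear empirical Bayes adjustment strictly improves on it whenever that assumption fails.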