Results 1–9 of 9
Universal Discrete Denoising: Known Channel
 IEEE Trans. Inform. Theory, 2003
Abstract

Cited by 79 (32 self)
A discrete denoising algorithm estimates the input sequence to a discrete memoryless channel (DMC) based on the observation of the entire output sequence. For the case in which the DMC is known and the quality of the reconstruction is evaluated with a given single-letter fidelity criterion, we propose a discrete denoising algorithm that does not assume knowledge of statistical properties of the input sequence. Yet, the algorithm is universal in the sense of asymptotically performing as well as the optimum denoiser that knows the input sequence distribution, which is only assumed to be stationary and ergodic. Moreover, the algorithm is also universal in a semi-stochastic setting, in which the input is an individual sequence and the randomness is due solely to the channel noise.
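The algorithm this abstract describes is the Discrete Universal DEnoiser (DUDE). As a rough illustration of its two-pass rule, here is a minimal numpy sketch, assuming symbols are coded as integers 0..|Z|−1 and that the channel matrix `Pi` is square and invertible; the function and variable names are ours, not the paper's:

```python
import numpy as np

def dude(y, Pi, Lam, k=1):
    """Sketch of the DUDE rule: two passes over the noisy sequence y.

    Pass 1 counts, for every two-sided length-k context, how often each
    noisy symbol appears at the centre.  Pass 2 replaces each symbol with
    the reconstruction minimising the estimated expected loss, using the
    known channel matrix Pi (Pi[x, z] = P(z | x)) and the loss matrix
    Lam (Lam[x, xh] = loss of outputting xh when x is the clean symbol).
    """
    y = list(y)
    n = len(y)
    counts = {}
    for i in range(k, n - k):
        c = (tuple(y[i - k:i]), tuple(y[i + 1:i + 1 + k]))
        m = counts.setdefault(c, np.zeros(Pi.shape[1]))
        m[y[i]] += 1                     # count of noisy symbol y[i] in context c

    Pi_inv_T = np.linalg.inv(Pi).T
    out = list(y)                        # boundary symbols are left as observed
    for i in range(k, n - k):
        c = (tuple(y[i - k:i]), tuple(y[i + 1:i + 1 + k]))
        q = Pi_inv_T @ counts[c]         # unnormalised estimate of clean-symbol counts
        z = y[i]
        scores = [q @ (Pi[:, z] * Lam[:, xh]) for xh in range(Lam.shape[1])]
        out[i] = int(np.argmin(scores))
    return out
```

On a binary symmetric channel with Hamming loss, the rule flips isolated symbols that disagree with what their context has overwhelmingly produced elsewhere in the sequence.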
Universal filtering via prediction
 IEEE Trans. Inform. Theory, 2007
Abstract

Cited by 6 (3 self)
We consider the filtering problem, where a finite-alphabet individual sequence is corrupted by a discrete memoryless channel, and the goal is to causally estimate each sequence component based on the past and present noisy observations. We establish a correspondence between the filtering problem and the problem of prediction of individual sequences, which leads to the following result: given an arbitrary finite set of filters, there exists a filter which performs, with high probability, essentially as well as the best in the set, regardless of the underlying noiseless individual sequence. We use this relationship between the problems to derive a filter guaranteed to attain the “finite-state filterability” of any individual sequence by leveraging results from the prediction problem.
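The prediction side of this correspondence can be illustrated with the classical exponentially weighted forecaster. The sketch below is a generic version under squared loss with deterministic mixing; it is not the paper's filtering construction, which must work from the noisy observations via unbiased loss estimates. All names are illustrative:

```python
import numpy as np

def exp_weighted_forecaster(y, experts, eta=2.0):
    """Exponentially weighted forecaster over a finite set of experts.

    experts[j] is the list of per-step predictions of expert j; at each
    step the forecaster outputs a weighted average of the experts'
    predictions, with weight exp(-eta * cumulative squared loss).
    """
    experts = np.asarray(experts, float)   # shape (num_experts, T)
    y = np.asarray(y, float)
    cum_loss = np.zeros(experts.shape[0])
    preds = []
    for t in range(len(y)):
        w = np.exp(-eta * cum_loss)
        w /= w.sum()                       # normalised expert weights
        preds.append(float(w @ experts[:, t]))
        cum_loss += (experts[:, t] - y[t]) ** 2   # update after y[t] is revealed
    return preds
```

Against a constant sequence, the weight mass concentrates exponentially fast on the expert with the smallest cumulative loss, which is the mechanism behind "as well as the best in the set" guarantees.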
Asymptotic efficiency of simple decisions for the compound decision problem
Abstract

Cited by 4 (3 self)
We consider the compound decision problem of estimating a vector of n parameters, known up to a permutation, corresponding to n independent observations, and discuss the difference between two symmetric classes of estimators. The first and larger class is restricted to the set of all permutation-invariant estimators. The second class is restricted further to simple symmetric procedures, that is, estimators such that each parameter is estimated by a function of the corresponding observation alone. We show that under mild conditions, the minimal total squared-error risks over these two classes are asymptotically equivalent up to an essentially O(1) difference.
General Maximum Likelihood Empirical Bayes Estimation of Normal Means
, 908
Abstract

Cited by 4 (0 self)
We propose a general maximum likelihood empirical Bayes (GMLEB) method for the estimation of a mean vector based on observations with i.i.d. normal errors. We prove that under mild moment conditions on the unknown means, the average mean squared error (MSE) of the GMLEB is within an infinitesimal fraction of the minimum average MSE among all separable estimators which use a single deterministic estimating function on individual observations, provided that the risk is of greater order than (log n)^5/n. We also prove that the GMLEB is uniformly approximately minimax in regular and weak ℓ_p balls when the order of the length-normalized norm of the unknown means is between (log n)^{κ_1}/n …
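A minimal sketch of the GMLEB recipe, assuming unit noise variance: fit a discrete prior supported on a grid by nonparametric maximum likelihood (here via EM fixed-point updates), then estimate each mean by its posterior mean. The grid size, iteration count, and function name are our illustrative choices, not the paper's:

```python
import numpy as np

def gmleb(y, grid_size=100, iters=200):
    """Nonparametric-MLE empirical Bayes for normal means (unit variance).

    Fits mixing weights w over a fixed grid of candidate means by EM,
    then returns the posterior-mean estimate of each observation's mean.
    """
    y = np.asarray(y, float)
    grid = np.linspace(y.min(), y.max(), grid_size)
    # likelihood matrix: L[i, j] = standard normal density of y_i - grid_j
    L = np.exp(-0.5 * (y[:, None] - grid[None, :]) ** 2) / np.sqrt(2 * np.pi)
    w = np.full(grid_size, 1.0 / grid_size)       # initial uniform prior weights
    for _ in range(iters):
        post = L * w                              # unnormalised posteriors
        post /= post.sum(axis=1, keepdims=True)
        w = post.mean(axis=0)                     # EM update of the mixing weights
    post = L * w
    post /= post.sum(axis=1, keepdims=True)
    return post @ grid                            # posterior-mean estimates
```

With well-separated clusters of observations, the fitted prior concentrates near the cluster centres, so each estimate is shrunk toward its own cluster rather than toward a global mean.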
Sparse Empirical Bayes Analysis (SEBA)
, 2010
Abstract
We consider a joint processing of n independent sparse regression problems. Each is based on a sample (y_{i1}, x_{i1}), …, (y_{im}, x_{im}) of m i.i.d. observations from y_{i1} = x_{i1}^T β_i + ε_{i1}, y_{i1} ∈ R, x_{i1} ∈ R^p, i = 1, …, n, and ε_{i1} ∼ N(0, σ^2), say. p is large enough so that the empirical risk minimizer is not consistent. We consider three possible extensions of the lasso estimator to deal with this problem, the lassoes, the group lasso, and the RING lasso, each utilizing a different assumption about how these problems are related. For each estimator we give a Bayesian interpretation, and we present both persistency analysis and nonasymptotic error bounds based on restricted-eigenvalue-type assumptions. “… and only a star or two set sparsedly in the vault of heaven; and you will find a sight as stimulating as the hoariest summit of the Alps.” R. L. Stevenson
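One way to make the group-lasso member of this family concrete: its proximal operator is block soft-thresholding, applied to each coefficient group. A small sketch (the function name and numpy implementation are ours):

```python
import numpy as np

def group_soft_threshold(b, tau):
    """Block soft-thresholding: the proximal operator of the group-lasso
    penalty tau * ||b||_2, applied to a single coefficient group b.

    Groups whose Euclidean norm falls below tau are zeroed out entirely;
    the rest are shrunk radially toward the origin.
    """
    norm = np.linalg.norm(b)
    if norm <= tau:
        return np.zeros_like(b)
    return (1 - tau / norm) * b
```

This all-or-nothing behaviour at the group level is what encodes the assumption that the n regression problems share a common sparsity pattern.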
Bayesian Perspectives on Sparse Empirical Bayes Analysis (SEBA)
, 2010
Abstract
We consider a joint processing of n independent similar sparse regression problems. Each is based on a sample (y_{i1}, x_{i1}), …, (y_{im}, x_{im}) of m i.i.d. observations from y_{i1} = x_{i1}^T β_i + ε_{i1}, y_{i1} ∈ R, x_{i1} ∈ R^p, and ε_{i1} ∼ N(0, σ^2), say. The dimension p is large enough so that the empirical risk minimizer is not feasible. We consider, from a Bayesian point of view, three possible extensions of the lasso. Each of the three estimators, the lassoes, the group lasso, and the RING lasso, utilizes different assumptions on the relation between the n vectors β_1, …, β_n. “… and only a star or two set sparsedly in the vault of heaven; and you will find a sight as stimulating as the hoariest summit of the Alps.” R. L. Stevenson
unknown title
, 802
Abstract
Asymptotic efficiency of simple decisions for the compound decision problem
6. (algorithm) Discrete Universal DEnoiser
Abstract
3. (slang) A term of address for a man. 4. (archaic) A dandy, a man who is very concerned about his dress and appearance. 5. (slang) A cool person of either sex.
A Conversation with Jim Hannan
 DOI: 10.1214/09-STS283 © Institute of Mathematical Statistics, 2010
Abstract
Jim Hannan is a professor who has lived an interesting life and one whose fundamental research in repeated games was not fully appreciated until late in his career. During his service as a meteorologist in the Army in World War II, Jim played poker and made weather forecasts. It is curious that his later research included strategies for repeated play that apply to selecting the best forecaster.