An Invariant Bayesian Model Selection Principle for Gaussian Data
We develop a code-length principle that is invariant under reparameterization of the model distributions. For Gaussian likelihood models we provide an invariant approximation formula that makes the marginal distribution easy to compute. We give invariant estimators of the model parameters and formulate conditions under which these estimators are essentially a posteriori unbiased for Gaussian models. An upper bound on the coarseness of the discretization of the model parameters is deduced. We introduce a discrimination measure between probability distributions and use it to construct probability distributions on model classes. The total code length is shown to be closely related to the normalized maximum likelihood (NML) code length of Rissanen when a Jeffreys prior on the model parameters is combined with a uniform prior on the model classes. The model selection principle is applied to a Gaussian estimation problem for data in a wavelet representation, and its performance is tested and compared against alternative wavelet-based estimation methods in numerical experiments.