Results 1 – 10 of 48
From Laplace To Supernova Sn 1987a: Bayesian Inference In Astrophysics
, 1990
Abstract

Cited by 51 (2 self)
The Bayesian approach to probability theory is presented as an alternative to the currently used long-run relative frequency approach, which does not offer clear, compelling criteria for the design of statistical methods. Bayesian probability theory offers unique and demonstrably optimal solutions to well-posed statistical problems, and is historically the original approach to statistics. The reasons for earlier rejection of Bayesian methods are discussed, and it is noted that the work of Cox, Jaynes, and others answers earlier objections, giving Bayesian inference a firm logical and mathematical foundation as the correct mathematical language for quantifying uncertainty. The Bayesian approaches to parameter estimation and model comparison are outlined and illustrated by application to a simple problem based on the Gaussian distribution. As further illustrations of the Bayesian paradigm, Bayesian solutions to two interesting astrophysical problems are outlined: the measurement of wea...
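The "simple problem based on the Gaussian distribution" that the abstract mentions can be sketched as a conjugate Bayesian update for a Gaussian mean with known variance. The function name and the prior values below are illustrative choices of ours, not taken from the paper:

```python
import numpy as np

# Conjugate Bayesian update for the mean of a Gaussian with known variance.
# Prior: mu ~ N(mu0, tau0^2); likelihood: x_i ~ N(mu, sigma^2).
def posterior_mean_update(data, sigma2, mu0=0.0, tau02=10.0):
    n = len(data)
    xbar = np.mean(data)
    # Posterior precision is the sum of prior and data precisions.
    post_prec = 1.0 / tau02 + n / sigma2
    post_var = 1.0 / post_prec
    post_mean = post_var * (mu0 / tau02 + n * xbar / sigma2)
    return post_mean, post_var

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=100)  # simulated measurements
m, v = posterior_mean_update(data, sigma2=1.0)   # posterior concentrates near 2.0
```

With 100 observations the data dominate the weak prior, so the posterior mean sits close to the sample mean and the posterior variance shrinks roughly as sigma^2 / n.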
Maximum Entropy MIMO Wireless Channel Models with Limited Information
 in Proc. MATHMOD Conference on Mathematical Modeling
, 2006
Abstract

Cited by 9 (6 self)
In this contribution, models of wireless channels are derived from the maximum entropy principle, for several cases where only limited information about the propagation environment is available. First, analytical models are derived for the cases where certain parameters (channel energy, average energy, spatial correlation matrix) are known deterministically. Frequently, these parameters are unknown (typically because the received energy or the spatial correlation varies with the user position), but still known to represent meaningful system characteristics. In these cases, analytical channel models are derived by assigning entropy-maximizing distributions to these parameters, and marginalizing them out. For the MIMO case with spatial correlation, we show that the distribution of the covariance matrices is conveniently handled through its eigenvalues. The entropy-maximizing distribution of the covariance matrix is shown to be a Wishart distribution. Furthermore, the corresponding probability density function of the channel matrix is shown to be described analytically by a function of the channel Frobenius norm. This technique can provide channel models incorporating the effect of shadow fading and spatial correlation between antennas without the need to assume explicit values for these parameters. The results are compared in terms of mutual information to the classical i.i.d. Gaussian model.
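The construction the abstract describes, a Wishart-distributed spatial covariance marginalized out of a conditionally Gaussian channel, can be sampled by Monte Carlo. This is a sketch under our own illustrative dimensions and degrees of freedom, not the paper's model:

```python
import numpy as np

# Hypothetical sketch: draw a spatial covariance from a Wishart distribution
# (sum of outer products of i.i.d. Gaussian vectors), then draw a MIMO channel
# conditioned on it (H = R^{1/2} G). Repeating this approximates the marginal
# channel law by Monte Carlo.
def sample_channel(n_rx, n_tx, dof, rng):
    A = rng.normal(size=(n_rx, dof)) / np.sqrt(dof)
    R = A @ A.T                                   # Wishart sample, E[R] = I
    L = np.linalg.cholesky(R + 1e-12 * np.eye(n_rx))
    G = rng.normal(size=(n_rx, n_tx))             # i.i.d. Gaussian given R
    return L @ G                                  # spatially correlated channel

rng = np.random.default_rng(1)
H = sample_channel(4, 4, dof=8, rng=rng)
```

Averaging a statistic such as the Frobenius norm over many draws of `H` gives the marginal behavior that the abstract characterizes analytically.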
Approximate Inference for Robust Gaussian Process Regression
, 2005
Abstract

Cited by 4 (0 self)
Gaussian process (GP) priors have been successfully used in nonparametric Bayesian regression and classification models. Inference can be performed analytically only for the regression model with Gaussian noise. For all other likelihood models inference is intractable and various approximation techniques have been proposed. In recent years expectation propagation (EP) has been developed as a general method for approximate inference. This article provides a general summary of how expectation propagation can be used for approximate inference in Gaussian process models. Furthermore we present a case study describing its implementation for a new robust variant of Gaussian process regression. To gain further insights into the quality of the EP approximation we present experiments in which we compare to results obtained by Markov chain Monte Carlo (MCMC) sampling.
1 Introduction – Robustness & Bayesian Regression
To solve a real-world regression problem the analyst should carefully screen the data and use all prior information at hand in order to choose an appropriate regression model. The model is selected so as to approximate the beliefs about the data generating process. A mismatch seems unavoidable in practice. Robust regression methods can be understood as attempts to limit undesired distractions and distortions
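The one analytically tractable case the abstract singles out, GP regression with Gaussian noise, can be sketched in a few lines. The squared-exponential kernel and all hyperparameter values here are our own illustrative choices:

```python
import numpy as np

# Exact GP regression with Gaussian noise: the posterior mean and covariance
# have closed forms, unlike the robust likelihoods that require EP or MCMC.
def rbf(x1, x2, lengthscale=1.0, variance=1.0):
    d2 = (x1[:, None] - x2[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_predict(x_train, y_train, x_test, noise=1e-2):
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_test, x_train)
    Kss = rbf(x_test, x_test)
    alpha = np.linalg.solve(K, y_train)            # K^{-1} y
    mean = Ks @ alpha                              # posterior mean
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)      # posterior covariance
    return mean, cov

x = np.linspace(0, 5, 20)
y = np.sin(x)                                      # noiseless toy targets
mu, cov = gp_predict(x, y, np.array([2.5]))
```

For non-Gaussian (e.g. heavy-tailed, robust) likelihoods the `np.linalg.solve` steps above no longer yield the exact posterior, which is where EP enters as an approximation.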
Moustakas, “A Maximum Entropy Characterization of Spatially Correlated
 MIMO Wireless Channels,” in Proc. IEEE Wireless Communications and Networking Conference (WCNC), Hong Kong
, 2007
Abstract

Cited by 3 (2 self)
We investigate the problem of establishing the joint probability distribution of the entries of a Multiple-Input Multiple-Output (MIMO) spatially correlated flat-fading channel, when little or no information about the channel properties is available. We show that the entropy of a random positive semidefinite matrix is maximized by the Wishart distribution. We subsequently obtain the Maximum Entropy distribution of the MIMO transfer matrix by establishing its distribution conditioned on the covariance, and by later marginalizing over the covariance matrix. The obtained distribution is isotropic, and is described analytically as a function of the Frobenius norm of the channel matrix.
Comparison of maximum entropy and higher-order entropy estimators
 Journal of Econometrics 107 (2002) 195
Abstract

Cited by 3 (0 self)
We show that the generalized maximum entropy (GME) is the only estimation method that is consistent with a set of five axioms. The GME estimator can be nested using a single parameter, α, into two more general classes of estimators: GME-α estimators. Each of these GME-α estimators violates one of the five basic axioms. However, small-sample simulations demonstrate that certain members of these GME-α classes of estimators may outperform the GME estimation rule.
An Efficient Robust Concept Exploration Method and Sequential Exploratory Experimental Design
, 2004
Resurrecting Logical Probability
 Erkenntnis
, 2001
Abstract

Cited by 3 (0 self)
The logical interpretation of probability, or “objective Bayesianism” – the theory that (some) probabilities are strictly logical degrees of partial implication – is defended. The main argument against it is that it requires the assignment of prior probabilities, and that any attempt to determine them by symmetry via a “principle of insufficient reason” inevitably leads to paradox. Three replies are advanced: that priors are imprecise or of little weight, so that disagreement about them does not matter, within limits; that it is possible to distinguish reasonable from unreasonable priors on logical grounds; and that in real cases disagreement about priors can usually be explained by differences in the background information. It is argued also that proponents of alternative conceptions of probability, such as frequentists, Bayesians and Popperians, are unable to avoid committing themselves to the basic principles of logical probability.
Field theory entropy, the H-theorem and the renormalization group, Phys. Rev. D54
, 1996
Abstract

Cited by 2 (2 self)
We consider entropy and relative entropy in field theory and establish relevant monotonicity properties with respect to the couplings. The relative entropy in a field theory with a hierarchy of renormalization group fixed points ranks the fixed points, the lowest relative entropy being assigned to the highest multicritical point. We argue that as a consequence of a generalized H-theorem, Wilsonian RG flows induce an increase in entropy, and we propose the relative entropy as the natural quantity which increases from one fixed point to another in more than two dimensions. The concept of entropy was introduced by Clausius through the study of thermodynamical systems. However, it was Boltzmann’s essential discovery that entropy is the natural quantity that bridges the microscopic and macroscopic descriptions of a system which gave it its modern interpretation. A more general definition, proposed by Gibbs, allowed
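The ranking quantity in the abstract is the relative entropy. A generic discrete sketch of it, not tied to any particular field theory, shows the two properties the ranking relies on: it vanishes when the two distributions coincide and is positive otherwise:

```python
import numpy as np

# Relative entropy S(p || q) = sum_i p_i log(p_i / q_i): nonnegative, and
# zero exactly when p = q. Here q plays the role of a reference
# (e.g. maximum-entropy) distribution against which p is ranked.
def relative_entropy(p, q):
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0                     # 0 * log 0 contributes nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

uniform = np.ones(4) / 4             # maximum-entropy reference
peaked = np.array([0.7, 0.1, 0.1, 0.1])
```

In the paper's setting the distributions are those of field configurations at different couplings, and monotonicity of this quantity along the RG flow does the ranking.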
Reliability of Computational Science
, 2006
Abstract

Cited by 2 (0 self)
Today’s computers allow us to simulate large, complex physical problems. Often, the mathematical models describing such problems are based on a relatively small amount of available information, such as experimental measurements. The question arises whether the computed data can be used as the basis for decisions in critical engineering, economic, and medical applications. A representative list of engineering accidents that occurred in past years, together with their causes, illustrates the question. The paper describes a general framework for Verification and Validation (V&V) which addresses this question. The framework is then applied to an illustrative engineering problem, in which the basis for decision is a specific quantity of interest, namely the probability that the quantity does not exceed a given value. The V&V framework is applied and explained in detail. The result of the analysis is the computation of the failure probability as well as a quantification of the confidence in the computation, depending on the amount of available experimental data.
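The quantity of interest the abstract describes, a failure probability together with a confidence that depends on the amount of data, can be sketched by Monte Carlo. The distribution, threshold, and normal-approximation interval below are our own illustrative stand-ins, not the paper's V&V procedure:

```python
import numpy as np

# Estimate P(Q > q_crit) from samples of the quantity of interest Q, and
# attach a normal-approximation confidence half-width that shrinks as 1/sqrt(n),
# echoing the dependence of confidence on the amount of available data.
def failure_probability(samples, q_crit, z=1.96):
    samples = np.asarray(samples)
    n = len(samples)
    p_hat = float(np.mean(samples > q_crit))
    half_width = z * np.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat, half_width

rng = np.random.default_rng(2)
q = rng.normal(loc=0.0, scale=1.0, size=10000)   # simulated quantity of interest
p, hw = failure_probability(q, 1.645)            # true tail probability ~ 0.05
```

Halving the data roughly multiplies the interval half-width by sqrt(2), which is the sense in which confidence in the computed failure probability depends on how much experimental data is available.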