Results 1–7 of 7
From Laplace to Supernova SN 1987A: Bayesian Inference in Astrophysics
, 1990
Abstract

Cited by 52 (2 self)
The Bayesian approach to probability theory is presented as an alternative to the currently used long-run relative frequency approach, which does not offer clear, compelling criteria for the design of statistical methods. Bayesian probability theory offers unique and demonstrably optimal solutions to well-posed statistical problems, and is historically the original approach to statistics. The reasons for earlier rejection of Bayesian methods are discussed, and it is noted that the work of Cox, Jaynes, and others answers earlier objections, giving Bayesian inference a firm logical and mathematical foundation as the correct mathematical language for quantifying uncertainty. The Bayesian approaches to parameter estimation and model comparison are outlined and illustrated by application to a simple problem based on the Gaussian distribution. As further illustrations of the Bayesian paradigm, Bayesian solutions to two interesting astrophysical problems are outlined: the measurement of wea...
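The Gaussian illustration this abstract alludes to can be sketched in a few lines. This is a minimal example, not the paper's actual analysis: assuming a known noise standard deviation and a flat prior on the mean, the posterior for the mean is itself Gaussian with width sigma/sqrt(N).

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0                          # assumed known noise standard deviation
data = rng.normal(5.0, sigma, size=50)

# With a flat prior on the mean mu, the posterior p(mu | data) is
# Gaussian, centered on the sample mean with width sigma / sqrt(N).
xbar = data.mean()
post_sd = sigma / np.sqrt(data.size)

# 95% credible interval for mu
lo, hi = xbar - 1.96 * post_sd, xbar + 1.96 * post_sd
print(f"posterior mean {xbar:.2f}, 95% interval ({lo:.2f}, {hi:.2f})")
```

The same machinery extends to model comparison by integrating the likelihood over each model's parameters, which is where Ockham-factor effects enter.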
The Maximum Entropy Approach and Probabilistic IR Models
 ACM TRANSACTIONS ON INFORMATION SYSTEMS
, 1998
Abstract

Cited by 12 (0 self)
The Principle of Maximum Entropy is discussed, and two classic probabilistic models of information retrieval, the Binary Independence Model of Robertson and Sparck Jones and the Combination Match Model of Croft and Harper, are derived using the maximum entropy approach. The assumptions on which the classical models are based are not made. In their place, the probability distribution of maximum entropy consistent with a set of constraints is determined. It is argued that this subjectivist approach is more philosophically coherent than the frequentist conceptualization of probability that is often assumed as the basis of probabilistic modeling, and that this philosophical stance has important practical consequences with respect to the realization of information retrieval research.
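As a small illustration of the principle itself (not of the IR models derived in the paper), the maximum-entropy distribution on a finite outcome space subject to a mean constraint takes an exponential form, and the Lagrange multiplier can be solved numerically. This is Jaynes' well-known dice example:

```python
import numpy as np
from scipy.optimize import brentq

# Maximum-entropy distribution on outcomes 1..6 constrained to have
# mean 4.5 (Jaynes' "Brandeis dice" example).  The maxent solution has
# the exponential form p_k ∝ exp(lambda * k); solve for lambda so the
# constraint holds.
x = np.arange(1, 7)
target_mean = 4.5

def mean_error(lam):
    w = np.exp(lam * x)
    return (w * x).sum() / w.sum() - target_mean

lam = brentq(mean_error, -5.0, 5.0)   # root-find the multiplier
p = np.exp(lam * x)
p /= p.sum()
print(p)   # probabilities increase with k, since the constrained mean > 3.5
```

With equality constraints on expectations, the same recipe gives the exponential-family form that underlies the derivations in the paper.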
How To Separate The Signal From The Background
, 1996
Abstract
In many applications, such as radio astronomy and electron spectroscopy, three sources contribute to the measured signal: the desired image, the noise, and an unknown slowly varying background. The inferential problem is to separate the signal from the noise and the background. We accomplish this task by employing Bayesian probability theory.
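A toy version of this separation problem can be written down directly. The data, line shape, and polynomial background below are all illustrative, not the paper's model; in the linear-Gaussian case the Bayesian estimate reduces to a joint least-squares fit of signal amplitude and background coefficients:

```python
import numpy as np

# Toy data: a sharp peak on a slowly varying background plus noise.
rng = np.random.default_rng(4)
x = np.linspace(-5, 5, 200)
peak = np.exp(-0.5 * (x / 0.4) ** 2)      # assumed known line shape
background = 0.5 + 0.08 * x               # unknown slow trend
y = 2.0 * peak + background + rng.normal(0, 0.05, x.size)

# Design matrix: peak amplitude plus a quadratic background model.
# Fitting both jointly separates the signal from the background.
A = np.column_stack([peak, np.ones_like(x), x, x ** 2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("peak amplitude:", round(coef[0], 2),
      " background at x=0:", round(coef[1], 2))
```

Because the peak is narrow and the background smooth, the two components are nearly orthogonal and the joint fit recovers both cleanly.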
Spline-Based Adaptive Resolution Image
Abstract
Bayesian probability theory allows one to infer an image given data constraints, prior knowledge, and background information. One ingredient of the background information is usually paid little attention, namely the image grid, i.e. the points on which the desired image is reconstructed. In many problems an equidistant mesh is used. The choice of the image grid can, however, strongly influence the reconstruction: if the grid is too coarse, accuracy is wasted; if it is too fine, artificial structures due to ringing and noise fitting can show up. In order to achieve the best resolution supported by the data, we include the grid in the Bayesian analysis and allow for locally varying resolution. We applied our procedure to one-dimensional problems. The image is reconstructed at the grid points and interpolated in the interstitial regions by cubic splines. The Bayesian analysis contains two competing tendencies: the data constraints tend towards a fine grid, as this reduces the misfit, while Ockham's factor favors a coarse grid to keep the image "simple". The Bayesian solution represents a trade-off between the two trends and leads to results which are significantly improved over those obtained by fixed-grid approaches: overfitting is eliminated and ringing is strongly suppressed, while sharp structures are reproduced considerably better. We applied the adaptive resolution idea to different types of problems, such as deconvolution and density estimation. For both applications we present a representative physical problem.
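The grid trade-off described here can be seen in a simple fixed-grid experiment (hypothetical data; the adaptive, Ockham-weighted grid of the paper is not reproduced): least-squares spline fits on a coarse and on a fine knot grid, where the fine grid starts fitting the noise.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Noisy samples of a smooth curve (noise std = 0.2).
rng = np.random.default_rng(2)
x = np.linspace(0, 1, 200)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, x.size)

def spline_residual_std(n_knots):
    knots = np.linspace(0, 1, n_knots)
    # Design matrix: column k is the cubic spline that interpolates
    # the unit vector e_k on the knots, evaluated at the data points.
    B = np.column_stack([CubicSpline(knots, np.eye(n_knots)[k])(x)
                         for k in range(n_knots)])
    coef, *_ = np.linalg.lstsq(B, y, rcond=None)
    return (y - B @ coef).std()

coarse, fine = spline_residual_std(6), spline_residual_std(120)
print("coarse-grid residual std:", round(coarse, 3),
      " fine-grid:", round(fine, 3))
```

The coarse grid leaves a residual near the true noise level, while the fine grid drives it well below 0.2: the misfit reduction that, in the Bayesian treatment, Ockham's factor must pay for.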
W. von der Linden, V. Dose and A. Ramaswami
Abstract
In many fields of research the following problem is encountered: a large collection of data is given for which a detailed theory is still missing. To gain insight into the underlying problem it is important to reveal the interrelationships in the data and to determine the relevant input and response quantities. A central part of this task is to find the natural splitting of the data into groups and to analyze the respective characteristics. Bayesian probability theory is invoked for a consistent treatment of these problems. Due to Ockham's razor, which is an integral part of the theory, the simplest group configuration that still fits the data has the highest probability. In addition, the Bayesian approach allows one to eliminate outliers, which otherwise could lead to erroneous conclusions. Simple textbook and mock data sets are analyzed in order to assess the Bayesian approach.
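The model-comparison idea behind "the simplest group configuration that still fits the data" can be illustrated with BIC, a large-sample stand-in for the Bayesian evidence in which Ockham's razor appears as the parameter-count penalty. The data and the fixed split point below are illustrative only:

```python
import numpy as np

# Mock 1-D data drawn from two well-separated groups.
rng = np.random.default_rng(3)
data = np.concatenate([rng.normal(0, 1, 100), rng.normal(6, 1, 100)])

def gauss_loglik(x, mu, sd):
    return np.sum(-0.5 * np.log(2 * np.pi * sd ** 2)
                  - (x - mu) ** 2 / (2 * sd ** 2))

# One group: a single Gaussian (2 parameters).
bic1 = -2 * gauss_loglik(data, data.mean(), data.std()) \
       + 2 * np.log(data.size)

# Two groups: split at the midpoint for simplicity (5 parameters:
# two means, two stds, one weight); a real analysis would fit the split.
left, right = data[data < 3], data[data >= 3]
ll2 = gauss_loglik(left, left.mean(), left.std()) \
    + gauss_loglik(right, right.mean(), right.std())
bic2 = -2 * ll2 + 5 * np.log(data.size)

print("BIC one group:", round(bic1), " two groups:", round(bic2))
```

The two-group description wins (lower BIC) despite its extra parameters, because the fit improvement outweighs the Ockham penalty; with overlapping groups the penalty would tip the balance the other way.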
The Bayesian Land Surface Temperature estimator previously
, 2005
Abstract
developed has been extended to include the effects of imperfectly known gain and offset calibration errors. It is possible to treat both gain and offset as nuisance parameters and, by integrating over an uninformative range for their magnitudes, eliminate the dependence of surface temperature and emissivity estimates upon the exact calibration error.
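Marginalizing a calibration nuisance parameter can be sketched numerically. The forward model and numbers below are purely illustrative (a linear response per band with one shared offset), not the paper's radiative model: the offset is summed out over a bounded uniform range rather than fitted.

```python
import numpy as np

# Three bands share one unknown calibration offset b.
rng = np.random.default_rng(1)
a = np.array([0.10, 0.15, 0.20])        # hypothetical band responses
T_true, b_true, noise = 300.0, 0.5, 0.05
y = a * T_true + b_true + rng.normal(0, noise, size=3)

T_grid = np.linspace(290, 310, 201)
b_grid = np.linspace(-1, 1, 201)        # assumed offset range
TT, BB = np.meshgrid(T_grid, b_grid, indexing="ij")

# Log-likelihood on the (T, b) grid, then sum over b: the marginalization
# that removes the dependence of the T estimate on the exact offset.
logL = sum(-0.5 * ((y[k] - a[k] * TT - BB) / noise) ** 2 for k in range(3))
post_T = np.exp(logL - logL.max()).sum(axis=1)
post_T /= post_T.sum()
T_mean = float((T_grid * post_T).sum())
print("posterior mean T after marginalizing the offset:", round(T_mean, 1))
```

Because the bands respond differently to T but identically to b, the band differences constrain T even though b is never estimated.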
Bayesian Estimation for Land Surface Temperature Retrieval: The Nuisance of Emissivities
, 2004
Abstract
approach to the remote sensing of land surface temperature is developed using the methods of Bayesian inference. The starting point is the maximum entropy estimate for the posterior distribution of radiance in multiple bands. In order to convert this quantity to an estimator for surface temperature and emissivity with Bayes’ theorem, it is necessary to obtain the joint prior probability for surface temperature and emissivity, given available prior knowledge. The requirement that any pair of distinct observers be able to relate their descriptions of radiance under arbitrary Lorentz transformations uniquely determines the prior probability. Perhaps surprisingly, surface temperature acts as a scale parameter, while emissivity acts as a location parameter, giving the prior probability P(T, ε | K) dT dε = const · (dT / T) dε. Given this result, it is a simple matter to construct estimators for surface temperature and emissivity. A Monte Carlo simulation of land surface temperature retrieval in selected MODIS bands is presented as an example of the utility of the approach.
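The defining property of the dT/T factor in the prior above, that a scale parameter assigns equal probability mass to intervals related by a rescaling T → cT, can be checked numerically (interval endpoints and scale factor below are arbitrary):

```python
import numpy as np

# Riemann-sum approximation of the unnormalized mass of the dT/T prior
# over an interval; analytically this is log(hi / lo), which is
# unchanged when both endpoints are rescaled by the same factor c.
def mass(lo, hi, n=1_000_000):
    T = np.linspace(lo, hi, n)
    return np.sum(1.0 / T) * (T[1] - T[0])

c, lo, hi = 3.7, 250.0, 320.0
m1, m2 = mass(lo, hi), mass(c * lo, c * hi)
print(round(m1, 4), round(m2, 4))   # both approximate log(320/250)
```

The flat factor in ε has the analogous property under shifts ε → ε + d, which is exactly the location-parameter behavior the abstract attributes to emissivity.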