Results 1 - 2 of 2
Bayesian Interpolation
Neural Computation, 1991
"... Although Bayesian analysis has been in use since Laplace, the Bayesian method of modelcomparison has only recently been developed in depth. In this paper, the Bayesian approach to regularisation and modelcomparison is demonstrated by studying the inference problem of interpolating noisy data. T ..."
Abstract

Cited by 520 (18 self)
Although Bayesian analysis has been in use since Laplace, the Bayesian method of model comparison has only recently been developed in depth. In this paper, the Bayesian approach to regularisation and model comparison is demonstrated by studying the inference problem of interpolating noisy data. The concepts and methods described are quite general and can be applied to many other problems. Regularising constants are set by examining their posterior probability distribution. Alternative regularisers (priors) and alternative basis sets are objectively compared by evaluating the evidence for them. `Occam's razor' is automatically embodied by this framework. The way in which Bayes infers the values of regularising constants and noise levels has an elegant interpretation in terms of the effective number of parameters determined by the data set. This framework is due to Gull and Skilling.
1 Data modelling and Occam's razor
In science, a central task is to develop and compare models to a...
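The evidence framework sketched in this abstract can be illustrated with a short numerical example. The snippet below is a minimal sketch, not the paper's implementation: it computes the log marginal likelihood (evidence) of a linear-in-parameters model `y = Phi @ w + noise` with a Gaussian prior `w ~ N(0, alpha⁻¹ I)` and noise precision `beta`, then compares polynomial basis sets on synthetic noisy data. The function name, the test data, and the fixed values of `alpha` and `beta` are all illustrative assumptions.

```python
import numpy as np

def log_evidence(Phi, y, alpha, beta):
    """Log marginal likelihood of y = Phi @ w + noise, with prior
    w ~ N(0, alpha^-1 I) and Gaussian noise of precision beta.
    alpha plays the role of the regularising constant."""
    N, M = Phi.shape
    A = alpha * np.eye(M) + beta * Phi.T @ Phi   # posterior precision of w
    m = beta * np.linalg.solve(A, Phi.T @ y)     # posterior mean of w
    resid = y - Phi @ m
    _, logdetA = np.linalg.slogdet(A)
    return (0.5 * M * np.log(alpha) + 0.5 * N * np.log(beta)
            - 0.5 * beta * resid @ resid - 0.5 * alpha * m @ m
            - 0.5 * logdetA - 0.5 * N * np.log(2 * np.pi))

# Compare polynomial basis sets on noisy samples of a cubic.
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 30)
y = x**3 - 0.5 * x + rng.normal(0.0, 0.05, x.size)
for degree in (1, 3, 9):
    Phi = np.vander(x, degree + 1, increasing=True)
    print(degree, log_evidence(Phi, y, alpha=1.0, beta=1 / 0.05**2))
```

A model too simple to fit the data (degree 1 here) earns low evidence through its large misfit term, while an over-flexible model pays through the Occam factor in the determinant term; this is the automatic embodiment of Occam's razor the abstract refers to.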
Integrated Surface Model Optimization for Freehand
"... Abstract—The major obstacle of threedimensional (3D) echocardiography is that the ultrasound image quality is too low to reliably detect features locally. Almost all available surfacefinding algorithms depend on decent quality boundaries to get satisfactory surface models. We formulate the surface ..."
Abstract
Abstract—The major obstacle of three-dimensional (3-D) echocardiography is that the ultrasound image quality is too low to reliably detect features locally. Almost all available surface-finding algorithms depend on decent-quality boundaries to get satisfactory surface models. We formulate the surface model optimization problem in a Bayesian framework, such that the inference made about a surface model is based on the integration of both the low-level image evidence and the high-level prior shape knowledge through a pixel class prediction mechanism. We model the probability of pixel classes instead of making explicit decisions about them. Therefore, we avoid the unreliable edge detection or image segmentation problem and the pixel correspondence problem. An optimal surface model best explains the observed images, in that the posterior probability of the surface model given the observed images is maximized. The pixel feature vector, as the image evidence, includes several parameters such as the smoothed grayscale value and the minimal second directional derivative. Statistically, we describe the feature vector by the pixel appearance probability model obtained by a nonparametric optimal quantization technique. Qualitatively, we display the imaging-plane intersections of the optimized surface models together with those of the ground-truth surfaces reconstructed from manual delineations. Quantitatively, we measure the projection distance error between the optimized and the ground-truth surfaces. In our experiment, we use 20 studies to obtain the probability models offline. The prior shape knowledge is represented by a catalog of 86 left-ventricle surface models. In another set of 25 test studies, the average epicardial and endocardial surface projection distance errors are 3.2 ± 0.85 mm and 2.6 ± 0.78 mm, respectively.
Index Terms—Echocardiography, image analysis, image shape analysis, surface reconstruction.
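The MAP selection described in this abstract can be sketched in a few lines. The toy code below is an illustrative assumption, not the authors' method: it stands in a random matrix of per-pixel class probabilities for the learned pixel-appearance model, a random class assignment per candidate surface for the shape catalog, and a uniform shape prior, then picks the candidate maximizing the log posterior. All sizes and variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
n_pixels, n_classes, n_models = 200, 3, 5

# Stand-in for the pixel-appearance probability model:
# p(class | pixel evidence), one row per pixel, rows sum to 1.
pixel_probs = rng.dirichlet(np.ones(n_classes), size=n_pixels)

# Stand-in for the shape catalog: each candidate surface predicts
# one class label at every pixel.
model_classes = rng.integers(0, n_classes, size=(n_models, n_pixels))

# Uniform prior over catalog shapes.
log_prior = np.log(np.full(n_models, 1.0 / n_models))

# log posterior (up to a constant) = log prior
#   + sum over pixels of log p(predicted class | pixel evidence)
log_lik = np.array([
    np.log(pixel_probs[np.arange(n_pixels), model_classes[m]]).sum()
    for m in range(n_models)
])
log_post = log_prior + log_lik

best = int(np.argmax(log_post))
print("MAP surface model index:", best)
```

Working with class probabilities rather than hard labels is what lets this scheme sidestep explicit edge detection: no pixel is ever committed to a single class before the surface-level inference.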