Results 1–10 of 43
Statistical Methods for Eliciting Probability Distributions
 Journal of the American Statistical Association
, 2005
Cited by 39 (2 self)

Abstract
Elicitation is a key task for subjectivist Bayesians. While skeptics hold that it cannot (or perhaps should not) be done, in practice it brings statisticians closer to their clients and subject-matter-expert colleagues. This paper reviews the state of the art, reflecting the experience of statisticians informed by the fruits of a long line of psychological research into how people represent uncertain information cognitively, and how they respond to questions about that information. In a discussion of the elicitation process, the first issue to address is what it means for an elicitation to be successful, i.e. what criteria should be employed. Our answer is that a successful elicitation faithfully represents the opinion of the person being elicited. It is not necessarily “true” in some objectivistic sense, and cannot be judged that way. We see elicitation as simply part of the process of statistical modeling. Indeed, in a hierarchical model it is ambiguous at which point the likelihood ends and the prior begins. Thus the same kinds of judgment that inform statistical modeling in general also inform elicitation of prior distributions.
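One common elicitation device (illustrative here, not specific to this paper) converts an expert's stated mean and "equivalent sample size" into a conjugate Beta prior for a probability parameter; the function names and numbers below are invented for the sketch.

```python
# Sketch: turning an elicited mean and "equivalent sample size" into a
# Beta(alpha, beta) prior. The mean/sample-size parameterization is one
# common elicitation device; the paper surveys many others.

def elicit_beta_prior(mean, equivalent_n):
    """Return (alpha, beta) so the Beta prior has the elicited mean and
    carries weight equal to `equivalent_n` pseudo-observations."""
    if not 0.0 < mean < 1.0 or equivalent_n <= 0:
        raise ValueError("mean must be in (0, 1) and equivalent_n positive")
    alpha = mean * equivalent_n
    beta = (1.0 - mean) * equivalent_n
    return alpha, beta

def posterior(alpha, beta, successes, failures):
    """Conjugate update: add observed counts to the prior pseudo-counts."""
    return alpha + successes, beta + failures

# Expert says: "I think the rate is about 10%, and I'm roughly as sure as
# if I had observed 20 trials."
a, b = elicit_beta_prior(0.10, 20)       # Beta(2, 18)
a_post, b_post = posterior(a, b, 3, 7)   # after seeing 3 successes in 10
posterior_mean = a_post / (a_post + b_post)
```

The conjugate update makes the "pseudo-observation" interpretation of the elicited prior explicit: the expert's opinion enters on the same footing as 20 data points.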
Imprecision in Engineering Design
 ASME Journal of Mechanical Design
, 1995
Cited by 37 (6 self)

Abstract
Methods for incorporating imprecision in engineering design decision-making are briefly reviewed and compared. A tutorial is presented on the Method of Imprecision (MoI), a formal method, based on the mathematics of fuzzy sets, for representing and manipulating imprecision in engineering design. The results of a design cost estimation example, utilizing a new informal cost specification, are presented. The MoI can provide formal information upon which to base decisions during preliminary engineering design and can facilitate set-based concurrent design.
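The fuzzy-set machinery behind methods like the MoI can be sketched with triangular membership functions and min-aggregation; the design variables, the toy cost model, and the aggregation choice below are illustrative assumptions, not the paper's full method.

```python
# Sketch: designer preferences as fuzzy membership functions, combined by
# min-aggregation. Triangular memberships and the linear cost model are
# illustrative choices only.

def triangular(a, peak, b):
    """Membership function that is 1 at `peak` and falls to 0 at a and b."""
    def mu(x):
        if x <= a or x >= b:
            return 0.0
        if x <= peak:
            return (x - a) / (peak - a)
        return (b - x) / (b - peak)
    return mu

# Preference on wall thickness (mm) and on cost ($), both driven by a
# single design variable t in this toy model.
pref_thickness = triangular(1.0, 3.0, 6.0)
pref_cost = triangular(0.0, 10.0, 40.0)

def overall(t):
    cost = 5.0 * t                 # toy cost model: cost grows with thickness
    # min-aggregation: a design is only as acceptable as its worst aspect
    return min(pref_thickness(t), pref_cost(cost))

# Pick the preferred design on a coarse grid of candidate thicknesses
best_t = max((t / 10.0 for t in range(10, 61)), key=overall)
```

The min operator is the conservative (non-compensating) aggregation; trade-off-permitting aggregations (e.g. weighted means) are the other family the MoI literature considers.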
Measures of agreement between computation and experiment: Validation metrics
, 2006
Cited by 16 (2 self)

Abstract
With the increasing role of computational modeling in engineering design, performance estimation, and safety assessment, improved methods are needed for comparing computational results and experimental measurements. Traditional methods of graphically comparing computational and experimental results, though valuable, are essentially qualitative. Computable measures are needed that can quantitatively compare computational and experimental results over a range of input, or control, variables to sharpen assessment of computational accuracy. This type of measure has recently been referred to as a validation metric. We discuss various features that we believe should be incorporated in a validation metric, as well as features that we believe should be excluded. We develop a new validation metric that is based on the statistical concept of confidence intervals. Using this fundamental concept, we construct two specific metrics: one that requires interpolation of experimental data and one that requires regression (curve fitting) of experimental data. We apply the metrics to three example problems: thermal decomposition of a polyurethane foam, a turbulent buoyant plume of helium, and compressibility effects on the growth rate of a turbulent free-shear layer. We discuss how the present metrics are easily interpretable for assessing computational model accuracy, as well as the impact of experimental measurement uncertainty on the accuracy assessment.
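The confidence-interval idea can be sketched at a single measurement condition: report the estimated model error together with a half-width reflecting experimental uncertainty. The data, the signed-error definition, and the tabulated t-value below are illustrative assumptions, not the paper's metric.

```python
# Sketch of a confidence-interval-style validation comparison: model
# prediction vs. repeated experimental measurements at one condition.
from statistics import mean, stdev

def validation_metric(y_model, y_exp_replicates, t_crit):
    """Return (estimated model error, confidence half-width).

    y_exp_replicates: repeated measurements of the same quantity.
    t_crit: two-sided Student-t critical value for n-1 degrees of freedom,
            taken from a table (e.g. 2.776 for 4 dof at 95%).
    """
    n = len(y_exp_replicates)
    y_bar = mean(y_exp_replicates)
    err = y_model - y_bar                        # signed model error estimate
    half_width = t_crit * stdev(y_exp_replicates) / n ** 0.5
    return err, half_width

# Model predicts 101.0; five replicate measurements of the same quantity:
err, hw = validation_metric(101.0, [98.9, 100.2, 99.5, 100.8, 99.6], 2.776)
# The model error is statistically resolvable only if it exceeds the
# experimental uncertainty band:
resolvable = abs(err) > hw
```

A graphical overlay would show the same data; the point of a metric is that `err` and `hw` are computable numbers that can be tracked across the range of control variables.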
Minimizing Information Overload: The Ranking of Electronic Messages
 Journal of Information Science
, 1989
Cited by 13 (1 self)

Abstract
The decision to examine a message at a particular point in time should be made rationally and economically if the message recipient is to operate efficiently.
Strategies for Revising Judgment: How (and How Well) People Use Others’ Opinions
Cited by 10 (1 self)

Abstract
A basic issue in social influence is how best to change one’s judgment in response to learning the opinions of others. This article examines the strategies that people use to revise their quantitative estimates on the basis of the estimates of another person. The authors note that people tend to use two basic strategies when revising estimates: choosing between the two estimates and averaging them. The authors developed the probability, accuracy, redundancy (PAR) model to examine the relative effectiveness of these two strategies across judgment environments. A surprising result was that averaging was the more effective strategy across a wide range of commonly encountered environments. The authors observed that despite this finding, people tend to favor the choosing strategy. Most participants in these studies would have achieved greater accuracy had they always averaged. The identification of intuitive strategies, along with a formal analysis of when they are accurate, provides a basis for examining how effectively people use the judgments of others. Although a portfolio of strategies that includes averaging and choosing can be highly effective, the authors argue that people are not generally well adapted to the environment in terms of strategy selection.
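The averaging advantage is easy to reproduce in a Monte-Carlo sketch: two unbiased judges with independent noise, where "choosing" is stood in for by picking one judge's estimate at random (a crude proxy for imperfectly identifying the better judge). All parameters are invented for illustration and are not the PAR model itself.

```python
# Monte-Carlo sketch of averaging vs. choosing between two noisy estimates.
import random

random.seed(0)
truth = 100.0
n_trials = 20_000
err_choose = err_average = 0.0
for _ in range(n_trials):
    e1 = truth + random.gauss(0, 10)    # judge 1's estimate
    e2 = truth + random.gauss(0, 10)    # judge 2, independent noise
    chosen = random.choice([e1, e2])    # stand-in for imperfect choosing
    err_choose += abs(chosen - truth)
    err_average += abs((e1 + e2) / 2 - truth)
err_choose /= n_trials
err_average /= n_trials
# With independent, equally accurate judges, averaging shrinks the noise
# standard deviation by sqrt(2), so its mean absolute error is lower.
```

The interesting part of the PAR analysis is that averaging keeps winning even when the judges differ in accuracy and choosers pick the better judge more often than chance, unless accuracy differences are large or errors are highly redundant.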
Detection, measurement and gravitational radiation
 Phys. Rev.
, 1992
Cited by 4 (0 self)

Abstract
The optimum design, construction, and use of the LIGO, VIRGO, or LAGOS gravitational radiation detectors depends upon accurate calculations of their sensitivity to different sources of radiation. Here I examine how to determine the sensitivity of these instruments to sources of gravitational radiation by considering the process by which data are analyzed in a noisy detector. The problem of detection (is a signal present in the output of the detector?) is separated from that of measurement (what are the parameters that characterize the signal in the detector output?). By constructing the probability that the detector output is consistent with the presence of a signal, I show how to quantify the uncertainty that the output contains a signal and is not simply noise. Proceeding further, I construct the probability distribution that the parameterization µ that characterizes the signal has a certain value. From the distribution and its mode I determine volumes V(P) in parameter space such that µ ∈ V(P) with probability P [owing to the random nature of the detector noise, the volumes V(P) are always different, even for identical signals in the detector output], thus quantifying the uncertainty in the estimation of the signal parameterization. These techniques are suitable for analyzing the output of a noisy detector. If we are designing a detector, or determining the suitability of an existing detector for observing a new source, then we don’t have detector output to analyze but are interested in the “most likely” response of the detector to a signal. I exploit the techniques just described to determine the “most likely” volumes V(P) for detector output that would result in a parameter probability distribution with given mode. Finally, as an example, I apply these techniques to determine the anticipated sensitivity of the LIGO and LAGOS detectors to the gravitational radiation from a perturbed Kerr black hole.
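The detection step can be sketched in its simplest form: for a known signal in additive white Gaussian noise, the log-likelihood ratio between "signal present" and "noise only" reduces to a matched filter. The signal shape, noise level, and data lengths below are illustrative, not the paper's detector models.

```python
# Sketch: matched-filter log-likelihood ratio for a known signal s in
# independent Gaussian noise of standard deviation sigma.
import math
import random

def log_likelihood_ratio(data, signal, sigma):
    """log [ p(data | signal present) / p(data | noise only) ]."""
    dot = sum(d * s for d, s in zip(data, signal))
    energy = sum(s * s for s in signal)
    return (dot - 0.5 * energy) / sigma ** 2

random.seed(1)
sigma = 1.0
signal = [math.sin(0.3 * k) for k in range(200)]
noise_only = [random.gauss(0, sigma) for _ in range(200)]
with_signal = [n + s for n, s in zip(noise_only, signal)]

llr_noise = log_likelihood_ratio(noise_only, signal, sigma)
llr_signal = log_likelihood_ratio(with_signal, signal, sigma)
# The statistic is strongly positive when the signal is present and
# negative (on average) for pure noise, so a threshold on it quantifies
# the detection decision.
```

The measurement step the abstract describes goes further: the same likelihood is evaluated over a grid of signal parameterizations µ to build the posterior distribution whose mode and volumes V(P) summarize the estimate.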
Genome-scale protein function prediction in yeast Saccharomyces cerevisiae through integrating multiple sources of high-throughput data
 Pac Symp Biocomput
, 2005
Cited by 4 (0 self)

Abstract
As we move into the post-genome-sequencing era, various high-throughput experimental techniques have been developed to characterize biological systems at the genome scale. Discovering new biological knowledge from high-throughput biological data is a major challenge for bioinformatics today. To address this challenge, we developed a Bayesian statistical method together with a Boltzmann machine and simulated annealing for protein function prediction in the yeast Saccharomyces cerevisiae through integrating various high-throughput biological data, including protein binary interactions, protein complexes, and microarray gene expression profiles. In our approach, we quantified the relationship between functional similarity and high-throughput data. Based on our method, 1802 out of 2280 unannotated proteins in the yeast were assigned functions systematically. The related computer package is available upon request.
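The guilt-by-association idea behind this kind of approach can be sketched as simulated annealing on a toy interaction graph: unannotated proteins receive the labels that minimize disagreement with their interaction partners. The tiny graph, the energy function, and the cooling schedule are all invented stand-ins for the paper's Boltzmann-machine formulation.

```python
# Sketch: label assignment on an interaction network by simulated annealing.
import math
import random

random.seed(3)
edges = [("a", "b"), ("b", "c"), ("c", "d"), ("d", "e"), ("a", "c")]
known = {"a": 1, "e": 0}                 # annotated proteins (fixed labels)
unknown = ["b", "c", "d"]                # proteins to be assigned

def energy(labels):
    """Number of interacting pairs with different labels."""
    return sum(labels[u] != labels[v] for u, v in edges)

labels = dict(known, **{p: random.randint(0, 1) for p in unknown})
temperature = 2.0
for _ in range(2000):
    p = random.choice(unknown)
    e_old = energy(labels)
    labels[p] ^= 1                       # propose flipping one label
    delta = energy(labels) - e_old
    if delta > 0 and random.random() >= math.exp(-delta / temperature):
        labels[p] ^= 1                   # reject the uphill move
    temperature *= 0.999                 # cooling schedule

# Finish with a greedy pass so the final labeling is a local optimum
improved = True
while improved:
    improved = False
    for p in unknown:
        before = energy(labels)
        labels[p] ^= 1
        if energy(labels) < before:
            improved = True
        else:
            labels[p] ^= 1
```

The paper's Bayesian component additionally weights each data source (interactions, complexes, expression) by how predictive it is of functional similarity; the uniform edge weights here are the simplification.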
Probability and Measurement Uncertainty in Physics – a Bayesian Primer
, 1995
Cited by 3 (0 self)

Abstract
Bayesian statistics is based on the subjective definition of probability as "degree of belief" and on Bayes' theorem, the basic tool for assigning probabilities to hypotheses by combining a priori judgements and experimental information. This was the original point of view of Bayes, Bernoulli, Gauss, Laplace, etc., and contrasts with later "conventional" (pseudo-)definitions of probabilities, which implicitly presuppose the concept of probability. These notes show that the Bayesian approach is the natural one for data analysis in the most general sense, and for assigning uncertainties to the results of physical measurements, while at the same time resolving philosophical aspects of the problems. The approach, although little known and usually misunderstood among the High Energy Physics community, has become the standard way of reasoning in several fields of research and has recently been adopted by the international metrology organizations in their recommendations for asses...
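The degree-of-belief updating that Bayes' theorem performs can be shown in a minimal numerical example; the detector efficiencies and the 1% prior below are invented for illustration.

```python
# Minimal Bayes'-theorem update of a degree of belief in a hypothesis H.

def bayes_update(prior, p_data_given_h, p_data_given_not_h):
    """Posterior P(H | data) from P(H) and the two likelihoods."""
    num = p_data_given_h * prior
    return num / (num + p_data_given_not_h * (1.0 - prior))

# Prior belief that a rare signal is present: 1%. The detector fires for
# 95% of true signals and 5% of the time otherwise.
posterior = bayes_update(0.01, 0.95, 0.05)   # ~0.16: still probably noise

# Beliefs are updated sequentially as evidence accumulates: yesterday's
# posterior is today's prior.
posterior2 = bayes_update(posterior, 0.95, 0.05)
```

The example illustrates the primer's central point: the prior matters (a positive result for a rare effect is still probably a false alarm), and repeated independent evidence is what drives the belief toward certainty.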
Application of Bayesian Filters to Heat Conduction Problem
 EngOpt 2008 – International Conference on Engineering Optimization (ed: Herskovitz)
, 2008
Cited by 3 (3 self)

Abstract
In this paper we present a general description of state estimation problems within the Bayesian framework. State estimation problems are addressed in which evolution and measurement stochastic models are used to predict the dynamic behavior of physical systems. Specifically, the application of two Bayesian filters to linear and nonlinear unsteady heat conduction problems is demonstrated: (a) the Kalman filter, and (b) the particle filter with the sampling importance resampling algorithm. Limitations of the filtering methodologies used in this work are presented, involving different probability … State estimation problems, also designated as non-stationary inverse problems [1], are of great interest in innumerable practical applications. In such kinds of problems, the available measured data are used together with prior knowledge about the physical phenomena and the measuring devices, in order to sequentially produce estimates of the …
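For the linear case, the predict/correct cycle of the Kalman filter can be sketched on a scalar toy system, a single temperature state relaxing toward ambient as a crude stand-in for a heat conduction model. The transition constant, noise variances, and initial guess are all illustrative assumptions.

```python
# Sketch: scalar Kalman filter on a toy cooling model T_{k+1} = a*T_k + w.
import random

random.seed(2)
a = 0.95          # state transition: temperature decays toward 0 (ambient)
q, r = 0.01, 4.0  # process and measurement noise variances

# Simulate the "true" system and noisy measurements
truth, measurements = [], []
T = 100.0
for _ in range(50):
    T = a * T + random.gauss(0, q ** 0.5)
    truth.append(T)
    measurements.append(T + random.gauss(0, r ** 0.5))

# Kalman filter: predict from the evolution model, correct with each datum
x, p = 80.0, 100.0        # deliberately poor initial guess, large variance
estimates = []
for z in measurements:
    x, p = a * x, a * a * p + q            # prediction step
    k = p / (p + r)                        # Kalman gain
    x, p = x + k * (z - x), (1 - k) * p    # update step
    estimates.append(x)

rmse_filter = (sum((e - t) ** 2 for e, t in zip(estimates, truth)) / 50) ** 0.5
rmse_raw = (sum((z - t) ** 2 for z, t in zip(measurements, truth)) / 50) ** 0.5
# Once the gain settles, the filtered estimate tracks the true state more
# closely than the raw measurements do.
```

The particle filter mentioned in the abstract replaces the Gaussian state density with weighted samples, which is what lets it handle the nonlinear heat conduction cases where this closed-form update no longer applies.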
Learning and Diagnosis in Manufacturing Processes Through an Executable Bayesian Network
 13th International Conference on Industrial and Engineering Applications of Artificial Intelligence and Expert Systems (IEA/AIE 2000)
Cited by 1 (0 self)

Abstract
In this paper we present a novel approach to modelling a manufacturing process that allows one to learn about causal mechanisms of manufacturing defects through a Process Modelling and Executable Bayesian Network (PMEBN). The method combines probabilistic reasoning with time-dependent parameters, which are of crucial interest to quality control in manufacturing environments. We demonstrate the concept through a case study of a caravan manufacturing line using inspection data.
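The probabilistic-reasoning component of a diagnostic Bayesian network can be sketched with a three-node example and inference by enumeration; the network structure (Cause → Defect ← Shift) and all probabilities below are invented for illustration and are not the PMEBN itself.

```python
# Sketch: diagnostic query P(cause | defect) in a tiny Bayesian network,
# computed by enumeration (summing the joint over the hidden variable).
from itertools import product

p_cause = {True: 0.1, False: 0.9}            # faulty fixture?
p_shift = {True: 0.3, False: 0.7}            # produced on the night shift?
# P(defect | cause, shift)
p_defect = {(True, True): 0.9, (True, False): 0.7,
            (False, True): 0.2, (False, False): 0.05}

def joint(cause, shift, defect):
    """Full joint probability factored along the network structure."""
    p = p_cause[cause] * p_shift[shift]
    pd = p_defect[(cause, shift)]
    return p * (pd if defect else 1.0 - pd)

# Diagnostic query: P(cause = True | defect observed), summing out shift
num = sum(joint(True, s, True) for s in (True, False))
den = sum(joint(c, s, True) for c, s in product((True, False), repeat=2))
posterior_cause = num / den
```

Observing a defect raises belief in the faulty fixture from 10% to roughly 47% here; in the paper's setting the conditional probabilities would additionally be time-dependent and learned from inspection data.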