Results 1–10 of 25
A Discriminative Framework for Detecting Remote Protein Homologies
, 1999
"... A new method for detecting remote protein homologies is introduced and shown to perform well in classifying protein domains by SCOP superfamily. The method is a variant of support vector machines using a new kernel function. The kernel function is derived from a generative statistical model for a ..."
Abstract

Cited by 193 (4 self)
A new method for detecting remote protein homologies is introduced and shown to perform well in classifying protein domains by SCOP superfamily. The method is a variant of support vector machines using a new kernel function. The kernel function is derived from a generative statistical model for a protein family, in this case a hidden Markov model. This general approach of combining generative models like HMMs with discriminative methods such as support vector machines may have applications in other areas of biosequence analysis as well.
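The construction this abstract describes — take the gradient of a generative model's log-likelihood with respect to its parameters (the Fisher score) as a feature vector, and compare two sequences by an inner product of those gradients — can be sketched as follows. The toy position-independent multinomial model, four-letter alphabet, and finite-difference gradient are illustrative stand-ins for the paper's per-family profile HMM, not its actual implementation:

```python
import numpy as np

# Toy stand-in for a profile HMM: a position-independent multinomial
# over a four-letter alphabet; theta are the generative parameters.
ALPHABET = "ACGT"

def log_likelihood(seq, theta):
    idx = [ALPHABET.index(c) for c in seq]
    return float(np.sum(np.log(theta[idx])))

def fisher_score(seq, theta, eps=1e-6):
    """Gradient of log p(seq | theta) w.r.t. theta -- the Fisher score U_x,
    computed by central finite differences for brevity."""
    grad = np.zeros_like(theta)
    for i in range(len(theta)):
        up, down = theta.copy(), theta.copy()
        up[i] += eps
        down[i] -= eps
        grad[i] = (log_likelihood(seq, up) - log_likelihood(seq, down)) / (2 * eps)
    return grad

def fisher_kernel(x, y, theta):
    """Inner product of Fisher scores; such a kernel can then be used
    inside a support vector machine."""
    return float(fisher_score(x, theta) @ fisher_score(y, theta))

theta = np.array([0.4, 0.3, 0.2, 0.1])
k = fisher_kernel("ACCA", "ACGA", theta)
```

The kernel is symmetric by construction, since it is an inner product of per-sequence score vectors.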
Expert conciliation for multi modal person authentication systems by Bayesian statistics
, 1997
"... We present an algorithm functioning as a supervisor module in a multi expert decision making machine. It uses the Bayes theory in order to estimate the biases of individual expert opinions. These are then used to calibrate and conciliate expert opinions to one opinion. We present a framework for ..."
Abstract

Cited by 58 (14 self)
We present an algorithm functioning as a supervisor module in a multi-expert decision-making machine. It uses Bayesian theory to estimate the biases of individual expert opinions; these are then used to calibrate and conciliate the expert opinions into a single opinion. We present a framework for simulating decision strategies using expert opinions whose properties are easily modifiable. Using real data from a person authentication system based on image and speech data, we were able to confirm that the proposed supervisor improves the quality of individual expert decisions, reaching success rates of 99.5%.
Statistical Methods for Eliciting Probability Distributions
 Journal of the American Statistical Association
, 2005
"... Elicitation is a key task for subjectivist Bayesians. While skeptics hold that it cannot (or perhaps should not) be done, in practice it brings statisticians closer to their clients and subjectmatterexpert colleagues. This paper reviews the stateoftheart, reflecting the experience of statisticia ..."
Abstract

Cited by 32 (1 self)
Elicitation is a key task for subjectivist Bayesians. While skeptics hold that it cannot (or perhaps should not) be done, in practice it brings statisticians closer to their clients and subject-matter-expert colleagues. This paper reviews the state of the art, reflecting the experience of statisticians informed by the fruits of a long line of psychological research into how people represent uncertain information cognitively, and how they respond to questions about that information. In a discussion of the elicitation process, the first issue to address is what it means for an elicitation to be successful, i.e. what criteria should be employed? Our answer is that a successful elicitation faithfully represents the opinion of the person being elicited. It is not necessarily “true” in some objectivistic sense, and cannot be judged that way. We see elicitation as simply part of the process of statistical modeling. Indeed, in a hierarchical model it is ambiguous at which point the likelihood ends and the prior begins. Thus the same kinds of judgment that inform statistical modeling in general also inform elicitation of prior distributions.
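One concrete device from this literature is to fit a parametric prior to a few elicited summaries rather than asking an expert for a full distribution. As a minimal sketch, assuming the expert supplies a median and quartiles and that a normal shape is acceptable (both assumptions for illustration; the paper surveys many alternatives):

```python
from statistics import NormalDist

def normal_from_quartiles(q25, median, q75):
    """Fit a normal prior to an expert's elicited median and quartiles.
    Quartiles sit z75 standard deviations either side of the median."""
    z75 = NormalDist().inv_cdf(0.75)          # ~0.6745
    sigma = (q75 - q25) / (2 * z75)
    return NormalDist(mu=median, sigma=sigma)

# Hypothetical elicited values: lower quartile 8, median 10, upper quartile 12.
expert = normal_from_quartiles(8.0, 10.0, 12.0)
```

By construction the fitted distribution reproduces the elicited quartiles exactly, which is one simple check of elicitation fidelity.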
Methods of combining multiple classifiers with different features and their applications to text-independent speaker identification
 Journal of Pattern Recognition and Artificial Intelligence
, 1997
"... In practical applications of pattern recognition, there are often different features extracted from raw data which needs recognizing. Methods of combining multiple classifiers with different features are viewed as a general problem in various application areas of pattern recognition. In this paper, ..."
Abstract

Cited by 27 (5 self)
In practical applications of pattern recognition, there are often different features extracted from the raw data that needs recognizing. Combining multiple classifiers with different features is viewed as a general problem across application areas of pattern recognition. In this paper, a systematic investigation has been made and possible solutions are classified into three frameworks: linear opinion pools, winner-take-all, and evidential reasoning. For combining multiple classifiers with different features, a novel method is presented in the framework of linear opinion pools, and a modified training algorithm for an associative switch is also proposed in the framework of winner-take-all. In the framework of evidential reasoning, several typical methods are briefly reviewed. All of these methods have been applied to text-independent speaker identification. The simulations show that the results yielded by the methods described in this paper are better not only than those of the individual classifiers but also than those obtained by combining multiple classifiers with the same feature. This indicates that combining multiple classifiers with different features is an effective way to attack the problem of text-independent speaker identification.
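The first of the three frameworks the abstract names, the linear opinion pool, combines the per-classifier posterior distributions by a weighted average. A minimal sketch (the weights and the three-speaker example are illustrative; the paper's novel method learns the combination rather than fixing it):

```python
import numpy as np

def linear_opinion_pool(posteriors, weights=None):
    """Weighted average of per-classifier posteriors P_i(class | x).
    `posteriors` has shape (n_classifiers, n_classes)."""
    P = np.asarray(posteriors, dtype=float)
    n = len(P)
    w = np.full(n, 1.0 / n) if weights is None else np.asarray(weights, dtype=float)
    pooled = w @ P
    return pooled / pooled.sum()              # renormalise against rounding

# Three classifiers (e.g. trained on different features) scoring 3 speakers:
pooled = linear_opinion_pool([[0.7, 0.2, 0.1],
                              [0.5, 0.3, 0.2],
                              [0.6, 0.3, 0.1]])
winner = int(np.argmax(pooled))
```

Winner-take-all, by contrast, would select a single classifier's opinion per input rather than averaging them.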
Set-Based Bayesianism
, 1992
"... . Problems for strict and convex Bayesianism are discussed. A setbased Bayesianism generalizing convex Bayesianism and intervalism is proposed. This approach abandons not only the strict Bayesian requirement of a unique realvalued probability function in any decisionmaking context but also the re ..."
Abstract

Cited by 26 (1 self)
Problems for strict and convex Bayesianism are discussed. A set-based Bayesianism generalizing convex Bayesianism and intervalism is proposed. This approach abandons not only the strict Bayesian requirement of a unique real-valued probability function in any decision-making context but also the requirement of convexity for a set-based representation of uncertainty. Levi's E-admissibility decision criterion is retained and is shown to be applicable in the non-convex case. Keywords: uncertainty, decision-making, maximum entropy, Bayesian methods.
1. Introduction. The reigning philosophy of uncertainty representation is strict Bayesianism. One of its central principles is that an agent must adopt a single, real-valued probability function over the events recognized as relevant to a given problem. Prescriptions for defining such a function for a given agent in a given situation range from the extreme personalism of de Finetti (1964, 1974) and Savage (1972) to the objective Bayesianism of...
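Levi's E-admissibility criterion, which the abstract says carries over to the non-convex case, is easy to state operationally: an action is E-admissible if and only if it maximizes expected utility under at least one probability function in the (here finite, non-convex) set. A sketch with a hypothetical two-state decision problem:

```python
def e_admissible(actions, prob_set):
    """E-admissible actions for a set-based representation of uncertainty.
    `actions`: {name: [utility in each state]}; `prob_set`: probability
    vectors over the states (need not be a convex set)."""
    admissible = set()
    for p in prob_set:
        eu = {a: sum(pi * ui for pi, ui in zip(p, u)) for a, u in actions.items()}
        best = max(eu.values())
        admissible |= {a for a, v in eu.items() if abs(v - best) < 1e-12}
    return admissible

acts = {"safe": [1.0, 1.0], "risky": [3.0, 0.0]}
creds = [(0.2, 0.8), (0.8, 0.2)]     # a non-convex, two-element credal set
result = e_admissible(acts, creds)   # both actions are E-admissible here
```

Note that a strict Bayesian, forced to pick a single probability function, would admit only one of the two actions.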
Inference for Deterministic Simulation Models: The Bayesian Melding Approach
 Journal of the American Statistical Association
, 2000
"... Deterministic simulation models are used in many areas of science, engineering and policymaking. Typically, they are complex models that attempt to capture underlying mechanisms in considerable detail, and they have many userspecified inputs. The inputs are often specified by some form of trialan ..."
Abstract

Cited by 25 (4 self)
Deterministic simulation models are used in many areas of science, engineering, and policymaking. Typically, they are complex models that attempt to capture underlying mechanisms in considerable detail, and they have many user-specified inputs. The inputs are often specified by some form of trial-and-error approach in which plausible values are postulated, the corresponding outputs inspected, and the inputs modified until plausible outputs are obtained. Here we address the issue of more formal inference for such models. Raftery et al. (1995a) proposed the Bayesian synthesis approach, in which the available information about both inputs and outputs was encoded in a probability distribution and inference was made by restricting this distribution to the submanifold specified by the model. Wolpert (1995) showed that this is subject to the Borel paradox, according to which the results can depend on the parameterization of the model. We show that this dependence is due to the presence of a prior on the outputs. We propose a modified approach, called Bayesian melding, which takes full account of information and uncertainty about both inputs and outputs to the model, while avoiding the Borel paradox. This is done by recognizing the existence of two priors, one implicit and one explicit, on each input and output; these are combined via logarithmic pooling. Bayesian melding is then...
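The logarithmic pooling step that melding relies on is pooled(x) ∝ Π_i q_i(x)^α_i with Σα_i = 1. A minimal sketch on a grid, using two hypothetical normal priors standing in for the implicit and explicit output priors (the weights and densities are illustrative, not the paper's):

```python
import numpy as np

def log_pool(x, densities, alpha):
    """Logarithmic pooling of densities tabulated on a common grid x:
    pooled(x) proportional to the weighted geometric mean of the q_i."""
    logq = np.log(np.asarray(densities, dtype=float))   # (k, len(x))
    pooled = np.exp(np.asarray(alpha, dtype=float) @ logq)
    dx = x[1] - x[0]
    return pooled / (pooled.sum() * dx)                 # normalise on the grid

x = np.linspace(-6.0, 6.0, 4001)
q1 = np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)           # "implicit" prior, N(0, 1)
q2 = np.exp(-0.5 * (x - 1.0)**2) / np.sqrt(2 * np.pi)   # "explicit" prior, N(1, 1)
pooled = log_pool(x, [q1, q2], [0.5, 0.5])
dx = x[1] - x[0]
pooled_mean = float((x * pooled).sum() * dx)            # lands between 0 and 1
```

For two equal-variance normals with equal weights, the log pool is again a normal centred midway between the two means, which makes the result easy to check.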
The expected value of information and the probability of surprise
 Risk Anal
, 1999
"... Risk assessors attempting to use probabilistic approaches to describe uncertainty often find themselves in a datasparse situation: available data are only partially relevant to the parameter of interest, so one needs to adjust empirical distributions, use explicit judgmental distributions, or colle ..."
Abstract

Cited by 11 (1 self)
Risk assessors attempting to use probabilistic approaches to describe uncertainty often find themselves in a data-sparse situation: available data are only partially relevant to the parameter of interest, so one needs to adjust empirical distributions, use explicit judgmental distributions, or collect new data. In determining whether or not to collect additional data, whether by measurement or by elicitation of experts, it is useful to consider the expected value of the additional information. The expected value of information depends on the prior distribution used to represent current information; if the prior distribution is too narrow, in many risk-analytic cases the calculated expected value of information will be biased downward. The well-documented tendency toward overconfidence, including the neglect of potential surprise, suggests this bias may be substantial. We examine the expected value of information, including the role of surprise, test for bias in estimating the expected value of information, and suggest procedures to guard against overconfidence and underestimation of the expected value of information when developing prior distributions and when combining distributions obtained from multiple experts. The methods are illustrated with applications to potential carcinogens in food, commercial energy demand, and global climate change. KEY WORDS: Probability; uncertainty; data; risk assessment.
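The quantity at the centre of this abstract, the expected value of (perfect) information, is EVPI = E_θ[max_a u(a, θ)] − max_a E_θ[u(a, θ)]: the gain from choosing after learning θ rather than on the prior alone. A Monte Carlo sketch with a hypothetical two-action problem (the prior and utilities are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)

def evpi(theta_samples, actions):
    """Expected value of perfect information by Monte Carlo.
    `actions`: {name: utility_function(theta)}."""
    U = np.array([[u(t) for t in theta_samples] for u in actions.values()])
    value_with_info = U.max(axis=0).mean()   # choose after learning theta
    value_now = U.mean(axis=1).max()         # choose on the prior alone
    return float(value_with_info - value_now)

theta = rng.normal(0.0, 1.0, 100_000)                 # prior over the unknown
actions = {"act": lambda t: t, "pass": lambda t: 0.0}
v = evpi(theta, actions)
```

Here the analytic answer is E[max(θ, 0)] = 1/√(2π) ≈ 0.40, and the abstract's point can be seen directly: shrinking the prior standard deviation shrinks the computed value of information, so an overconfident (too-narrow) prior understates it.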
A Comparison between Single and Combined Backpropagation Neural Networks in the Prediction of Turnover.
 Applications of Artificial Intelligence
, 1998
"... Artificial neural networks are now being extensively used in the area of marketing analysis as they are well suited to this type of nonlinear problem. A retail company planned to improve its performance by using neural networks to predict turnover and data used in the experiment was provided by the ..."
Abstract

Cited by 5 (0 self)
Artificial neural networks are now being extensively used in the area of marketing analysis, as they are well suited to this type of nonlinear problem. A retail company planned to improve its performance by using neural networks to predict turnover, and the data used in the experiment was provided by the company. The study compares the performance of a combination of neural networks to that of a single neural network. The results show that backpropagation neural networks are effective tools which can give good results in solving a nonlinear prediction problem, even when the data is poorly represented.
1 Introduction
In this section we look at the problem area of forecasting and discuss issues concerning sales forecasting in business. We also give a brief introduction to artificial neural networks. Finally, the structure of this paper is outlined.
1.1 Problem area
Forecasting the future has always been an exciting and interesting problem for researchers. Understanding the world and its event...
Logarithmic Pooling of Priors Linked by a Deterministic Simulation Model
 Journal of Computational and Graphical Statistics
, 1999
"... We consider Bayesian inference when priors and likelihoods are both available for inputs and outputs of a deterministic simulation model. This problem is fundamentally related to the issue of aggregating (i.e. pooling) expert opinion. We survey alternative strategies for aggregation, then describe c ..."
Abstract

Cited by 4 (1 self)
We consider Bayesian inference when priors and likelihoods are both available for the inputs and outputs of a deterministic simulation model. This problem is fundamentally related to the issue of aggregating (i.e., pooling) expert opinion. We survey alternative strategies for aggregation, then describe computational approaches for implementing pooled inference for simulation models. Our approach (1) numerically transforms all priors to the same space, (2) uses log pooling to combine priors, and (3) then draws standard Bayesian inference. We use importance sampling methods, including an iterative, adaptive approach which is more flexible and has less bias in some instances than a simpler alternative. Our exploratory examples are the first steps toward extension of the approach for highly complex and even non-invertible models. Key Words: Prior Coherization, Adaptive Importance Sampling, Bayesian Statistics, Model Inversion.
1 Introduction
Much research of natural processes and systems is bas...
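The importance-sampling step the abstract mentions can be sketched as follows: draw from one prior as the proposal and weight each draw by the ratio of the (unnormalized) log-pooled density to the proposal density. The two normal priors, equal pooling weights, and fixed sample size here are illustrative assumptions, not the paper's adaptive scheme:

```python
import numpy as np

rng = np.random.default_rng(1)

def pooled_mean_is(n=200_000):
    """Self-normalised importance sampling for the mean of the log pool of
    q1 = N(0,1) and q2 = N(1,1) with equal weights, using q1 as proposal."""
    x = rng.normal(0.0, 1.0, n)                   # draws from q1
    # log[pool(x) / q1(x)] with pool proportional to q1^0.5 * q2^0.5:
    logw = 0.25 * x**2 - 0.25 * (x - 1.0)**2
    w = np.exp(logw - logw.max())                 # stabilised weights
    return float((w * x).sum() / w.sum())

m = pooled_mean_is()
```

In this contrived case the pooled density is exactly N(0.5, 1), so the estimate should land near 0.5; the iterative, adaptive scheme in the paper exists precisely because a fixed proposal like this can be a poor match for harder pools.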
A Market Framework for Pooling Opinions
, 1998
"... Consider a group of Bayesians, each with a subjective probability distribution over a set of uncertain events. An opinion pool derives a single consensus distribution over the events, representative of the group as a whole. Several pooling functions have been proposed, each sensible under particular ..."
Abstract

Cited by 4 (4 self)
Consider a group of Bayesians, each with a subjective probability distribution over a set of uncertain events. An opinion pool derives a single consensus distribution over the events, representative of the group as a whole. Several pooling functions have been proposed, each sensible under particular assumptions or measures. Many researchers over many years have failed to form a consensus on which method is best. We propose a market-based pooling procedure and analyze its properties. Participants bet on securities, each paying off contingent on an uncertain event, so as to maximize their own expected utilities. The consensus probability of each event is defined as the corresponding security's equilibrium price. The market framework provides explicit monetary incentives for participation and honesty, and allows agents to maintain individual rationality and limited privacy. "No arbitrage" arguments ensure that the equilibrium prices form legal probabilities. We show that, when events are...
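To make the price-as-consensus idea concrete, here is a sketch of one well-known special case, offered under the assumption (not stated in this abstract) that every trader has logarithmic utility: the equilibrium prices of the event-contingent (Arrow) securities then reduce to a budget-weighted linear pool of the traders' beliefs. General utilities require an actual equilibrium computation, and the names and numbers below are illustrative:

```python
def market_pool_log_utility(beliefs, budgets):
    """Equilibrium Arrow-security prices when all traders have log utility:
    a budget-weighted linear pool of beliefs. `beliefs` is a list of
    probability vectors over the events; `budgets` the traders' wealths."""
    total = float(sum(budgets))
    n_events = len(beliefs[0])
    return [sum(b * p[s] for b, p in zip(budgets, beliefs)) / total
            for s in range(n_events)]

# Two traders with beliefs (0.8, 0.2) and (0.4, 0.6), budgets 1 and 3:
prices = market_pool_log_utility([[0.8, 0.2], [0.4, 0.6]], [1.0, 3.0])
```

Consistent with the abstract's no-arbitrage point, the prices are nonnegative and sum to one, so they form legal probabilities.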