Results 1–10 of 4,122
Posterior probability maps and SPMs
NeuroImage, 2003
"... This technical note describes the construction of posterior probability maps that enable conditional or Bayesian inferences about regionally specific effects in neuroimaging. Posterior probability maps are images of the probability or confidence that an activation exceeds some specified threshold ..."
Cited by 56 (9 self)
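The quantity this entry describes — the probability that an effect exceeds a threshold — can be sketched for a single voxel, assuming a Gaussian posterior over the effect size (the function name and all numbers here are hypothetical, not from the paper):

```python
import math

def exceedance_probability(post_mean, post_sd, threshold):
    """P(effect > threshold) under a Normal(post_mean, post_sd) posterior."""
    z = (threshold - post_mean) / post_sd
    # 1 - Phi(z), via the standard-library error function
    return 0.5 * (1.0 - math.erf(z / math.sqrt(2.0)))

# One voxel: posterior mean 2.0, posterior sd 0.8, threshold 1.0
p = exceedance_probability(2.0, 0.8, 1.0)
```

A posterior probability map is simply this value computed at every voxel and displayed as an image.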
Posterior Probability on Finite Set
2012
"... In [14] we formalized probability and probability distribution on a finite sample space. In this article first we propose a formalization of the class of finite sample spaces whose element's probability distributions are equivalent with each other. Next, we formalize the probability measure of the class of sample spaces we have formalized above. Finally, we formalize the sampling and posterior probability."
Cited by 1 (1 self)
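On a finite sample space, the posterior probability this entry formalizes reduces to Bayes' rule over a finite set of hypotheses. A minimal sketch (the example priors and likelihoods are hypothetical, using exact rationals to mirror the finite setting):

```python
from fractions import Fraction

def posterior(prior, likelihood):
    """Bayes' rule on a finite hypothesis set.
    prior: {h: P(h)}, likelihood: {h: P(observation | h)}."""
    joint = {h: prior[h] * likelihood[h] for h in prior}
    evidence = sum(joint.values())
    return {h: j / evidence for h, j in joint.items()}

# Hypothetical example: is a coin fair or biased, given one observed head?
prior = {"fair": Fraction(1, 2), "biased": Fraction(1, 2)}
lik_heads = {"fair": Fraction(1, 2), "biased": Fraction(3, 4)}
post = posterior(prior, lik_heads)
```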
Posterior Probability Decoding, Confidence Estimation and System Combination
2000
"... In this paper the estimation of word posterior probabilities is discussed and their application in the CU-HTK system used in the March 2000 Hub5 Conversational Telephone Speech evaluation is described. The word lattices produced by the Viterbi decoder were used to generate confusion networks, which ..."
Cited by 73 (5 self)
N-Gram Posterior Probabilities for Statistical Machine Translation
in Proceedings of the NAACL Workshop on SMT, 2006
"... Word posterior probabilities are a common approach for confidence estimation in automatic speech recognition and machine translation. We will generalize this idea and introduce n-gram posterior probabilities and show how these can be used to improve translation quality. Additionally, we will int ..."
Cited by 34 (4 self)
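The general idea behind n-gram posteriors can be sketched from an n-best list rather than a full lattice: an n-gram's posterior is the total normalized weight of the hypotheses containing it. This is a simplified illustration, not the paper's method; the function, sentences, and scores are hypothetical:

```python
import math
from collections import defaultdict

def ngram_posteriors(nbest, n=2):
    """Posterior of each n-gram: summed normalized weight of hypotheses
    containing it, from an n-best list of (sentence, log_score) pairs."""
    m = max(s for _, s in nbest)           # subtract max for stability
    weights = [math.exp(s - m) for _, s in nbest]
    z = sum(weights)
    post = defaultdict(float)
    for (sent, _), w in zip(nbest, weights):
        tokens = sent.split()
        # each n-gram counts once per hypothesis, however often it occurs
        ngrams = {tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}
        for g in ngrams:
            post[g] += w / z
    return dict(post)

nbest = [("the cat sat", -1.0), ("the cat ran", -1.5), ("a cat sat", -2.0)]
p = ngram_posteriors(nbest, n=2)
```

Here `p[("the", "cat")]` accumulates weight from the first two hypotheses, so it exceeds the posterior of any bigram appearing in only one hypothesis.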
Estimating Posterior Probabilities in Classification Problems with Neural Networks
1996
"... Classification problems are used to determine the group membership of multidimensional objects and are prevalent in every organization and discipline. Central to the classification determination is the posterior probability. This paper introduces the theory and applications of the classification p ..."
Cited by 7 (2 self)
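The standard way a neural network's raw output scores are turned into posterior probability estimates is a softmax over the classes — positive values that sum to one. A minimal sketch (the scores are hypothetical, standing in for a trained network's outputs):

```python
import math

def softmax(logits):
    """Map a network's output scores to posterior estimates P(class | x):
    positive, summing to one, ordered like the input scores."""
    m = max(logits)                         # subtract max for stability
    exps = [math.exp(v - m) for v in logits]
    z = sum(exps)
    return [e / z for e in exps]

# Hypothetical output scores for one input, three classes
posteriors = softmax([2.0, 0.5, -1.0])
```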
Classifier Conditional Posterior Probabilities
in Joint IAPR International Workshops on Advances in Pattern Recognition, 1998
"... Classifiers based on probability density estimates can be used to find posterior probabilities for the objects to be classified. These probabilities can be used for rejection or for combining classifiers. Posterior probabilities for other classifiers, however, have to be conditional for the classifi ..."
Cited by 23 (7 self)
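The first route this entry mentions — posteriors from density estimates — is Bayes' rule applied to per-class densities and priors. A sketch assuming one-dimensional Gaussian class-conditional densities (the classes, parameters, and function names are hypothetical):

```python
import math

def normal_pdf(x, mean, sd):
    """Density of Normal(mean, sd) at x."""
    return math.exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def density_posterior(x, classes):
    """P(class | x) from class priors and density estimates.
    classes: {label: (prior, mean, sd)}, assuming 1-D Gaussian densities."""
    joint = {c: pr * normal_pdf(x, mu, sd) for c, (pr, mu, sd) in classes.items()}
    z = sum(joint.values())
    return {c: j / z for c, j in joint.items()}

# Hypothetical two-class problem with equal priors
classes = {"A": (0.5, 0.0, 1.0), "B": (0.5, 2.0, 1.0)}
post = density_posterior(0.0, classes)
```

A rejection rule then follows directly: refuse to classify when the largest posterior falls below some confidence threshold.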
Generalizing Swendsen-Wang to Sampling Arbitrary Posterior Probabilities
2007
"... Many vision tasks can be formulated as graph partition problems that minimize energy functions. For such problems, the Gibbs sampler [9] provides a general solution but is very slow, while other methods, such as Ncut [24] and graph cuts [4], [22], are computationally effective but only work ..."
"... the split, merge, and regrouping of a "chunk" of the graph, in contrast to the Gibbs sampler, which flips a single vertex. We prove that this algorithm simulates ergodic and reversible Markov chain jumps in the space of graph partitions and is applicable to arbitrary posterior probabilities or energy functions ..."
Cited by 77 (17 self)
Posterior Probability Intervals for Wavelet Thresholding
J. Royal Statist. Soc. Ser. B, 2001
"... We use cumulants to derive Bayesian credible intervals for wavelet regression estimates. The first four cumulants of the posterior distribution of the estimates are expressed in terms of the observed data and integer powers of the mother wavelet functions. These powers are closely approximated by li ..."
Cited by 9 (2 self)
Speech/Music Discrimination Based on Posterior Probability Features
1999
"... A hybrid connectionist-HMM speech recognizer uses a neural network acoustic classifier. This network estimates the posterior probability that the acoustic feature vectors at the current time step should be labelled as each of around 50 phone classes. We sought to exploit informal observations of the ..."
Cited by 51 (9 self)