Results 1-8 of 8
Latent Dirichlet Allocation
Journal of Machine Learning Research, 2003
Abstract

Cited by 2350 (63 self)
We describe latent Dirichlet allocation (LDA), a generative probabilistic model for collections of discrete data such as text corpora. LDA is a three-level hierarchical Bayesian model, in which each item of a collection is modeled as a finite mixture over an underlying set of topics. Each topic is, in turn, modeled as an infinite mixture over an underlying set of topic probabilities. In the context of text modeling, the topic probabilities provide an explicit representation of a document. We present efficient approximate inference techniques based on variational methods and an EM algorithm for empirical Bayes parameter estimation. We report results in document modeling, text classification, and collaborative filtering, comparing to a mixture of unigrams model and the probabilistic LSI model.
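As a rough illustration of the generative process this abstract describes, here is a minimal self-contained Python sketch of forward sampling from an LDA-style model. The toy vocabulary, topics, and hyperparameters are all invented for illustration; this is not the paper's variational inference algorithm, only the document-generation story.

```python
import random

def sample_dirichlet(alpha):
    # Draw from Dirichlet(alpha) via normalized Gamma draws.
    draws = [random.gammavariate(a, 1.0) for a in alpha]
    total = sum(draws)
    return [d / total for d in draws]

def generate_document(n_words, alpha, topics, vocab):
    # theta: per-document topic proportions (the "finite mixture over topics").
    theta = sample_dirichlet(alpha)
    words = []
    for _ in range(n_words):
        z = random.choices(range(len(topics)), weights=theta)[0]  # pick a topic
        w = random.choices(vocab, weights=topics[z])[0]           # pick a word from it
        words.append(w)
    return words

# Toy setup: 2 topics over a 4-word vocabulary (all values hypothetical).
vocab = ["gene", "dna", "ball", "game"]
topics = [[0.45, 0.45, 0.05, 0.05],   # "biology" topic
          [0.05, 0.05, 0.45, 0.45]]   # "sports" topic
doc = generate_document(10, alpha=[0.5, 0.5], topics=topics, vocab=vocab)
print(doc)
```

Inference in the paper runs this story in reverse: given documents, it estimates the topics and per-document proportions.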
Framing human inference by coherence-based probability logic
Journal of Applied Logic, 2009
Abstract

Cited by 10 (7 self)
We take coherence-based probability logic as the basic reference theory to model human deductive reasoning. The conditional and probabilistic argument forms are explored. We give a brief overview of recent developments in combining logic and probability in psychology. A study on conditional inferences illustrates our approach. First steps towards a process model of conditional inferences conclude the paper.
Testing the Untestable: Reliability in the 21st Century
IEEE Transactions on Software Reliability, 2002
Abstract

Cited by 3 (0 self)
... and industry are relying more and more on science’s advanced methods to determine reliability. Unfortunately, political, economic, time, and other constraints imposed by the real world inhibit the ability of researchers to calculate reliability efficiently and accurately. Because of such constraints, reliability must undergo an evolutionary change. The first step in this evolution is to reinterpret the concept so that it meets the new century’s needs. The next step is to quantify reliability using both empirical methods and auxiliary data sources, such as expert knowledge, corporate memory, and mathematical modeling and simulation.
How to Deal with Partially Analyzed Acts? A Proposal
Abstract

Cited by 1 (0 self)
In some situations, a decision is best represented by an incompletely analyzed act: conditionally on a certain event, the consequences of the decision on sub-events are perfectly known and uncertainty becomes expressible through probabilities, whereas the plausibility of this event itself remains vague and the decision outcome on the complementary event is imprecisely known. In this framework, we study an axiomatic decision model and prove a representation theorem. Decision criteria must aggregate partial evaluations consisting of: (i) the conditional expected utility associated with the analyzed part of the decision, and (ii) the best and worst outcomes of its non-analyzed part.
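One way to picture such a criterion, in our own illustrative notation (not taken from the paper): for an act $f$ analyzed only on the event $A$, an evaluation of the form

\[
V(f) \;=\; \Phi\Big( \mathbb{E}[\,u(f) \mid A\,],\;\; \min_{s \notin A} u(f(s)),\;\; \max_{s \notin A} u(f(s)) \Big)
\]

aggregates exactly the three partial evaluations listed above, where $\Phi$ is some aggregation function; the paper's representation theorem characterizes which aggregations are compatible with its axioms.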
Testing the Untestable: Reliability in the 21st Century
Abstract
Department of Energy under contract W-7405-ENG-36. By acceptance of this article, the publisher recognizes that the U.S. Government retains a non-exclusive, royalty-free license to publish or reproduce the published form of this contribution, or to allow others to do so, for U.S. Government purposes. Los Alamos National Laboratory requests that the publisher identify this article as work performed under the auspices of the U.S. Department of Energy. Los Alamos National Laboratory strongly supports academic freedom and a researcher's right to publish; as an institution, however, the Laboratory does not endorse the viewpoint of a publication or guarantee its technical correctness.
On Consistent and Calibrated Inference about the Parameters of Sampling Distributions
2005
Abstract
The theory of probability, based on very general rules referred to as the Cox-Pólya-Jaynes Desiderata, can be used both as a theory of random mass phenomena and as a quantitative theory of plausible inference about the parameters of sampling distributions. The existing applications of the Desiderata must be extended in order to allow for consistent inferences in the limit of complete a priori ignorance about the values of the parameters. Since the limits of consistent quantitative inference from incomplete information can clearly be established, the developed theory is necessarily an effective one. It is interesting to note that when applying the Desiderata strictly, we find no contradictions between the so-called Bayesian and frequentist schools of inductive reasoning.
Constructive Representation Theory for the Feynman Operator Calculus
2007
Abstract
In this paper, we survey recent progress on the constructive theory of the Feynman operator calculus. We first develop an operator version of the Henstock-Kurzweil integral, and a new Hilbert space that allows us to construct the elementary path integral in the manner originally envisioned by Feynman. After developing our time-ordered operator theory, we extend a few of the important theorems of semigroup theory, including the Hille-Yosida theorem. As an application, we unify and extend the theory of time-dependent parabolic and hyperbolic evolution equations. We then develop a general perturbation theory and use it to prove that all theories generated by semigroups are asymptotic in the operator-valued sense of Poincaré. This allows us to provide a general theory for the interaction representation of relativistic quantum theory. We then show that our theory can be reformulated as a physically motivated sum over paths, and use this version to extend the Feynman path integral to include more general interactions. Our approach is independent of the space of continuous functions and thus makes the question of the existence of a measure more of a natural expectation than a death blow to the foundations of the Feynman integral.
Imprecise probabilities, bets and functional analytic methods in Łukasiewicz logic
2011
Abstract
In his foundation of probability theory, Bruno de Finetti devised a betting scheme where a bookmaker offers bets on the outcome of events φ occurring in the future. He introduced a criterion for coherent bookmaking, and showed that coherent betting odds are given by some probability distribution. While de Finetti dealt with yes-no events and Boolean propositional logic, Mundici generalized the theory to the continuous-spectrum events formalized within Łukasiewicz logic. Both de Finetti and Mundici assume that the bookmaker/bettor roles can be interchanged. In this paper we deal with a more realistic situation, dropping the reversibility assumption. Working in the framework of Łukasiewicz logic, we introduce a coherence criterion for non-reversible bookmaking. Our main tool is given by 'imprecise probabilities', which are formulated either in terms of compact convex sets of probabilities or, equivalently, in terms of suitable sublinear functionals (see Section 5). Our main result is Theorem 8.3, which states that our coherence criterion arises from imprecise probabilities just as de Finetti's criterion arises from probabilities. Throughout, we will work with MV-algebras. They play the same role for Łukasiewicz logic as Boolean algebras play for classical logic. Unital abelian lattice-ordered groups will provide an intermediate structure: while being categorically equivalent to MV-algebras, they are more akin to the Banach space C(X). Functional analytic methods, developed in Section 6, are used for the proof of our main result.
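The "compact convex sets of probabilities" view mentioned in the abstract can be summarized in standard imprecise-probability notation (our notation, not necessarily the paper's): for a compact convex set $\mathcal{K}$ of probability measures, the lower and upper betting odds on an event $\varphi$ are the envelopes

\[
\underline{P}(\varphi) \;=\; \min_{P \in \mathcal{K}} P(\varphi),
\qquad
\overline{P}(\varphi) \;=\; \max_{P \in \mathcal{K}} P(\varphi),
\]

and the upper envelope $\overline{P}$, as a maximum of linear functionals, is sublinear, which is what connects the convex-set formulation to the sublinear-functional one.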