Results 1 - 10 of 477,502
Estimation and Inference in Econometrics, 1993
Abstract
Cited by 1151 (3 self)
The astonishing increase in computer performance over the past two decades has made it possible for economists to base many statistical inferences on simulated, or bootstrap, distributions rather than on distributions obtained from asymptotic theory. In this paper, I review some of the basic ideas ...
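As a toy illustration of the bootstrap idea this abstract describes, the sketch below computes a percentile bootstrap interval for a sample mean; the data, resample count, and confidence level are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=5.0, scale=2.0, size=100)   # stand-in sample

# Draw B resamples with replacement and recompute the statistic each time;
# the empirical quantiles of those replicates give a percentile interval.
B = 2000
boot_means = np.array([
    rng.choice(data, size=data.size, replace=True).mean()
    for _ in range(B)
])
lo, hi = np.percentile(boot_means, [2.5, 97.5])   # 95% percentile interval
```

The same recipe applies to any statistic whose asymptotic distribution is awkward; only the line computing `.mean()` changes.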
Bounded interpersonal inferences and decision making.
Economic Theory, 2002
Abstract
Cited by 8 (6 self)
Individual decision making is based on predictions about other players’ choices as well as on valuations of reactions to predictions. In this sense, a player has a prediction-decision criterion for decision making. We develop a theory of prediction-decision criteria, which enables us to capture new phenomena in individual decision making in games. The decision-making situation is described in the epistemic logic GLEF of shallow depths. There, each player considers his and other players’ decision making down to some shallow depths. It is a point of our theory to investigate inferential complexities of interpersonal introspections. In particular, we can discuss a minimal epistemic inferential structure for prediction-decision making. We will find parallel structures in decision making and prediction making, which we call an inner parallelism. The climax of the paper is the consideration of inner parallelisms of prediction-decision making.
Conditional Lower Bounds ... Probabilistic Inference, 2009
Abstract
The Inference problem in probabilistic networks (given a stochastic variable V, what is the posterior probability that V = v given evidence e?) has been proven to be intractable; in fact, it has a PP-complete decision variant [17]. The currently most efficient algorithms for this problem are all exponential in the treewidth of the moralised graph of the network. We prove, using a recent result of Marx [18], that these algorithms are in some sense optimal: we prove a lower bound of f(G) · n^ω(tw(G) / log tw(G)) for any algorithm solving arbitrary instances of Inference with graph G, unless the ETH fails.
Reasoning the fast and frugal way: Models of bounded rationality
Psychological Review, 1996
Abstract
Cited by 583 (28 self)
Humans and animals make inferences about the world under limited time and knowledge. In contrast, many models of rational inference treat the mind as a Laplacean Demon, equipped with unlimited time, knowledge, and computational might. Following H. Simon’s notion of satisficing, the authors have proposed ...
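A minimal sketch of one heuristic in the fast-and-frugal family, Take The Best: compare two objects cue by cue in descending order of cue validity, and stop at the first cue that discriminates. The cue names and validities below are invented for illustration:

```python
def take_the_best(obj_a, obj_b, cues):
    """Decide which object scores higher on the criterion.

    cues: list of (name, validity) pairs sorted by validity, descending.
    obj_a, obj_b: dicts mapping cue name -> 1 (cue present) or 0 (absent).
    """
    for name, _validity in cues:
        a, b = obj_a[name], obj_b[name]
        if a != b:             # one-reason decision: first discriminating cue wins
            return "a" if a > b else "b"
    return "guess"             # no cue discriminates

# Illustrative cue hierarchy for a city-size comparison
cues = [("has_capital_status", 0.9), ("has_major_airport", 0.7)]
city_x = {"has_capital_status": 1, "has_major_airport": 0}
city_y = {"has_capital_status": 0, "has_major_airport": 1}
winner = take_the_best(city_x, city_y, cues)   # the first cue already decides
```

The point of the heuristic is what it omits: no weighting or integration of the remaining cues, which is exactly the satisficing shortcut the abstract contrasts with the "Laplacean Demon" models.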
Dynamic Bayesian Networks: Representation, Inference and Learning, 2002
Abstract
Cited by 758 (3 self)
Modelling sequential data is important in many areas of science and engineering. Hidden Markov models (HMMs) and Kalman filter models (KFMs) are popular for this because they are simple and flexible. For example, HMMs have been used for speech recognition and biosequence analysis, and KFMs have been ... space instead of O(T); a simple way of using the junction tree algorithm for online inference in DBNs; new complexity bounds on exact online inference in DBNs; a new deterministic approximate inference algorithm called factored frontier; an analysis of the relationship between the BK algorithm and loopy ...
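As an illustration of online inference in the simplest DBN, the sketch below runs an HMM forward-filtering update that carries only the current belief state rather than the whole observation history; all probabilities are invented for the example:

```python
import numpy as np

# Toy 2-state HMM: A[i, j] = P(next=j | cur=i), B[i, o] = P(obs=o | state=i)
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])
B = np.array([[0.7, 0.3],
              [0.1, 0.9]])
pi = np.array([0.5, 0.5])

def filter_step(alpha, obs):
    """One forward-filtering update: predict, weight by likelihood, renormalize.
    O(K^2) time and O(K) memory per step, independent of sequence length."""
    alpha = B[:, obs] * (A.T @ alpha)
    return alpha / alpha.sum()

alpha = pi
for obs in [0, 0, 1, 1, 1]:        # stream observations one at a time
    alpha = filter_step(alpha, obs)
```

Because each step consumes one observation and overwrites `alpha`, memory use does not grow with the sequence length T, which is the kind of online-inference saving the abstract alludes to.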
Graphical models, exponential families, and variational inference, 2008
Abstract
Cited by 800 (26 self)
The formalism of probabilistic graphical models provides a unifying framework for capturing complex dependencies among random variables, and building large-scale multivariate statistical models. Graphical models have become a focus of research in many statistical, computational and mathematical fields ... all be understood in terms of exact or approximate forms of these variational representations. The variational approach provides a complementary alternative to Markov chain Monte Carlo as a general source of approximation methods for inference in large-scale statistical models.
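A minimal sketch of one variational method in this spirit: naive mean field for a tiny Ising model p(x) ∝ exp(Σ_i θ_i x_i + Σ_{i<j} J_ij x_i x_j) with x_i ∈ {-1, +1}, where the fully factorized approximation yields the classic fixed-point updates m_i = tanh(θ_i + Σ_j J_ij m_j). The couplings and fields below are invented for illustration:

```python
import numpy as np

J = np.array([[0.0, 0.5, 0.5],     # symmetric ferromagnetic couplings
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])
theta = np.array([0.4, 0.0, 0.0])  # external field biasing node 0 toward +1

m = np.zeros(3)                    # mean-field magnetizations E_q[x_i]
for _ in range(200):               # coordinate-ascent sweeps to a fixed point
    for i in range(3):
        m[i] = np.tanh(theta[i] + J[i] @ m)
```

Each sweep is a coordinate ascent on the variational objective, so the iteration drives the factorized q toward a (local) best approximation of p, one deterministic alternative to sampling from p with MCMC.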
Loopy Belief Propagation for Approximate Inference: An Empirical Study
In Proceedings of Uncertainty in AI, 1999
Abstract
Cited by 680 (18 self)
Recently, researchers have demonstrated that "loopy belief propagation" (the use of Pearl's polytree algorithm in a Bayesian network with loops) can perform well in the context of error-correcting codes. The most dramatic instance of this is the near Shannon-limit performance ... inference scheme in a more general setting? We compare the marginals computed using loopy propagation to the exact ones in four Bayesian network architectures, including two real-world networks: ALARM and QMR. We find that the loopy beliefs often converge and, when they do, they give a good ...
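A minimal sketch of loopy (sum-product) propagation on the smallest loopy graph, a three-node cycle of binary variables, with the resulting beliefs compared against brute-force marginals; all potentials below are invented for illustration:

```python
import numpy as np
from itertools import product

psi = np.array([[2.0, 1.0],        # symmetric attractive pairwise potential
                [1.0, 2.0]])
phi = np.array([[3.0, 1.0],        # unary potential biasing node 0 to state 0
                [1.0, 1.0],
                [1.0, 1.0]])
edges = [(0, 1), (1, 2), (2, 0)]
directed = edges + [(j, i) for i, j in edges]

msgs = {e: np.ones(2) for e in directed}
for _ in range(50):                          # iterate; convergence is not guaranteed in general
    new = {}
    for i, j in directed:
        incoming = phi[i].copy()             # unary potential at i
        for k, l in directed:
            if l == i and k != j:            # all messages into i except the one from j
                incoming *= msgs[(k, i)]
        m = psi @ incoming                   # marginalize out x_i (psi is symmetric)
        new[(i, j)] = m / m.sum()
    msgs = new

beliefs = np.array([
    phi[i] * np.prod([msgs[(k, l)] for k, l in directed if l == i], axis=0)
    for i in range(3)
])
beliefs /= beliefs.sum(axis=1, keepdims=True)

# Exact marginals by brute-force enumeration, for comparison
exact = np.zeros((3, 2))
for x in product([0, 1], repeat=3):
    p = phi[0][x[0]] * phi[1][x[1]] * phi[2][x[2]]
    for i, j in edges:
        p *= psi[x[i], x[j]]
    for i in range(3):
        exact[i, x[i]] += p
exact /= exact.sum(axis=1, keepdims=True)
```

On a tree the same updates would be exact; on the cycle the beliefs are only approximations, and comparing them to `exact` is, in miniature, the empirical question the paper studies on ALARM and QMR.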
Transductive Inference for Text Classification using Support Vector Machines, 1999
Abstract
Cited by 887 (4 self)
This paper introduces Transductive Support Vector Machines (TSVMs) for text classification. While regular Support Vector Machines (SVMs) try to induce a general decision function for a learning task, Transductive Support Vector Machines take into account a particular test set and try to minimize misclassifications of just those particular examples. The paper presents an analysis of why TSVMs are well suited for text classification. These theoretical findings are supported by experiments on three test collections. The experiments show substantial improvements over inductive methods, especially for small training sets, cutting the number of labeled training examples down to a twentieth on some tasks. This work also proposes an algorithm for training TSVMs efficiently, handling 10,000 examples and more.
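Joachims' actual TSVM training procedure is considerably more involved; the toy loop below only illustrates the transductive idea (letting the particular unlabeled test set shape the decision boundary) via confidence-ranked self-training with scikit-learn's `SVC`. The data, batch size, and confidence criterion are all invented for the example:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Two well-separated point clouds standing in for document vectors;
# only three examples per class carry labels.
X_pos = rng.normal([2.0, 2.0], 0.5, size=(50, 2))
X_neg = rng.normal([-2.0, -2.0], 0.5, size=(50, 2))
X_lab = np.vstack([X_pos[:3], X_neg[:3]])
y_lab = np.array([1, 1, 1, -1, -1, -1])
pool = np.vstack([X_pos[3:], X_neg[3:]])     # the unlabeled "test" points

X_cur, y_cur = X_lab, y_lab
while len(pool) > 0:
    clf = SVC(kernel="linear", C=1.0).fit(X_cur, y_cur)
    scores = clf.decision_function(pool)
    take = np.argsort(-np.abs(scores))[:10]  # points the SVM is most confident about
    X_cur = np.vstack([X_cur, pool[take]])
    y_cur = np.concatenate([y_cur, np.where(scores[take] > 0, 1, -1)])
    pool = np.delete(pool, take, axis=0)
```

Unlike induction, the loop never aims at a general rule: it only drives its labels onto this particular pool, which is the transductive setting the paper formalizes for text.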