Results 1–10 of 14
Random number generation
"... Random numbers are the nuts and bolts of simulation. Typically, all the randomness required by the model is simulated by a random number generator whose output is assumed to be a sequence of independent and identically distributed (IID) U(0, 1) random variables (i.e., continuous random variables dis ..."
Abstract

Cited by 136 (30 self)
 Add to MetaCart
Random numbers are the nuts and bolts of simulation. Typically, all the randomness required by the model is simulated by a random number generator whose output is assumed to be a sequence of independent and identically distributed (IID) U(0, 1) random variables (i.e., continuous random variables distributed uniformly over the interval (0, 1)).
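The standard setup this abstract describes, in which all of a simulation's randomness is derived from a single stream of U(0, 1) draws, can be sketched with the classical inverse-CDF (inversion) transform. This is a minimal illustration of that general technique, not code from the paper; the function name and parameters are my own.

```python
import math
import random

rng = random.Random(42)  # the underlying U(0, 1) generator driving everything

def exponential_from_uniform(rate):
    """Turn one U(0, 1) draw into an Exponential(rate) variate
    via inversion: X = -ln(1 - U) / rate."""
    u = rng.random()  # IID U(0, 1) output of the generator
    return -math.log(1.0 - u) / rate

samples = [exponential_from_uniform(2.0) for _ in range(1000)]
print(sum(samples) / len(samples))  # should be near 1/rate = 0.5
```

Any other distribution with an invertible CDF can be driven from the same uniform stream in the same way, which is why the U(0, 1) generator is the single point of randomness in a typical simulation.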
A Natural Law of Succession
, 1995
"... Consider the following problem. You are given an alphabet of k distinct symbols and are told that the i th symbol occurred exactly ni times in the past. On the basis of this information alone, you must now estimate the conditional probability that the next symbol will be i. In this report, we presen ..."
Abstract

Cited by 35 (3 self)
 Add to MetaCart
Consider the following problem. You are given an alphabet of k distinct symbols and are told that the i-th symbol occurred exactly n_i times in the past. On the basis of this information alone, you must now estimate the conditional probability that the next symbol will be i. In this report, we present a new solution to this fundamental problem in statistics and demonstrate that our solution outperforms standard approaches, both in theory and in practice.
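The paper's own estimator is not reproduced in this abstract, but the classical baseline it competes against, Laplace's add-one rule of succession, is easy to state. This sketch shows that standard rule only; the function name is mine.

```python
def laplace_succession(counts):
    """Classical Laplace add-one estimate of P(next symbol = i):
    (n_i + 1) / (N + k), where N is the total count and k the
    alphabet size. (The paper proposes an alternative to this rule.)"""
    n = sum(counts)
    k = len(counts)
    return [(c + 1) / (n + k) for c in counts]

# Example: an alphabet of 3 symbols seen 2, 0, and 1 times.
probs = laplace_succession([2, 0, 1])
print(probs)  # [0.5, 0.1666..., 0.3333...]
```

Note that the unseen symbol still receives positive probability, which is the whole point of a succession rule.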
Spike: Intelligent scheduling of Hubble Space Telescope observations
 Intelligent Scheduling
, 1994
"... ..."
Further Results on Bayesian Method of Moments Analysis of the Multiple Regression Model
 Internat. Econom. Rev
, 1998
"... The Bayesian Method of Moments (BMOM) was introduced in 1994 to permit investigators to make inverse probability statements regarding parameters' possible values given the data when the form of the likelihood function is unknown. BMOM has been applied in analyses of several statistical and econometr ..."
Abstract

Cited by 12 (7 self)
 Add to MetaCart
The Bayesian Method of Moments (BMOM) was introduced in 1994 to permit investigators to make inverse probability statements regarding parameters' possible values given the data when the form of the likelihood function is unknown. BMOM has been applied in analyses of several statistical and econometric models, including location, multiple and multivariate regression, and simultaneous equation models. In Zellner (1996, 1997a) and Zellner and Sacks (1996), previous BMOM analyses of the multiple regression model have appeared that permit postdata densities for parameters and future observations to be derived without use of a likelihood function, prior density, or Bayes' Theorem. In the present paper, we extend those analyses by showing how information about a variance parameter and its relation to regression coefficients affects postdata densities. We also discuss estimation of functions of parameters and model selection techniques using BMOM and traditional Bayesian ...
Bayesian Method of Moments (BMOM) Analysis of Parametric and Semiparametric Regression Models
 South African Statistical Journal
, 1997
"... The Bayesian Method of Moments is applied to semiparametric regression models using alternative series expansions of an unknown regression function. We describe estimation loss functions, predictive loss functions and posterior odds as techniques to determine how many terms in a particular expans ..."
Abstract

Cited by 3 (0 self)
 Add to MetaCart
The Bayesian Method of Moments is applied to semiparametric regression models using alternative series expansions of an unknown regression function. We describe estimation loss functions, predictive loss functions and posterior odds as techniques to determine how many terms in a particular expansion to keep and how to choose among different types of expansions. The developed theory is then applied in a Monte Carlo experiment to data generated from a CES production function.
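The generic idea behind one of the tools this abstract names, using a predictive loss to decide how many terms of a series expansion to keep, can be sketched without any BMOM machinery. This is an illustration of that general idea under my own assumptions (a polynomial expansion, a holdout squared-error loss, simulated data), not the paper's procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unknown regression function, observed with noise.
x = np.linspace(0, 1, 200)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(x.size)

# Split into fit and holdout halves to score predictive loss.
train, hold = slice(0, None, 2), slice(1, None, 2)

def predictive_loss(degree):
    """Fit a polynomial series with degree+1 terms on the training
    half and return mean squared error on the holdout half."""
    coeffs = np.polyfit(x[train], y[train], degree)
    pred = np.polyval(coeffs, x[hold])
    return float(np.mean((y[hold] - pred) ** 2))

losses = {d: predictive_loss(d) for d in range(1, 9)}
best = min(losses, key=losses.get)
print(best, losses[best])
```

The expansion with the smallest predictive loss is kept; swapping in a different basis (trigonometric, spline) and comparing losses across bases mirrors the abstract's "how to choose among different types of expansions."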
Neural networks and belief logic
 Proceedings of the Fourth International Conference on Hybrid Intelligent Systems (HIS’04), IEEE
, 2004
"... Many researchers have observed that neurons process information in an imprecise manner if a logical inference emerges from neural computation, it is inexact at best. Thus, there must be a profound relationship between belief logic and neural networks. In Chen (2002), a plausible neural network mode ..."
Abstract

Cited by 1 (1 self)
 Add to MetaCart
Many researchers have observed that neurons process information in an imprecise manner; if a logical inference emerges from neural computation, it is inexact at best. Thus, there must be a profound relationship between belief logic and neural networks. In Chen (2002), a plausible neural network model that can compute probabilistic and possibilistic logic was proposed. In this article we further extend this model to continuous variables for function and relation estimation. We discuss why and how belief logic is derived from neural computation.
“Not only defended but also applied”: The perceived absurdity of Bayesian inference
, 2011
"... Abstract. The missionary zeal of many Bayesians has been matched, in the other direction, by a view among some theoreticians that Bayesian methods are absurd—not merely misguided but obviously wrong in principle. We consider several examples, beginning with Feller’s classic text on probability theor ..."
Abstract

Cited by 1 (0 self)
 Add to MetaCart
The missionary zeal of many Bayesians has been matched, in the other direction, by a view among some theoreticians that Bayesian methods are absurd—not merely misguided but obviously wrong in principle. We consider several examples, beginning with Feller’s classic text on probability theory and continuing with more recent cases such as the perceived Bayesian nature of the so-called doomsday argument. We analyze in this note the intellectual background behind various misconceptions about Bayesian statistics, without aiming at a complete historical coverage of the reasons for this dismissal.
Methods and Criteria for Model Selection
"... Model selection is an important part of any statistical analysis, and indeed is central to the pursuit of science in general. Many authors have examined this question, from both frequentist and Bayesian perspectives, and many tools for selecting the “best model ” have been suggested in the literatur ..."
Abstract
 Add to MetaCart
Model selection is an important part of any statistical analysis, and indeed is central to the pursuit of science in general. Many authors have examined this question, from both frequentist and Bayesian perspectives, and many tools for selecting the “best model” have been suggested in the literature. This paper considers the various proposals from a Bayesian decision-theoretic perspective.
Relative Distributional Methods
, 1997
"... Relative distribution methods are a non–parametric statistical framework for analyzing data in a fully distributional context. The methods combine the graphical tools of exploratory data analysis with a framework for statistical decomposition and inference. The relative distribution is similar to a ..."
Abstract
 Add to MetaCart
Relative distribution methods are a non-parametric statistical framework for analyzing data in a fully distributional context. The methods combine the graphical tools of exploratory data analysis with a framework for statistical decomposition and inference. The relative distribution is similar to a density ratio, and is based on the direct comparison of one distribution to another. It is technically defined as the random variable obtained by transforming a variable from a comparison group by the cumulative distribution function (CDF) of that variable for a reference group. This transformation produces a set of observations, the relative data, that represent the rank of the original comparison value in terms of the reference group’s CDF. The relative data preserve the information needed to compare the two original distributions. The density and CDF of the relative data can therefore be used to fully represent and analyze distributional differences. Analysis can move beyond comparisons of means and variances to fully tap the information inherent in distributions. The analytic framework is general and flexible, as the relative density is decomposable into location, shape and covariate effects.
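The transformation this abstract defines, passing each comparison-group observation through the reference group's CDF to obtain the "relative data", can be sketched with empirical CDFs. This is a minimal illustration of that definition under my own naming; a real analysis would use larger samples and smoothed density estimates.

```python
def empirical_cdf(reference):
    """Return the empirical CDF of the reference sample as a function."""
    sorted_ref = sorted(reference)
    n = len(sorted_ref)
    def cdf(value):
        # Proportion of reference observations <= value.
        return sum(1 for v in sorted_ref if v <= value) / n
    return cdf

def relative_data(comparison, reference):
    """Transform comparison observations by the reference CDF,
    yielding the relative data: each value becomes its rank
    (as a proportion) within the reference distribution."""
    cdf = empirical_cdf(reference)
    return [cdf(x) for x in comparison]

reference = [10, 20, 30, 40, 50]
comparison = [15, 35, 55]
print(relative_data(comparison, reference))  # [0.2, 0.6, 1.0]
```

If the two groups had identical distributions, the relative data would be approximately uniform on (0, 1); departures from uniformity are what the relative density then decomposes into location, shape, and covariate effects.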