Results 1 - 8 of 8
Nonlinear and Non-Gaussian State-Space Modeling with Monte Carlo Techniques: A Survey and Comparative Study
 In Rao, C., & Shanbhag, D. (Eds.), Handbook of Statistics, 2000
Abstract

Cited by 16 (4 self)
Since Kitagawa (1987) and Kramer and Sorenson (1988) proposed the filter and smoother using numerical integration, nonlinear and/or non-Gaussian state estimation problems have been studied extensively. Numerical integration becomes extremely computer-intensive as the dimension of the state vector grows. To overcome this problem, sampling techniques such as Monte Carlo integration with importance sampling, resampling, rejection sampling, Markov chain Monte Carlo and so on are utilized, which can be easily applied to multidimensional cases. Thus, in the last decade, several kinds of nonlinear and non-Gaussian filters and smoothers have been proposed using various computational techniques. The objective of this paper is to introduce the nonlinear and non-Gaussian filters and smoothers which can be applied to any nonlinear and/or non-Gaussian case. Moreover, through Monte Carlo studies, the procedures are compared under the root mean square error criterion.
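The sampling techniques listed in this abstract can be illustrated by a minimal bootstrap particle filter (sequential importance sampling with resampling). This is a sketch, not the paper's own implementation; the state-space model below is a standard nonlinear benchmark chosen for illustration, and the particle count and noise variances are assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative nonlinear, non-Gaussian-friendly state-space model:
#   x_t = 0.5*x_{t-1} + 25*x_{t-1}/(1 + x_{t-1}^2) + v_t,  v_t ~ N(0, 1)
#   y_t = x_t^2 / 20 + w_t,                                 w_t ~ N(0, 1)
def simulate(T):
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = 0.5 * x[t-1] + 25 * x[t-1] / (1 + x[t-1]**2) + rng.normal()
    y = x**2 / 20 + rng.normal(size=T)
    return x, y

def bootstrap_filter(y, n_particles=1000):
    T = len(y)
    est = np.zeros(T)
    p = rng.normal(size=n_particles)              # initial particle cloud
    for t in range(T):
        # propagate particles through the transition equation
        # (importance sampling with the prior as proposal)
        p = 0.5 * p + 25 * p / (1 + p**2) + rng.normal(size=n_particles)
        # weight each particle by the measurement likelihood
        w = np.exp(-0.5 * (y[t] - p**2 / 20)**2) + 1e-300
        w /= w.sum()
        est[t] = np.dot(w, p)                     # filtered mean E[x_t | y_1..y_t]
        # resample to avoid weight degeneracy
        p = p[rng.choice(n_particles, n_particles, p=w)]
    return est

x, y = simulate(100)
xhat = bootstrap_filter(y)
rmse = np.sqrt(np.mean((x - xhat)**2))
```

The root mean square error computed at the end mirrors the comparison criterion used in the survey; increasing `n_particles` trades computation for accuracy.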
On Markov Chain Monte Carlo Methods for Nonlinear and Non-Gaussian State-Space Models
 Communications in Statistics, Simulation and Computation, Vol. 28, No. 4, pp. 867, 1999
Abstract

Cited by 8 (2 self)
In this paper, a nonlinear and/or non-Gaussian smoother utilizing Markov chain Monte Carlo methods is proposed, where the measurement and transition equations are specified in any general formulation and the error terms in the state-space model are not necessarily normal. The random draws are directly generated from the smoothing densities. For random number generation, the Metropolis-Hastings algorithm and the Gibbs sampling technique are utilized. The proposed procedure is simple and easy to program, compared with the existing nonlinear and non-Gaussian smoothing techniques. Moreover, taking several candidates for the proposal density function, we examine the precision of the proposed estimator.
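The Metropolis-Hastings algorithm named in this abstract can be sketched in a few lines. This is a generic random-walk variant on a hypothetical one-dimensional non-Gaussian target (a Laplace density), not the paper's smoothing-density sampler; the step size and burn-in length are assumed values.

```python
import numpy as np

rng = np.random.default_rng(1)

# Log of a non-Gaussian target density, up to an additive constant:
# here a standard Laplace (double-exponential) distribution.
def log_target(x):
    return -abs(x)

def metropolis_hastings(n_draws, step=1.0, burn_in=500):
    x = 0.0
    draws = []
    for i in range(n_draws + burn_in):
        cand = x + step * rng.normal()            # random-walk proposal
        # accept with probability min(1, target(cand) / target(x))
        if np.log(rng.uniform()) < log_target(cand) - log_target(x):
            x = cand
        if i >= burn_in:
            draws.append(x)
    return np.array(draws)

draws = metropolis_hastings(20000)
# The standard Laplace distribution has mean 0 and variance 2,
# so the empirical moments of the chain should be close to these.
```

Because the random-walk proposal is symmetric, the Hastings correction term cancels and only the target-density ratio enters the acceptance probability.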
The Elicitation of Probabilities: A Review of the Statistical Literature
, 2005
Abstract

Cited by 3 (0 self)
“We live in an uncertain world, and probability risk assessment deals as directly with that fact as anything we do. Uncertainty arises partly because we are fallible.
Nonlinear and Non-Gaussian State Estimation: A Quasi-Optimal Estimator
, 1998
Abstract

Cited by 1 (1 self)
The rejection sampling filter and smoother, proposed by Tanizaki (1996, 1999), Tanizaki and Mariano (1998) and Hürzeler and Künsch (1998), are computationally very time-consuming. The Markov chain Monte Carlo smoother, developed by Carlin, Polson and Stoffer (1992), Carter and Kohn (1994, 1996) and Geweke and Tanizaki (1999a, 1999b), does not perform well under the root mean square error criterion for some degrees of nonlinearity and non-normality of the system, because of slow convergence of the Gibbs sampler. Taking these problems into account, we propose a nonlinear and non-Gaussian filter and smoother which have a much smaller computational burden and give relatively better state estimates, although the proposed estimator does not yield the optimal state estimates in the sense of the minimum mean square error. The proposed filter and smoother are called the quasi-optimal filter and quasi-optimal smoother in this paper. Finally, through some Monte Carlo studies, the quasi-optimal filter and smoother are compared with the rejection sampling procedure and the Markov chain Monte Carlo procedure.
Estimation of Unknown Parameters in Nonlinear and Non-Gaussian State-Space Models
Abstract

Cited by 1 (1 self)
For the last decade, various simulation-based nonlinear ...
unknown title
, 2002
Abstract
Fiducial pdf as a special case in either the Bayesian or frequentist approach; equivalence to the information-metric “prior”; relevance to parameter estimation.
OVERVIEW OF PRINCIPLES OF STATISTICS
Abstract
A summary of the basic principles of statistics. Both the Bayesian and frequentist points of view are presented. Statistical problems can be grouped into five classes. Point Estimation: find the “best” value for a parameter. Interval Estimation: find a range within which the true value should lie, with a given confidence. Hypothesis Testing: compare two hypotheses and find which one is better supported by the data. Goodness-of-Fit Testing: find how well one hypothesis is supported by the data. Decision Making: make the best decision, based on data. In the frequentist methodology, this separation is especially important, and books on statistics are often organized into chapters with just these titles. The reason for this importance is that the same problem can often be formulated in different ways so that it fits into different classes, but the fundamental question being asked is different in each class, so the resulting solution must be expected to be different. The lesson is: make sure you know what question you want to ask, and then choose the appropriate methods for that question. And be aware that seemingly unimportant differences in the way a problem is posed can make large differences in the answer. The secret to getting the right answer is to understand the question. In the Bayesian methodology, this separation is much less important, and Bayesian treatments tend not to be organized in this way. Bayes’ theorem is the concept which unifies Bayesian inference, since the methods for solving problems in all classes are based on the same theorem.
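The role of Bayes' theorem described in this abstract can be made concrete with a tiny worked example. The two hypotheses and their likelihood values below are assumed for illustration only.

```python
# Bayes' theorem for two competing hypotheses:
#   P(H | data) = P(data | H) * P(H) / P(data)
# Hypothetical prior beliefs and likelihoods (assumed values):
prior = {"H1": 0.5, "H2": 0.5}
likelihood = {"H1": 0.8, "H2": 0.2}   # P(data | H)

# P(data) is the total probability of the data over both hypotheses.
evidence = sum(prior[h] * likelihood[h] for h in prior)

# Posterior probability of each hypothesis after observing the data.
posterior = {h: prior[h] * likelihood[h] / evidence for h in prior}
# posterior == {"H1": 0.8, "H2": 0.2}
```

The same update rule applies whether the problem is framed as hypothesis testing, point estimation, or decision making, which is why the abstract describes Bayes' theorem as the unifying concept.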