Results 1–10 of 15
Bayesian Model Assessment In Factor Analysis
, 2004
Abstract

Cited by 58 (8 self)
Factor analysis has been one of the most powerful and flexible tools for assessment of multivariate dependence and codependence. Loosely speaking, it could be argued that the origin of its success rests in its very exploratory nature, where various kinds of data relationships amongst the variables at study can be iteratively verified and/or refuted. Bayesian inference in factor analytic models has received renewed attention in recent years, partly due to computational advances but also partly to applied focuses generating factor structures as exemplified by recent work in financial time series modeling. The focus of our current work is on exploring questions of uncertainty about the number of latent factors in a multivariate factor model, combined with methodological and computational issues of model specification and model fitting. We explore reversible jump MCMC methods that build on sets of parallel Gibbs sampling-based analyses to generate suitable empirical proposal distributions and that address the challenging problem of finding efficient proposals in high-dimensional models. Alternative MCMC methods based on bridge sampling are discussed, and these fully Bayesian MCMC approaches are compared with a collection of popular model selection methods in empirical studies.
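The fixed-dimension core that such samplers build on can be sketched quite compactly. Below is a minimal Gibbs sampler for a one-factor model with known idiosyncratic variance; all settings (n, psi, lam_true, the N(0, tau) loading prior) are our own illustrative choices, and the paper's reversible jump moves over the number of factors are not reproduced.

```python
# Illustrative Gibbs sampler for a one-factor model x_i = lam * f_i + e_i,
# with f_i ~ N(0, 1) and e_i ~ N(0, psi * I); the number of factors is fixed.
import numpy as np

rng = np.random.default_rng(0)
n, p, psi = 500, 3, 0.25
lam_true = np.array([1.0, 0.8, 0.6])          # illustrative true loadings
f_true = rng.standard_normal(n)
X = np.outer(f_true, lam_true) + np.sqrt(psi) * rng.standard_normal((n, p))

lam = np.ones(p)            # start at a positive loading to fix the sign
tau = 10.0                  # prior variance on each loading (our choice)
draws = []
for it in range(1000):
    # f_i | lam, x_i ~ N(m_i, v), v = 1 / (1 + lam'lam / psi)
    v = 1.0 / (1.0 + lam @ lam / psi)
    m = (X @ lam / psi) * v
    f_s = m + np.sqrt(v) * rng.standard_normal(n)
    # lam_j | f, x_j ~ N(mu_j, s2), s2 = 1 / (1/tau + f'f / psi)
    s2 = 1.0 / (1.0 / tau + f_s @ f_s / psi)
    mu = (X.T @ f_s / psi) * s2
    lam = mu + np.sqrt(s2) * rng.standard_normal(p)
    if it >= 500:           # discard the first half as burn-in
        draws.append(lam.copy())

lam_hat = np.mean(draws, axis=0)
# Compare the sign-invariant quantity lam lam' against the truth.
err = np.abs(np.outer(lam_hat, lam_hat) - np.outer(lam_true, lam_true)).max()
```

Comparing loadings through lam lam' sidesteps the sign indeterminacy that is inherent to factor models.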
Bayesian dynamic factor models and variance matrix discounting for portfolio allocation
 Journal of Business and Economic Statistics
, 2000
Abstract

Cited by 46 (9 self)
We discuss the development of dynamic factor models for multivariate financial time series, and the incorporation of stochastic volatility components for latent factor processes. Bayesian inference and computation is developed and explored in a study of the dynamic factor structure of daily spot exchange rates for a selection of international currencies. The models are direct generalisations of univariate stochastic volatility models, and represent specific varieties of models recently discussed in the growing multivariate stochastic volatility literature. We also discuss connections and comparisons with the much simpler method of dynamic variance discounting that, for over a decade, has been a standard approach in applied financial econometrics in the Bayesian forecasting world. We review empirical findings in applying these models to the exchange rate series, including aspects of model performance in dynamic portfolio allocation. We conclude with comments on the potential practical utility of structured factor models and future potential developments and model extensions.
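The "much simpler" variance discounting baseline mentioned here amounts to a discounted recursive covariance estimate. The sketch below is our own rendering of that idea, with an assumed identity starting value and discount factor; it is not the paper's full dynamic factor model.

```python
# Minimal sketch of variance matrix discounting: a recursive covariance
# estimate S_t = (beta * n_{t-1} * S_{t-1} + x_t x_t') / n_t, where the
# effective sample size evolves as n_t = beta * n_{t-1} + 1.
import numpy as np

def discounted_covariance(returns, beta=0.95):
    """One pass over T x p return vectors; returns the final p x p estimate."""
    T, p = returns.shape
    S = np.eye(p)          # starting estimate (an assumption of this sketch)
    n = 1.0                # effective sample size
    for x in returns:
        n = beta * n + 1.0
        S = S + (np.outer(x, x) - S) / n   # equivalent recursive form
    return S

rng = np.random.default_rng(1)
true_cov = np.array([[1.0, 0.5], [0.5, 2.0]])
R = rng.multivariate_normal(np.zeros(2), true_cov, size=2000)
S = discounted_covariance(R, beta=0.99)
```

With beta = 0.99 the effective window is roughly 100 observations, so the estimate tracks slowly drifting volatility while remaining symmetric and positive definite.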
Bayesian Estimation and Testing of Structural Equation Models
 Psychometrika
, 1999
Abstract

Cited by 27 (8 self)
The Gibbs sampler can be used to obtain samples of arbitrary size from the posterior distribution over the parameters of a structural equation model (SEM) given covariance data and a prior distribution over the parameters. Point estimates, standard deviations and interval estimates for the parameters can be computed from these samples. If the prior distribution over the parameters is uninformative, the posterior is proportional to the likelihood, and asymptotically the inferences based on the Gibbs sample are the same as those based on the maximum likelihood solution, e.g., output from LISREL or EQS. In small samples, however, the likelihood surface is not Gaussian and in some cases contains local maxima. Nevertheless, the Gibbs sample comes from the correct posterior distribution over the parameters regardless of the sample size and the shape of the likelihood surface. With an informative prior distribution over the parameters, the posterior can be used to make inferences about the parameters of underidentified models, as we illustrate on a simple errors-in-variables model.
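The errors-in-variables illustration mentioned at the end admits a compact Gibbs sampler. The sketch below is our own toy version: the measurement-error variance theta_x is held fixed, standing in for the informative prior that identifies the model, and all priors and settings are our choices rather than the paper's.

```python
# Gibbs sampler for a tiny errors-in-variables model:
#   x_i = xi_i + delta_i,   y_i = beta * xi_i + eps_i,
# with xi_i ~ N(0, phi), delta_i ~ N(0, theta_x), eps_i ~ N(0, theta_y).
import numpy as np

rng = np.random.default_rng(2)
n, beta_true, phi_true, theta_x, theta_y_true = 400, 1.5, 1.0, 0.25, 0.5
xi = np.sqrt(phi_true) * rng.standard_normal(n)
x = xi + np.sqrt(theta_x) * rng.standard_normal(n)
y = beta_true * xi + np.sqrt(theta_y_true) * rng.standard_normal(n)

beta, phi, theta_y = 0.0, 1.0, 1.0
draws = []
for it in range(2000):
    # xi_i | rest: normal, combining the N(0, phi) prior with both likelihoods
    prec = 1.0 / phi + 1.0 / theta_x + beta ** 2 / theta_y
    mean = (x / theta_x + beta * y / theta_y) / prec
    xi_s = mean + rng.standard_normal(n) / np.sqrt(prec)
    # beta | rest: normal with a vague N(0, 100) prior
    b_prec = 0.01 + xi_s @ xi_s / theta_y
    b_mean = (xi_s @ y / theta_y) / b_prec
    beta = b_mean + rng.standard_normal() / np.sqrt(b_prec)
    # variances | rest: inverse-gamma updates from weak IG(2, 1) priors
    theta_y = 1.0 / rng.gamma(2 + n / 2,
                              1.0 / (1 + 0.5 * np.sum((y - beta * xi_s) ** 2)))
    phi = 1.0 / rng.gamma(2 + n / 2, 1.0 / (1 + 0.5 * xi_s @ xi_s))
    if it >= 1000:
        draws.append(beta)

beta_hat = float(np.mean(draws))
```

Note that a naive regression of y on x would attenuate beta towards beta * phi / (phi + theta_x) = 1.2 here; knowing theta_x lets the sampler recover the unattenuated value.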
A Bayesian Approach To Source Separation
 in Proceedings of The Nineteenth International Conference on Maximum Entropy and Bayesian Methods
, 1999
Abstract

Cited by 19 (5 self)
Source separation is one of signal processing's main emerging domains. Many techniques, such as maximum likelihood (ML), Infomax, cumulant matching, and estimating functions, have been used to address this difficult problem. Unfortunately, up to now, many of these methods could not account completely for noise on the data, for differing numbers of sources and sensors, for lack of spatial independence, or for time correlation of the sources. Recently, the Bayesian approach has been used to push these limitations of the conventional methods further. This paper proposes a unifying approach to source separation based on Bayesian estimation. We first show that this approach makes it possible to explain the major known source-separation techniques as special cases. Then we propose new methods based on maximum a posteriori (MAP) estimation, either to estimate the sources directly, or the mixing matrices, or even both. Keywords: source separation, Bayesian estimation
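In the simplest special case of the MAP framing described here, with a known mixing matrix and Gaussian noise and source priors, the source estimate has a closed ridge/Wiener form. The sketch below shows only that case, with dimensions and variances chosen by us; estimating the mixing matrix as well, as the paper proposes, requires an iterative scheme not shown.

```python
# MAP source estimate for the linear mixing model X = A S + noise, with a
# known mixing matrix A, noise N(0, sigma2 I) and source prior N(0, tau2 I):
#   S_hat = (A'A / sigma2 + I / tau2)^{-1} A' X / sigma2.
import numpy as np

rng = np.random.default_rng(3)
m_sensors, k_sources, T = 4, 2, 200
A = rng.standard_normal((m_sensors, k_sources))
S = rng.standard_normal((k_sources, T))
sigma2, tau2 = 1e-3, 1.0
X = A @ S + np.sqrt(sigma2) * rng.standard_normal((m_sensors, T))

def map_sources(X, A, sigma2, tau2):
    """MAP estimate of all source columns at once (the prior acts as ridge)."""
    G = A.T @ A / sigma2 + np.eye(A.shape[1]) / tau2
    return np.linalg.solve(G, A.T @ X / sigma2)

S_hat = map_sources(X, A, sigma2, tau2)
```

As sigma2 shrinks the estimate approaches the least-squares unmixing solution; the Gaussian prior contributes the regularising I / tau2 term.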
Correlated Bayesian Factor Analysis
, 1998
Abstract

Cited by 16 (7 self)
Factor analysis is a method in multivariate statistical analysis that can help scientists determine which variables to study in a field and their relationships. We extend the Bayesian approach to factor analysis developed in 1989 by Press and Shigemasu (henceforth PS89) and revised in 1997 to model correlated observation vectors, factor score vectors, and factor loadings. Further, we place a prior distribution on the number of factors and obtain posterior estimates.
Gibbs Sampling and Hill Climbing in Bayesian Factor Analysis
, 1998
Abstract

Cited by 13 (11 self)
Press and Shigemasu (1989) proposed a Bayesian factor analysis model. Factor scores, factor loadings, and disturbance variances and covariances were estimated in closed form using a large sample approximation for one of the terms in the posterior distribution. This paper shows that by using Gibbs sampling or Lindley/Smith optimization approaches to estimation instead of the large sample approximation, both of which are possible in this model, we can obtain improved point estimators in small samples.
Bayesian analysis of mixtures of factor analyzers
 Neural Computation
, 2001
Abstract

Cited by 12 (1 self)
For Bayesian inference on the mixture of factor analyzers (MFA), natural conjugate priors on the parameters are introduced and then a Gibbs sampler that generates parameter samples following the posterior is constructed. In addition, a deterministic estimation algorithm is derived by taking modes instead of samples from the conditional posteriors used in the Gibbs sampler. This is regarded as a maximum a posteriori (MAP) estimation algorithm with hyperparameter search. The behaviors of the Gibbs sampler and the deterministic algorithm are compared on a simulation experiment.
MML and Bayesianism: Similarities and Differences (Introduction to Minimum Encoding Inference, Part II)
, 1994
Abstract

Cited by 6 (0 self)
This paper continues the introduction to minimum encoding inference given by Oliver and Hand. This series of papers was written with the objective of providing an introduction to this area for statisticians. We examine the relationship between Bayesianism and Minimum Message Length (MML) inference. We argue that MML augments Bayesian methods by providing a sound Bayesian method for point estimation that is invariant under nonlinear transformations. We explore the issues of invariance of estimators under nonlinear transformations, the role of the Fisher Information matrix in MML inference, and the apparent similarity between MML and the adoption of a Jeffreys' prior. We then compare MML to an approximate method of Bayesian model class selection. Despite apparent similarities in their expressions, the properties of the two approaches can be different.
On Estimating The Mean In Bayesian Factor Analysis
 Social Science Working Paper 1096, Division of Humanities and Social Sciences, Caltech, Pasadena, CA 91125
, 2000
Abstract

Cited by 3 (3 self)
In the Bayesian factor analysis model (Press & Shigemasu, 1989), the sample size was assumed to be large enough to estimate the overall population mean by the sample mean. In this paper, the procedure of estimating the population mean by the sample mean is compared to estimating it along with the other parameters both by Gibbs sampling and by iterated conditional modes. Results show that even in small samples, the Gibbs sampling and iterated conditional modes estimates of the mean are for practical purposes identical to the sample mean. Thus, the population mean is adequately estimated by its sample value.
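The intuition behind this conclusion is visible already in the conjugate special case (not the paper's full factor model): under a flat prior with known variance, the posterior for the mean is N(xbar, sigma2 / n), so posterior draws centre on the sample mean even for small n. All numbers below are our own.

```python
# Posterior for a normal mean under a flat prior, known sigma: N(xbar, s2/n).
# Monte Carlo draws of mu centre on the sample mean even with n = 15.
import numpy as np

rng = np.random.default_rng(4)
n, sigma = 15, 1.0                       # deliberately small sample
x = 2.0 + sigma * rng.standard_normal(n)
xbar = x.mean()
mu_draws = xbar + (sigma / np.sqrt(n)) * rng.standard_normal(100_000)
post_mean = mu_draws.mean()
```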
Wavelet Domain Image Separation
, 2002
Abstract

Cited by 3 (3 self)
In this paper, we consider the problem of blind signal and image separation using a sparse representation of the images in the wavelet domain. We consider the problem in a Bayesian estimation framework, using the fact that the distribution of the wavelet coefficients of real-world images can naturally be modelled by an exponential power probability density function. The Bayesian approach, which has been used with success in blind source separation, also gives the possibility of including any prior information we may have on the mixing matrix elements as well as on the hyperparameters (parameters of the prior laws of the noise and the sources). To our knowledge, although the Bayesian approach has been used for blind source separation in the time and Fourier domains, it has not yet been used in the wavelet domain. We consider two cases: first, the case where the wavelet coefficients are assumed to be i.i.d., and second, the case where we model the correlation between the coefficients of two adjacent scales by a first-order Markov chain. The estimation computations are done via a Markov chain Monte Carlo (MCMC) procedure. Simulations show the performance of the proposed method.
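A small piece of the wavelet-domain idea can be shown self-contained: in the i.i.d.-coefficient case, a Laplacian prior (exponential power with shape 1) makes the MAP coefficient estimate a soft threshold. The sketch below uses a one-level Haar transform written by hand so no wavelet library is needed; the signal, noise level, and threshold are our illustrative choices, and the paper's MCMC over mixing matrix and hyperparameters is not reproduced.

```python
# MAP denoising of wavelet coefficients under a Laplacian prior:
# one-level Haar transform, soft-threshold the details, invert.
import numpy as np

def haar_forward(x):
    """One-level Haar transform of an even-length signal."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    return a, d

def haar_inverse(a, d):
    """Exact inverse of haar_forward."""
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def soft_threshold(d, t):
    """MAP estimate under a Laplacian prior: shrink towards zero by t."""
    return np.sign(d) * np.maximum(np.abs(d) - t, 0.0)

rng = np.random.default_rng(5)
clean = np.repeat([0.0, 4.0, 0.0], 32)            # piecewise-constant signal
noisy = clean + 0.3 * rng.standard_normal(96)
a, d = haar_forward(noisy)
den = haar_inverse(a, soft_threshold(d, 0.3))
```

Because the clean signal is piecewise constant, its detail coefficients are sparse, which is exactly the regime where the Laplacian prior helps.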