Results 1–10 of 88
A Unifying Review of Linear Gaussian Models
, 1999
Cited by 348 (18 self)
Abstract:
Factor analysis, principal component analysis, mixtures of gaussian clusters, vector quantization, Kalman filter models, and hidden Markov models can all be unified as variations of unsupervised learning under a single basic generative model. This is achieved by collecting together disparate observations and derivations made by many previous authors and introducing a new way of linking discrete and continuous state models using a simple nonlinearity. Through the use of other nonlinearities, we show how independent component analysis is also a variation of the same basic generative model. We show that factor analysis and mixtures of gaussians can be implemented in autoencoder neural networks and learned using squared error plus the same regularization term. We introduce a new model for static data, known as sensible principal component analysis, as well as a novel concept of spatially adaptive observation noise. We also review some of the literature involving global and local mixtures of the basic models and provide pseudocode for inference and learning for all the basic models.
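The shared generative model the abstract refers to (a Gaussian latent state mapped linearly into the observations, plus Gaussian noise) can be illustrated with a small simulation. This is our own hedged sketch, not code from the paper; all dimensions and parameter values are illustrative.

```python
import numpy as np

# Basic linear Gaussian generative model: x = C z + v, with
# latent z ~ N(0, I) and observation noise v ~ N(0, R).
# A diagonal R gives factor analysis; other choices of C, R and
# the state prior recover PCA, mixtures, Kalman filters, etc.
rng = np.random.default_rng(0)
p, m, n = 5, 2, 1000                         # observed dim, latent dim, samples

C = rng.normal(size=(p, m))                  # factor loadings (illustrative)
R = np.diag(rng.uniform(0.1, 0.5, size=p))   # diagonal noise -> factor analysis

z = rng.normal(size=(n, m))                  # latent factors
x = z @ C.T + rng.multivariate_normal(np.zeros(p), R, size=n)

# Implied marginal covariance of x is C C^T + R; the sample
# covariance should approach it as n grows.
model_cov = C @ C.T + R
sample_cov = np.cov(x, rowvar=False)
print(np.abs(model_cov - sample_cov).max())  # shrinks toward 0 as n grows
```

The point of the sketch is that each named method corresponds to a constraint on `C`, `R`, or the latent prior, not to a different learning principle.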
Bayesian Model Assessment In Factor Analysis
, 2004
Cited by 103 (10 self)
Abstract:
Factor analysis has been one of the most powerful and flexible tools for assessment of multivariate dependence and codependence. Loosely speaking, it could be argued that the origin of its success rests in its very exploratory nature, where various kinds of data relationships amongst the variables at study can be iteratively verified and/or refuted. Bayesian inference in factor analytic models has received renewed attention in recent years, partly due to computational advances but also partly to applied focuses generating factor structures, as exemplified by recent work in financial time series modeling. The focus of our current work is on exploring questions of uncertainty about the number of latent factors in a multivariate factor model, combined with methodological and computational issues of model specification and model fitting. We explore reversible jump MCMC methods that build on sets of parallel Gibbs sampling-based analyses to generate suitable empirical proposal distributions and that address the challenging problem of finding efficient proposals in high-dimensional models. Alternative MCMC methods based on bridge sampling are discussed, and these fully Bayesian MCMC approaches are compared with a collection of popular model selection methods in empirical studies.
Recent developments in the factor analysis of categorical variables
 Journal of Educational Statistics
, 1986
Cited by 55 (0 self)
Abstract:
Despite known shortcomings of the procedure, exploratory factor analysis of dichotomous test items has been limited, until recently, to unweighted analyses of matrices of tetrachoric correlations. Superior methods have begun to appear in the literature, in professional symposia, and in computer programs. This paper places these developments in a unified framework, from a review of the classical common factor model for measured variables through generalized least squares and marginal maximum likelihood solutions for dichotomous data. Further extensions of the model are also reported as work in progress. Under classical Thurstonian factor analysis (Thurstone, 1947), values of p measured variables are modeled as linear functions of some smaller number m of continuous latent variables, the "factors" that account for the correlations among the observed variables. The usual objectives in factor analysis are (a) to determine the number of factors that provide a satisfactory fit to the observed correlation matrix and (b) to estimate the regression coefficients of the observed variables on the factors, all this, it is hoped, leading to a more …
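The classical model the abstract describes can be written out explicitly. This is a hedged reconstruction in standard factor-analysis notation (symbols ours, not taken from the paper):

```latex
% p observed variables as linear functions of m < p common factors,
% plus unique errors \varepsilon_j uncorrelated with the factors:
x_j \;=\; \sum_{k=1}^{m} \lambda_{jk}\, f_k + \varepsilon_j,
\qquad j = 1, \dots, p,
% or in matrix form, with loading matrix \Lambda (p x m),
% factor covariance \Phi, and diagonal unique variances \Psi:
\mathbf{x} \;=\; \Lambda \mathbf{f} + \boldsymbol{\varepsilon},
\qquad
\operatorname{Cov}(\mathbf{x}) \;=\; \Lambda \Phi \Lambda^{\top} + \Psi .
```

Objective (a) in the abstract amounts to choosing m so that $\Lambda \Phi \Lambda^{\top} + \Psi$ fits the observed correlation matrix; objective (b) is estimating the $\lambda_{jk}$.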
International stock return comovements
 Journal of Finance
, 2009
Cited by 40 (5 self)
Abstract:
We examine international stock return comovements using country-industry and country-style portfolios as the base portfolios. We first establish that parsimonious risk-based factor models capture the covariance structure of the data better than the popular Heston-Rouwenhorst (1994) model. We then establish the following stylized facts regarding stock return comovements. First, we do not find evidence for an upward trend in return correlations, except for the European stock markets. Second, the increasing importance of industry factors relative to country factors was a short-lived, temporary phenomenon. Third, we find that large growth stocks are more correlated across countries than are small value stocks, and that the difference has increased over time. JEL Classification: C52, G11, G12.
Fitting vast dimensional time-varying covariance models, Oxford Financial Research Centre, Financial Economics Working Paper n
, 2008
Cited by 37 (5 self)
Abstract:
Building models for high dimensional portfolios is important in risk management and asset allocation. Here we propose a novel and fast way of estimating models of time-varying covariances that overcome an undiagnosed incidental parameter problem which has troubled existing methods when applied to hundreds or even thousands of assets. Indeed we can handle the case where the cross-sectional dimension is larger than the time series one. The theory of this new strategy is developed in some detail, allowing formal hypothesis testing to be carried out on these models. Simulations are used to explore the performance of this inference strategy, while empirical examples are reported which show the strength of this method. The out-of-sample hedging performance of various models estimated using this method is compared.
Continuous latent variable models for dimensionality reduction and sequential data reconstruction
, 2001
Extracting factors from heteroskedastic asset returns
 Journal of Financial Economics
, 2001
Cited by 28 (0 self)
Abstract:
This paper proposes an alternative to the asymptotic principal components procedure of Connor and Korajczyk (J. Financial Econom. 15 (1986) 373) that is robust to time series heteroskedasticity in the factor model residuals. The new method is simple to use and requires no assumptions stronger than those made by Connor and Korajczyk. It is demonstrated through simulations and analysis of actual stock market data that allowing heteroskedasticity sometimes improves the quality of the extracted factors quite dramatically. Over the period from 1989 to 1993, for example, a single factor extracted using the Connor and Korajczyk method explains only 8.2% of the variation of the CRSP value-weighted index, while the factor extracted allowing heteroskedasticity explains 57.3%. Accounting for heteroskedasticity is also important for tests of the APT, with p-values sometimes depending strongly on the factor extraction method.
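The asymptotic principal components baseline this paper modifies can be sketched in a few lines: with returns X (T periods by N assets) and N large, the factor realizations are recovered from the top eigenvectors of the T-by-T cross-product matrix. This is our own illustrative simulation of the unweighted baseline, not the paper's heteroskedasticity-robust variant.

```python
import numpy as np

# Asymptotic principal components (Connor-Korajczyk style baseline):
# eigen-decompose X X'/N rather than the N x N covariance, so the
# problem stays tractable when N >> T.
rng = np.random.default_rng(2)
T, N, k = 60, 500, 1

f = rng.normal(size=(T, k))                  # true factor path
beta = rng.normal(size=(N, k))               # asset loadings
X = f @ beta.T + rng.normal(size=(T, N))     # returns with idiosyncratic noise

omega = X @ X.T / N                          # T x T cross-product matrix
vals, vecs = np.linalg.eigh(omega)           # eigenvalues in ascending order
f_hat = vecs[:, -k:]                         # top eigenvector(s) = factor estimate

# Recovered up to sign and scale; correlation with the truth is high.
corr = np.corrcoef(f_hat[:, 0], f[:, 0])[0, 1]
print(abs(corr))
```

The paper's contribution, per the abstract, is to reweight this procedure so that time series heteroskedasticity in the residuals does not distort the extracted factors.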
Factor analysis using delta-rule wake-sleep learning
 Neural Computation
, 1997
Cited by 26 (4 self)
Abstract:
We describe a linear network that models correlations between real-valued visible variables using one or more real-valued hidden variables — a factor analysis model. This model can be seen as a linear version of the “Helmholtz machine”, and its parameters can be learned using the “wake-sleep” method, in which learning of the primary “generative” model is assisted by a “recognition” model, whose role is to fill in the values of hidden variables based on the values of visible variables. The generative and recognition models are jointly learned in “wake” and “sleep” phases, using just the delta rule. This learning procedure is comparable in simplicity to Oja’s version of Hebbian learning, which produces a somewhat different representation of correlations in terms of principal components. We argue that the simplicity of wake-sleep learning makes factor analysis a plausible alternative to Hebbian learning as a model of activity-dependent cortical plasticity.
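A minimal toy version of the wake-sleep delta-rule scheme the abstract describes, for a single hidden factor. The variable names, noise levels, and learning rate here are our own illustrative choices, not the paper's implementation.

```python
import numpy as np

# One-factor linear model: generative weights g map a hidden cause y
# to visibles x; recognition weights r map x back to an estimate of y.
# Wake phase trains g on real data using y filled in by recognition;
# sleep phase trains r on "fantasy" data drawn from the generative
# model. Both updates are plain delta rules.
rng = np.random.default_rng(1)
p, n_steps, lr = 4, 20000, 0.01

true_g = np.array([2.0, 1.0, -1.0, 0.5])    # true loading direction
g = np.zeros(p)                              # generative weights
r = np.zeros(p)                              # recognition weights

for _ in range(n_steps):
    # Wake: observe a real x, infer y with the recognition model,
    # delta-rule update of the generative weights.
    x = rng.normal() * true_g + 0.1 * rng.normal(size=p)
    y = r @ x + rng.normal(scale=0.1)        # noisy recognition sample
    g += lr * (x - g * y) * y                # delta rule on g

    # Sleep: fantasize (y, x) from the generative model,
    # delta-rule update of the recognition weights.
    y_f = rng.normal()                       # hidden cause from its prior
    x_f = g * y_f + 0.1 * rng.normal(size=p)
    r += lr * (y_f - r @ x_f) * x_f          # delta rule on r

# g should align (up to sign and scale) with the true loading direction.
cos = abs(g @ true_g) / (np.linalg.norm(g) * np.linalg.norm(true_g) + 1e-12)
print(cos)
```

Note how neither update needs anything beyond locally available pre- and post-synaptic quantities, which is the basis of the abstract's comparison with Hebbian learning.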
Panel Data Models with Multiple Time-Varying Individual Effects
 Journal of Productivity Analysis
, 2007
Cited by 24 (2 self)
Abstract:
This paper considers a panel data model with time-varying individual effects. The data are assumed to contain a large number of cross-sectional units repeatedly observed over a fixed number of time periods. The model has a feature of the fixed-effects model in that the effects are assumed to be correlated with the regressors. The unobservable individual effects are assumed to have a factor structure. For consistent estimation of the model, it is important to estimate the true number of factors. We propose a generalized method of moments procedure by which both the number of factors and the regression coefficients can be consistently estimated. Some important identification issues are also discussed. Our simulation results indicate that the proposed methods produce reliable estimates.
Analysis of covariance structures under elliptical distributions
 Journal of the American Statistical Association
, 1987
Cited by 22 (0 self)
Abstract:
This article examines the adjustment of normal theory methods for the analysis of covariance structures to make them applicable under the class of elliptical distributions. It is shown that if the model satisfies a mild scale invariance condition and the data have an elliptical distribution, the asymptotic covariance matrix of sample covariances has a structure that results in the retention of many of the asymptotic properties of normal theory methods. If a scale adjustment is applied, the likelihood ratio tests of fit have the usual asymptotic chi-squared distributions. Difference tests retain their property of asymptotic independence, and maximum likelihood estimators retain their relative asymptotic efficiency within the class of estimators based on the sample covariance matrix. An adjustment to the asymptotic covariance matrix of normal theory maximum likelihood estimators for elliptical distributions is provided. This adjustment is particularly simple in models for patterned covariance or correlation matrices. These results apply not only to normal theory maximum likelihood methods but also to a class of minimum discrepancy methods. Similar results also apply when certain robust estimators of the covariance matrix are employed.
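One common form of the scale adjustment mentioned rescales the normal-theory test statistic by an estimate of relative multivariate kurtosis. This is our hedged reconstruction of a Browne-type kurtosis correction in standard notation, not a formula copied from the article:

```latex
% Relative multivariate kurtosis estimated from the data
% (Mardia-type statistic; equals 1 in expectation under normality):
\hat{\kappa} \;=\; \frac{1}{n\,p(p+2)} \sum_{i=1}^{n}
  \bigl[ (\mathbf{x}_i - \bar{\mathbf{x}})^{\top} S^{-1}
         (\mathbf{x}_i - \bar{\mathbf{x}}) \bigr]^{2},
% Rescaled likelihood ratio statistic, asymptotically chi-squared
% under elliptical data when the model is scale invariant:
T_{\mathrm{adj}} \;=\; T_{\mathrm{ML}} / \hat{\kappa}
\;\xrightarrow{d}\; \chi^{2}_{df}.
```

Under normality $\hat{\kappa} \approx 1$ and the adjustment is inert, which is why the normal-theory properties are "retained" rather than replaced.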