Results 1–10 of 117
Independent Component Analysis
Neural Computing Surveys, 2001
Abstract

Cited by 1492 (93 self)
A common problem encountered in such disciplines as statistics, data analysis, signal processing, and neural network research is finding a suitable representation of multivariate data. For computational and conceptual simplicity, such a representation is often sought as a linear transformation of the original data. Well-known linear transformation methods include, for example, principal component analysis, factor analysis, and projection pursuit. A recently developed linear transformation method is independent component analysis (ICA), in which the desired representation is the one that minimizes the statistical dependence of the components of the representation. Such a representation seems to capture the essential structure of the data in many applications. In this paper, we survey the existing theory and methods for ICA.
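The survey's central distinction — decorrelation is not independence — can be illustrated with a small sketch. The sources, mixing matrix, and dependence statistic below are illustrative choices, not taken from the paper: PCA whitening makes the components uncorrelated, yet a fourth-order dependence measure stays far from zero, and that residual structure is exactly what ICA minimizes.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50000
# Two independent non-Gaussian sources: uniform (sub-Gaussian) and
# Laplacian (super-Gaussian), mixed by an arbitrary illustrative matrix A.
S = np.vstack([rng.uniform(-1, 1, n), rng.laplace(0.0, 1.0, n)])
A = np.array([[1.0, 0.5], [0.5, 1.0]])
X = A @ S
X = X - X.mean(axis=1, keepdims=True)

# PCA whitening: afterwards the components are uncorrelated (cov = I) ...
vals, vecs = np.linalg.eigh(np.cov(X))
Z = np.diag(vals ** -0.5) @ vecs.T @ X
print(np.allclose(np.cov(Z), np.eye(2)))  # True: second-order "independence"

# ... but a fourth-order dependence statistic remains clearly nonzero,
# which is the higher-order structure ICA targets.
dep = np.mean(Z[0] ** 2 * Z[1] ** 2) - np.mean(Z[0] ** 2) * np.mean(Z[1] ** 2)
print(abs(dep) > 0.01)
```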
Fast and robust fixed-point algorithms for independent component analysis
IEEE Trans. Neural Netw., 1999
Abstract

Cited by 511 (34 self)
Independent component analysis (ICA) is a statistical method for transforming an observed multidimensional random vector into components that are statistically as independent from each other as possible. In this paper, we use a combination of two different approaches for linear ICA: Comon’s information-theoretic approach and the projection pursuit approach. Using maximum entropy approximations of differential entropy, we introduce a family of new contrast (objective) functions for ICA. These contrast functions enable both the estimation of the whole decomposition by minimizing mutual information, and estimation of individual independent components as projection pursuit directions. The statistical properties of the estimators based on such contrast functions are analyzed under the assumption of the linear mixture model, and it is shown how to choose contrast functions that are robust and/or of minimum variance. Finally, we introduce simple fixed-point algorithms for practical optimization of the contrast functions. These algorithms optimize the contrast functions very quickly and reliably.
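The one-unit fixed-point iteration this paper introduces (w ← E[z g(wᵀz)] − E[g′(wᵀz)] w with renormalization) can be sketched in a few lines. The tanh contrast is one of the paper's contrast functions, but the toy sources and mixing matrix below are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20000
S = np.vstack([rng.laplace(size=n), rng.uniform(-1, 1, n)])  # independent sources
X = np.array([[2.0, 1.0], [1.0, 2.0]]) @ S                   # observed mixtures

# Whiten: the fixed-point iteration assumes zero-mean, unit-covariance data.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = np.diag(d ** -0.5) @ E.T @ X

# One-unit FastICA iteration with the tanh contrast (g = tanh, g' = 1 - tanh^2).
w = rng.standard_normal(2)
w /= np.linalg.norm(w)
for _ in range(200):
    wz = w @ Z
    w_new = (Z * np.tanh(wz)).mean(axis=1) - (1 - np.tanh(wz) ** 2).mean() * w
    w_new /= np.linalg.norm(w_new)
    if abs(abs(w_new @ w) - 1) < 1e-10:  # converged (up to sign flip)
        w = w_new
        break
    w = w_new

# The estimated component should match one source up to sign and scale.
y = w @ Z
corr = max(abs(np.corrcoef(y, S[0])[0, 1]), abs(np.corrcoef(y, S[1])[0, 1]))
print(corr > 0.9)
```

The convergence test uses |wᵀw_new| rather than wᵀw_new because the fixed point is only defined up to sign.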
Objective Function Formulation of the BCM Theory of Visual Cortical Plasticity: Statistical Connections, Stability Conditions
Neural Networks, 1992
Abstract

Cited by 86 (37 self)
In this paper, we present an objective function formulation of the BCM theory of visual cortical plasticity that permits us to demonstrate the connection between the unsupervised BCM learning procedure and various statistical methods, in particular, that of Projection Pursuit. This formulation provides a general method for stability analysis of the fixed points of the theory and enables us to analyze the behavior and the evolution of the network under various visual rearing conditions. It also allows comparison with many existing unsupervised methods. This model has been shown to be successful in various applications, such as phoneme and 3D object recognition. We thus have the striking and possibly highly significant result that a biological neuron is performing a sophisticated statistical procedure.
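The BCM learning rule can be sketched as a minimal simulation, a textbook-style caricature rather than the paper's objective-function formulation: a linear neuron with update Δw ∝ y(y − θ)x and a sliding threshold θ tracking E[y²] becomes selective to one of two orthogonal input patterns. All constants below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
patterns = np.array([[1.0, 0.0], [0.0, 1.0]])  # two orthonormal input patterns
w = np.array([0.3, 0.7])                       # asymmetric initial weights
theta, eta, tau = 0.0, 0.02, 0.05              # threshold, learning rate, averaging

for _ in range(50000):
    x = patterns[rng.integers(2)]   # present a randomly chosen pattern
    y = w @ x                       # linear neuron response
    w += eta * y * (y - theta) * x  # BCM update: dw ~ y(y - theta) x
    theta += tau * (y**2 - theta)   # sliding threshold ~ running E[y^2]

responses = patterns @ w
print(responses.max() > 1.0, responses.min() < 0.5)  # selectivity emerged
```

The selective fixed point has one large response and one near zero, which is the projection-pursuit-like behavior the paper connects to BCM.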
Inverse eigenvalue problems
SIAM Rev., 1998
Abstract

Cited by 41 (6 self)
A collection of inverse eigenvalue problems are identified and classified according to their characteristics. Current developments in both the theoretical and the algorithmic aspects are summarized and reviewed in this paper. This exposition also reveals many open questions that deserve further study. An extensive bibliography of pertinent literature is attached.
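A minimal sketch of the most elementary inverse eigenvalue problem — construct a matrix realizing a prescribed spectrum, with no structural constraints. The structured variants the survey classifies (Jacobi, Toeplitz, nonnegative, ...) constrain the matrix's form and are far harder; this unconstrained case is solvable by a single similarity transform.

```python
import numpy as np

rng = np.random.default_rng(2)
target = np.array([1.0, 3.0, 7.0])  # prescribed spectrum (illustrative values)

# Unconstrained symmetric case: A = Q diag(target) Q^T for any orthogonal Q.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
A = Q @ np.diag(target) @ Q.T

print(np.allclose(np.sort(np.linalg.eigvalsh(A)), target))  # True
```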
Algebraic factor analysis: tetrads, pentads and beyond
Abstract

Cited by 28 (12 self)
Factor analysis refers to a statistical model in which observed variables are conditionally independent given fewer hidden variables, known as factors, and all the random variables follow a multivariate normal distribution. The parameter space of a factor analysis model is a subset of the cone of positive definite matrices. This parameter space is studied from the perspective of computational algebraic geometry. Gröbner bases and resultants are applied to compute the ideal of all polynomial functions that vanish on the parameter space. These polynomials, known as model invariants, arise from rank conditions on a symmetric matrix under elimination of the diagonal entries of the matrix. Besides revealing the geometry of the factor analysis model, the model invariants also furnish useful statistics for testing goodness-of-fit.
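The "tetrad" invariants of the title can be checked numerically: in a one-factor model the off-diagonal covariances are σᵢⱼ = λᵢλⱼ, so σ₁₂σ₃₄ − σ₁₃σ₂₄ vanishes identically. The loadings, unique variances, and comparison matrix below are hypothetical values chosen for illustration.

```python
import numpy as np

lam = np.array([0.9, 0.7, 0.8, 0.6])  # hypothetical factor loadings
psi = np.array([0.2, 0.3, 0.1, 0.4])  # hypothetical unique variances
Sigma = np.outer(lam, lam) + np.diag(psi)  # one-factor covariance matrix

# Tetrad: off-diagonal entries factor as lam_i * lam_j, so it is exactly zero.
t = Sigma[0, 1] * Sigma[2, 3] - Sigma[0, 2] * Sigma[1, 3]
print(abs(t) < 1e-12)  # True: the tetrad is a model invariant

# A generic covariance matrix violates the constraint:
G = np.array([[2.0, 1.0, 0.5, 0.3],
              [1.0, 2.0, 0.7, 0.2],
              [0.5, 0.7, 2.0, 0.9],
              [0.3, 0.2, 0.9, 2.0]])
t2 = G[0, 1] * G[2, 3] - G[0, 2] * G[1, 3]
print(abs(t2) > 0.1)  # True: 1.0*0.9 - 0.5*0.2 = 0.8
```

This vanishing/non-vanishing contrast is what makes tetrads usable as goodness-of-fit statistics.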
Dimensional Model Reduction in Nonlinear Finite Element Dynamics of Solids and Structures
 International Journal for Numerical Methods in Engineering
Abstract

Cited by 23 (5 self)
Timings in this paper are given as wall-clock measurements on a 225 MHz SGI Octane workstation with an R10K processor, 128 MB of memory, 32 KB instruction and data caches, and a 1 MB secondary cache.
Causal discovery via MML
In: Proceedings of the Thirteenth International Conference on Machine Learning, 1996
Abstract

Cited by 20 (10 self)
Automating the learning of causal models from sample data is a key step toward incorporating machine learning into decision-making and reasoning under uncertainty. This paper presents a Bayesian approach to the discovery of causal models, using a Minimum Message Length (MML) method. We have developed encoding and search methods for discovering linear causal models. The initial experimental results presented in this paper show that the MML induction approach can recover, from generated data, causal models that are quite accurate reflections of the original models, and that it compares favorably with TETRAD II (Spirtes et al. 1994) even when TETRAD II is supplied with prior temporal information and MML is not.
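The two-part-code idea behind MML can be caricatured in a few lines: a model is preferred when its parameter description plus the data encoded under it is shorter. The parameter cost below is a crude BIC-style stand-in, not Wallace's actual MML87 assertion length, and the data are synthetic; the sketch only shows why a genuinely dependent model wins the comparison.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 500
x = rng.standard_normal(n)
y = 0.8 * x + 0.5 * rng.standard_normal(n)  # x genuinely influences y

def nll_gauss(resid):
    """Negative log-likelihood (nats) of residuals under a fitted Gaussian."""
    s2 = resid.var()
    return 0.5 * len(resid) * (np.log(2 * np.pi * s2) + 1)

def msg_len(resid, k):
    """Two-part 'message length': ~0.5 log n per parameter + data cost.
    (A crude stand-in for MML's assertion length, for illustration only.)"""
    return 0.5 * k * np.log(n) + nll_gauss(resid)

m_indep = msg_len(y - y.mean(), k=2)              # mean + variance
beta = np.polyfit(x, y, 1)
m_causal = msg_len(y - np.polyval(beta, x), k=3)  # slope + intercept + variance

print(m_causal < m_indep)  # True: the dependent model encodes y more cheaply
```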
Factor Analysis Vs. Fuzzy Sets Theory: Assessing The Influence Of Different Techniques On Sen's Functioning Approach
Center of Economic Studies Discussion Paper, KU Leuven, DPS 01.21, 2001
Abstract

Cited by 20 (0 self)
This paper explores a couple of specific operational interpretations of Sen's approach, assessing the extent to which the results produced by implementing Sen's concepts are influenced by the choice of technique. By means of a survey based on a representative sample of Belgian individuals, seven achieved functionings are identified via each technique and subsequently compared. To structure the information and to facilitate comparisons, standard multivariate analysis is performed, while the subgroup of the most deprived individuals is considered in more detail. In this way, a substantial accordance, though no perfect equivalence, is uncovered in the general patterns of functionings' achievements.
Job search and impatience
Journal of Labor Economics, 2005
Abstract

Cited by 19 (1 self)
Workers who are more impatient search less intensively and set lower reservation wages. The effect of impatience on exit rates from unemployment is therefore unclear. If agents have exponential time preferences, the reservation wage effect dominates for sufficiently patient individuals, so increases in impatience lead to higher exit rates. The opposite is true for agents with hyperbolic time preferences. Using two large longitudinal data sets, we find that impatience measures are negatively correlated with search effort and the unemployment exit rate and are orthogonal to reservation wages. Impatience substantially affects outcomes in the direction predicted by the hyperbolic model.