Results 1 – 10 of 1,017
Clayton copula and mixture decomposition
In ASMDA 2005, 2005
Abstract

Cited by 4 (2 self)
Abstract. A symbolic variable is often described by a histogram. More generally, it can be provided in the form of a continuous distribution. In this case the task is the most frequent one in data mining, namely to classify the objects starting from descriptions of the variables in the form of continuous distributions. A solution is to sample each distribution at a number N of points, to evaluate the joint distribution of these values using copulas, and to adapt the dynamical clustering ("nuées dynamiques") method to these joint densities. In this paper we compare the Clayton copula and the Normal copula in more than 2 dimensions, and we compare clustering results obtained with the Clayton-copula-based method on the one hand and with traditional methods (MCLUST and K-means) on the other. Our comparison is based on 2 well-known classical data sets.
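The Clayton copula compared in this abstract has the closed form C(u1, …, ud) = (∑ u_i^(−θ) − d + 1)^(−1/θ) for θ > 0. A minimal numpy sketch of its CDF and of Marshall–Olkin sampling (our own illustration, not the authors' implementation; function names are ours):

```python
import numpy as np

def clayton_cdf(u, theta):
    """Clayton copula CDF for points u in (0, 1]^d:
    C(u) = (sum_i u_i^(-theta) - d + 1)^(-1/theta), theta > 0."""
    u = np.asarray(u, dtype=float)
    d = u.shape[-1]
    return np.power(np.sum(u ** (-theta), axis=-1) - d + 1.0, -1.0 / theta)

def clayton_sample(n, d, theta, rng=None):
    """Draw n points from a d-dimensional Clayton copula via the
    Marshall-Olkin algorithm: V ~ Gamma(1/theta, 1) and
    U_i = (1 + E_i / V)^(-1/theta) with E_i ~ Exp(1)."""
    rng = np.random.default_rng(rng)
    v = rng.gamma(1.0 / theta, 1.0, size=(n, 1))
    e = rng.exponential(1.0, size=(n, d))
    return (1.0 + e / v) ** (-1.0 / theta)
```

As θ → 0 the copula approaches independence (C(u) → ∏ u_i), and larger θ gives stronger lower-tail dependence, which is what distinguishes it from the Normal copula in the comparison above.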
Probabilistic Latent Semantic Analysis
In Proc. of Uncertainty in Artificial Intelligence, UAI’99, 1999
Abstract

Cited by 771 (9 self)
Probabilistic Latent Semantic Analysis is a novel statistical technique for the analysis of two-mode and co-occurrence data, which has applications in information retrieval and filtering, natural language processing, machine learning from text, and related areas. Compared to standard Latent Semantic Analysis, which stems from linear algebra and performs a Singular Value Decomposition of co-occurrence tables, the proposed method is based on a mixture decomposition derived from a latent class model. This results in a more principled approach which has a solid foundation in statistics.
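The latent class (aspect) model behind PLSA is P(d, w) = ∑_z P(z) P(d|z) P(w|z), fit by EM on a document–word count matrix. A compact numpy sketch (our own minimal version under our own names, not Hofmann's implementation):

```python
import numpy as np

def plsa(counts, k, n_iter=100, rng=0):
    """Fit the PLSA aspect model P(d, w) = sum_z P(z) P(d|z) P(w|z)
    to a (n_docs x n_words) count matrix by EM.  Returns the topic
    prior pz and the conditional tables pd_z, pw_z (rows sum to 1)."""
    rng = np.random.default_rng(rng)
    n_d, n_w = counts.shape
    pz = np.full(k, 1.0 / k)
    pd_z = rng.random((k, n_d)) + 0.1
    pd_z /= pd_z.sum(axis=1, keepdims=True)
    pw_z = rng.random((k, n_w)) + 0.1
    pw_z /= pw_z.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        # E-step: responsibilities P(z|d,w), shape (k, n_d, n_w).
        joint = pz[:, None, None] * pd_z[:, :, None] * pw_z[:, None, :]
        post = joint / joint.sum(axis=0, keepdims=True)
        # M-step: re-estimate from expected counts
        # (tiny smoothing guards against empty components).
        ec = counts[None, :, :] * post + 1e-12
        pd_z = ec.sum(axis=2)
        pd_z /= pd_z.sum(axis=1, keepdims=True)
        pw_z = ec.sum(axis=1)
        pw_z /= pw_z.sum(axis=1, keepdims=True)
        pz = ec.sum(axis=(1, 2))
        pz /= pz.sum()
    return pz, pd_z, pw_z
```

This is the sense in which PLSA is a mixture decomposition rather than an SVD: each co-occurrence cell is explained by a convex combination of latent classes instead of a low-rank linear projection.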
Visual Comparison of Datasets using Mixture Decompositions
Abstract
We describe how a mixture of two densities f0 and f1 may be decomposed into a different mixture consisting of three densities. These new densities, f+, f−, and f=, summarize differences between f0 and f1: f+ is high in areas of excess of f1 compared to f0; f− represents deficiency of f1 compared to f0 in the same way; f= represents commonality between f1 and f0. The supports of f+ and f− are disjoint. This decomposition of the mixture of f0 and f1 is similar to the set-theoretic decomposition of the union of two sets A and B into the disjoint sets A\B, B\A, and A ∩ B.
Visual Comparison of Datasets Using Mixture Decompositions
Abstract
This article describes how a mixture of two densities, f0 and f1, may be decomposed into a different mixture consisting of three densities. These new densities, f+, f−, and f=, summarize differences between f0 and f1: f+ is high in areas of excess of f1 compared to f0; f− represents deficiency of f1 compared to f0 in the same way; f= represents commonality between f1 and f0. The supports of f+ and f− are disjoint. This decomposition of the mixture of f0 and f1 is similar to the set-theoretic decomposition of the union of two sets A and B into the disjoint sets A\B, B\A, and A ∩ B.
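On a discretized grid the three-part decomposition in this abstract can be computed pointwise. A small numpy sketch (our own construction of the stated properties, not the authors' code; the example densities and grid are ours): components proportional to (f1 − f0)₊, (f0 − f1)₊, and min(f0, f1) rebuild the equal-weight mixture and give f+ and f− disjoint supports.

```python
import numpy as np

# Two example densities: standard-width normals centered at 0 and 2.
x = np.linspace(-6.0, 6.0, 2001)
dx = x[1] - x[0]
f0 = np.exp(-0.5 * x**2) / np.sqrt(2.0 * np.pi)          # N(0, 1)
f1 = np.exp(-0.5 * (x - 2.0)**2) / np.sqrt(2.0 * np.pi)  # N(2, 1)

excess = np.maximum(f1 - f0, 0.0)   # unnormalized f+: excess of f1 over f0
deficit = np.maximum(f0 - f1, 0.0)  # unnormalized f-: deficiency of f1
common = np.minimum(f0, f1)         # unnormalized f=: commonality

# Mixture weights are the masses of the three parts; they sum to 1,
# and the parts rebuild the mixture pointwise:
#   0.5*(f0 + f1) = 0.5*excess + 0.5*deficit + common.
a_plus = 0.5 * excess.sum() * dx
a_minus = 0.5 * deficit.sum() * dx
a_eq = common.sum() * dx
recon = 0.5 * excess + 0.5 * deficit + common
```

Normalizing each part by its mass yields the densities f+, f−, and f=. By construction `excess` and `deficit` are never simultaneously nonzero, which is the disjoint-support claim, and `a_eq` is the usual overlap coefficient ∫ min(f0, f1).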
Spectral Mixture Decomposition by Least Dependent Component Analysis
2005
Abstract
A recently proposed mutual-information-based algorithm for decomposing data into least dependent components (MILCA) is applied to spectral analysis, namely to blind recovery of concentrations and pure spectra from their linear mixtures. The algorithm is based on precise estimates of mutual information. In combination with second-derivative preprocessing and alternating-least-squares postprocessing, MILCA shows decomposition performance comparable with or superior to specialized chemometrics algorithms. The results are illustrated on a number of simulated and experimental (infrared and Raman) mixture problems.
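MILCA itself relies on mutual-information estimates, but the alternating-least-squares postprocessing step mentioned in this abstract can be sketched independently (our own minimal version with hypothetical names, not the authors' code): alternately solve for nonnegative concentrations and pure spectra by least squares.

```python
import numpy as np

def als_unmix(M, k, n_iter=300, rng=0):
    """Alternating least squares for linear spectral unmixing: factor
    the mixture matrix M (n_mixtures x n_channels) as C @ S with
    nonnegative concentrations C (n_mixtures x k) and pure spectra
    S (k x n_channels).  Each half-step is an unconstrained least
    squares solve followed by clipping to enforce nonnegativity."""
    rng = np.random.default_rng(rng)
    n, p = M.shape
    C = rng.random((n, k)) + 0.1
    for _ in range(n_iter):
        # Solve C @ S = M for S, then clip negatives.
        S = np.clip(np.linalg.lstsq(C, M, rcond=None)[0], 1e-12, None)
        # Solve S.T @ C.T = M.T for C, then clip negatives.
        C = np.clip(np.linalg.lstsq(S.T, M.T, rcond=None)[0].T, 1e-12, None)
    return C, S
```

In the pipeline described above this step would refine an ICA-derived initial decomposition rather than start from random factors; we start from random factors only to keep the sketch self-contained.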
Unsupervised Learning by Probabilistic Latent Semantic Analysis
Machine Learning, 2001
Abstract

Cited by 618 (4 self)
This paper presents a novel statistical method for factor analysis of binary and count data which is closely related to a technique known as Latent Semantic Analysis. In contrast to the latter method, which stems from linear algebra and performs a Singular Value Decomposition of co-occurrence tables, the proposed technique uses a generative latent class model to perform a probabilistic mixture decomposition. This results in a more principled approach with a solid foundation in statistical inference. More precisely, we propose to make use of a temperature-controlled version of the Expectation-Maximization algorithm.
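The temperature-controlled EM variant referred to here (tempered EM) damps the E-step: the per-class joint scores are raised to an inverse temperature β ≤ 1 before normalizing, so responsibilities are flattened toward uniform and overfitting is reduced. A one-function sketch under our own naming:

```python
import numpy as np

def tempered_posterior(joint, beta):
    """Tempered E-step: given per-class joint scores (class axis first,
    e.g. P(z)P(d|z)P(w|z) stacked over z), raise them to the power
    beta before normalizing over classes.  beta = 1 recovers the
    standard E-step; smaller beta flattens the responsibilities."""
    t = joint ** beta
    return t / t.sum(axis=0, keepdims=True)
```

In tempered EM, β is typically decreased gradually while monitoring held-out likelihood; the M-step is unchanged.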
Categorization of Image Databases for Efficient Retrieval Using Robust Mixture Decomposition
1998
Abstract

Cited by 7 (1 self)
In this paper, we present a robust mixture decomposition technique that automatically finds a compact representation of the data in terms of categories. We apply it to the problem of organizing databases for efficient retrieval. The time taken for retrieval is shown to be an order of magnitude smaller …
Probabilistic Visual Learning for Object Representation
1996
Abstract

Cited by 699 (15 self)
We present an unsupervised technique for visual learning which is based on density estimation in high-dimensional spaces using an eigenspace decomposition. Two types of density estimates are derived for modeling the training data: a multivariate Gaussian (for unimodal distributions) and a Mixture-of-Gaussians model (for multimodal distributions) …
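A Gaussian density estimate built on an eigenspace decomposition, in the spirit of this abstract's unimodal case, splits the log-density into an in-subspace Mahalanobis term and an isotropic residual term whose variance is the mean of the discarded eigenvalues. A numpy sketch (our own interface and names, not the authors' implementation; requires the subspace dimension m to be smaller than the data dimension):

```python
import numpy as np

def eigenspace_gaussian(X, m):
    """Fit a Gaussian in an m-dimensional principal subspace of the
    data X (n x d).  Returns a log-density function combining the
    Mahalanobis distance within the subspace with an isotropic term
    for the squared distance from the subspace, whose variance rho
    is the mean of the discarded eigenvalues."""
    mu = X.mean(axis=0)
    Xc = X - mu
    cov = Xc.T @ Xc / len(X)
    vals, vecs = np.linalg.eigh(cov)
    order = np.argsort(vals)[::-1]          # sort eigenvalues descending
    vals, vecs = vals[order], vecs[:, order]
    lam, U = vals[:m], vecs[:, :m]          # kept eigenpairs
    rho = vals[m:].mean()                   # residual variance (needs m < d)

    def logpdf(x):
        d_vec = x - mu
        y = U.T @ d_vec                     # coordinates in the subspace
        resid = d_vec @ d_vec - y @ y       # squared distance from subspace
        lp = -0.5 * np.sum(y**2 / lam) - 0.5 * np.sum(np.log(2 * np.pi * lam))
        n_resid = len(mu) - m
        lp += -0.5 * resid / rho - 0.5 * n_resid * np.log(2 * np.pi * rho)
        return lp

    return logpdf
```

The multimodal case in the abstract replaces the single Gaussian with a mixture of such densities fit by EM.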
Fast and robust fixed-point algorithms for independent component analysis
IEEE Trans. Neural Netw., 1999
Abstract

Cited by 884 (34 self)
Independent component analysis (ICA) is a statistical method for transforming an observed multidimensional random vector into components that are statistically as independent from each other as possible. In this paper, we use a combination of two different approaches for linear ICA: Comon’s information-theoretic approach and the projection pursuit approach. Using maximum-entropy approximations of differential entropy, we introduce a family of new contrast (objective) functions for ICA. These contrast functions enable both the estimation of the whole decomposition by minimizing mutual information …
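The one-unit fixed-point iteration with the tanh contrast from this family is w ← E[x g(wᵀx)] − E[g′(wᵀx)] w followed by renormalization, applied to whitened data. A minimal numpy sketch (our own reimplementation under our own names, not the authors' code):

```python
import numpy as np

def whiten(X):
    """Center X and whiten it via an eigendecomposition of the
    sample covariance, so the result has identity covariance."""
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / len(Xc)
    vals, vecs = np.linalg.eigh(cov)
    return Xc @ vecs / np.sqrt(vals)

def fastica_one_unit(X, n_iter=200, tol=1e-8, rng=0):
    """One-unit fixed-point ICA iteration with the tanh contrast.
    X (n x d) must already be whitened.  Update:
    w <- E[x g(w.x)] - E[g'(w.x)] w, then renormalize."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        wx = X @ w
        g = np.tanh(wx)
        gp = 1.0 - g**2                      # derivative of tanh
        w_new = (X * g[:, None]).mean(axis=0) - gp.mean() * w
        w_new /= np.linalg.norm(w_new)
        converged = abs(abs(w_new @ w) - 1.0) < tol
        w = w_new
        if converged:
            break
    return w
```

Further components would be extracted the same way with a deflation (orthogonalization) step after each unit; the cubic convergence of this iteration is the "fast" in the title.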
Mixture Decomposition of Distributions using a Decomposition of the Sample Space
2010
Abstract

Cited by 1 (1 self)
We consider the set of joint probability distributions of N binary random variables which can be written as a sum of m distributions in the following form: p(x1, …, xN) = ∑_{i=1}^m α_i f_i(x1, …, xN), where α_i ≥ 0, ∑_{i=1}^m α_i = 1, and the f_i(x1, …, xN) belong to some exponential family. For our analysis we decompose the sample space into portions on which the mixture components f_i can be chosen arbitrarily. We derive lower bounds on the number of mixture components from a given exponential family necessary to represent distributions with arbitrary correlations up to a certain order or to represent …