Results 1 – 10 of 11,875
ESTIMATING SPARSE INVERSE COVARIANCE MATRIX FOR BRAIN
, 2009
"... In this project we present a brain-computer interface (BCI) which recognizes one task from another in a timely manner. We use quadratic discriminant analysis to classify electroencephalography (EEG) samples in an online fashion. The raw EEG samples are collected over half-second intervals to estimate the power spectral densities. The estimated power spectral densities are treated as individual samples by the classifier. The mean and inverse covariance matrix parameters in the classifier are updated incrementally as samples arrive, spread over several training sessions. We also perform some ..."
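The incremental parameter updates described above can be sketched with a Welford-style running mean and covariance. This is a standard online estimator, not necessarily the paper's exact update rule (which is not shown in the snippet), and it tracks the covariance itself rather than its inverse:

```python
import numpy as np

def online_update(mean, cov, n, x):
    """One Welford-style update of the running mean and (population)
    covariance after observing x, the (n+1)-th sample."""
    n_new = n + 1
    delta = x - mean
    mean_new = mean + delta / n_new
    # Rank-1 update; note x - mean_new = delta * n / n_new, so the
    # outer product below is symmetric.
    cov_new = (n * cov + np.outer(delta, x - mean_new)) / n_new
    return mean_new, cov_new, n_new

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))

mean, cov, n = np.zeros(3), np.zeros((3, 3)), 0
for x in X:
    mean, cov, n = online_update(mean, cov, n, x)

# The streaming estimates match the batch estimates.
assert np.allclose(mean, X.mean(axis=0))
assert np.allclose(cov, np.cov(X, rowvar=False, bias=True))
```

In a streaming classifier one would periodically refresh the inverse from this running covariance (or maintain it directly via a rank-1 inverse update).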
Sparse inverse covariance matrix estimation via linear programming
, 2010
"... This paper considers the problem of estimating a high dimensional inverse covariance matrix that can be well approximated by “sparse” matrices. Taking advantage of the connection between multivariate linear regression and entries of the inverse covariance matrix, we propose an estimating procedure ..."
Cited by 55 (3 self)
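The regression/precision connection this abstract alludes to is a standard identity: regressing one Gaussian variable on the rest yields coefficients and a residual variance that determine the corresponding row of the inverse covariance matrix. A minimal sketch with an assumed covariance matrix (this illustrates the identity only, not the paper's linear-programming estimator):

```python
import numpy as np

# An assumed, valid covariance matrix (symmetric positive definite).
Sigma = np.array([[2.0, 0.6, 0.3],
                  [0.6, 1.5, 0.4],
                  [0.3, 0.4, 1.0]])
Theta = np.linalg.inv(Sigma)

# Population regression of X_0 on (X_1, X_2):
# closed-form coefficients and residual variance.
S_rest = Sigma[1:, 1:]
s = Sigma[1:, 0]
beta = np.linalg.solve(S_rest, s)      # regression coefficients
resid_var = Sigma[0, 0] - s @ beta     # residual variance

# Row 0 of the precision matrix, recovered from the regression:
# Theta_00 = 1 / resid_var,  Theta_0k = -beta_k / resid_var.
assert np.isclose(Theta[0, 0], 1.0 / resid_var)
assert np.allclose(Theta[0, 1:], -beta / resid_var)
```

Sparsity in a precision-matrix row therefore corresponds to zero regression coefficients, which is what makes regression-based (and ℓ1-penalized) estimators natural here.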
Sparse Inverse Covariance Matrix Estimation Using Quadratic Approximation
"... The ℓ1 regularized Gaussian maximum likelihood estimator has been shown to have strong statistical guarantees in recovering a sparse inverse covariance matrix, or alternatively the underlying graph structure of a Gaussian Markov Random Field, from very limited samples. We propose a novel algorithm ..."
Cited by 67 (9 self)
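The ℓ1-penalized Gaussian MLE objective mentioned above can be illustrated with a toy proximal-gradient (ISTA-style) solver. This is a generic sketch, not the quadratic-approximation algorithm the paper proposes; the step size, iteration count, and positive-definiteness safeguard are illustrative choices:

```python
import numpy as np

def l1_precision_sketch(S, lam, step=0.05, iters=400):
    """Proximal-gradient sketch of the l1-penalized Gaussian MLE:
    minimize -log det(T) + trace(S T) + lam * ||offdiag(T)||_1.
    Toy solver for illustration only."""
    Theta = np.diag(1.0 / np.diag(S))              # positive definite start
    for _ in range(iters):
        G = S - np.linalg.inv(Theta)               # gradient of smooth part
        T = Theta - step * G
        # Soft-threshold off-diagonal entries (prox of the l1 penalty).
        shrunk = np.sign(T) * np.maximum(np.abs(T) - step * lam, 0.0)
        np.fill_diagonal(shrunk, np.diag(T))       # diagonal is unpenalized
        shrunk = (shrunk + shrunk.T) / 2.0
        if np.linalg.eigvalsh(shrunk)[0] <= 1e-8:  # keep iterates PD
            step *= 0.5
            continue
        Theta = shrunk
    return Theta

rng = np.random.default_rng(0)
Theta_true = np.array([[2.0, -1.0, 0.0],
                       [-1.0, 2.0, -1.0],
                       [0.0, -1.0, 2.0]])
X = rng.multivariate_normal(np.zeros(3), np.linalg.inv(Theta_true), size=500)
S = np.cov(X, rowvar=False)

Theta_hat = l1_precision_sketch(S, lam=0.1)
assert np.allclose(Theta_hat, Theta_hat.T)
assert np.linalg.eigvalsh(Theta_hat)[0] > 0       # estimate stays PD
```

Practical solvers (graphical lasso, Newton-type methods such as the one this paper proposes) exploit much more structure and scale far better than this sketch.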
Modeling Structure in the Inverse Covariance Matrix in Gaussian Mixture Models
, 2001
"... The EMLLT model allows incorporating a varying amount of structure in the inverse covariance matrix (precision matrix) in a Gaussian mixture model. The method is an extension of the previously considered MLLT model. In MLLT the inverse covariance matrix of Gaussian mixture component j is modeled by ..."
TUNING PARAMETER SELECTION FOR PENALIZED LIKELIHOOD ESTIMATION OF INVERSE COVARIANCE MATRIX
"... In a Gaussian graphical model, the conditional independence between two variables is characterized by the corresponding zero entries in the inverse covariance matrix. Maximum likelihood estimation using the smoothly clipped absolute deviation (SCAD) penalty (Fan and Li, 2001) and the adaptive ..."
Cited by 8 (0 self)
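The zero-pattern correspondence stated in this abstract can be checked numerically: a zero entry in the precision matrix means the two variables are conditionally independent (for Gaussians, uncorrelated given the remaining variables), even though they remain marginally correlated. A minimal sketch with an assumed 3×3 precision matrix:

```python
import numpy as np

# Precision matrix with Theta[0, 2] = 0: X0 and X2 are conditionally
# independent given X1 in a Gaussian graphical model.
Theta = np.array([[ 2.0, -0.8,  0.0],
                  [-0.8,  2.0, -0.8],
                  [ 0.0, -0.8,  2.0]])
Sigma = np.linalg.inv(Theta)

# Marginally, X0 and X2 are still correlated...
assert abs(Sigma[0, 2]) > 1e-6

# ...but their covariance conditional on X1 vanishes:
# Cov[(X0, X2) | X1] = Sigma_AA - Sigma_AB Sigma_BB^{-1} Sigma_BA.
A, B = [0, 2], [1]
cond = (Sigma[np.ix_(A, A)]
        - Sigma[np.ix_(A, B)] @ np.linalg.inv(Sigma[np.ix_(B, B)])
        @ Sigma[np.ix_(B, A)])
assert abs(cond[0, 1]) < 1e-10
```

This is why penalties like SCAD or the adaptive lasso are applied to precision-matrix entries: setting an entry exactly to zero removes an edge from the conditional-independence graph.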
Linear-Time Inverse Covariance Matrix Estimation in Gaussian Processes
"... The computational cost of Gaussian process regression grows cubically with respect to the number of variables due to the inversion of the covariance matrix, which is impractical for data sets with more than a few thousand nodes. Furthermore, Gaussian processes lack the ability to represent condition ..."
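The cubic cost comes from factoring (or inverting) the n×n covariance matrix. A common sketch of standard GP regression avoids forming the inverse explicitly by using a Cholesky factorization; the squared-exponential kernel and noise level below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)

def kernel(A, B, length=1.0):
    """Squared-exponential kernel between two sets of 1-D inputs."""
    d2 = (A[:, None, 0] - B[None, :, 0]) ** 2
    return np.exp(-0.5 * d2 / length**2)

K = kernel(X, X) + 1e-2 * np.eye(50)   # covariance + noise, O(n^2) storage

# The O(n^3) step: factor K once and back-substitute,
# instead of forming inv(K) explicitly.
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))

# Posterior mean at test points: K_* @ K^{-1} y.
Xs = np.linspace(-3, 3, 5)[:, None]
mean = kernel(Xs, X) @ alpha

assert np.allclose(K @ alpha, y)       # alpha solves K alpha = y
```

Scaling past a few thousand points requires extra structure (sparse precision matrices, inducing points, or iterative solvers), which is the gap this line of work targets.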
Adaptive Lattice Filters for Band-Inverse Covariance Matrix Approximations
"... It has been known since at least 1981 that the Levinson (and Schur) algorithm can be applied to a nonstationary process with its associated arbitrary non-Toeplitz covariance matrix. However, in the general case this generalized Levinson algorithm involves O(N³) computations, so that there are no particular advantages over the usual methods of Cholesky decomposition or of matrix inversion. Therefore, the main attention is devoted to a special class of nonstationary processes with covariance matrices that have a finite "displacement rank" (or equivalently "Toeplitz distance ..."
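The O(N³)-versus-structured-solver contrast above can be illustrated with the classic Levinson recursion for a symmetric positive definite Toeplitz system, which runs in O(N²). This is the textbook algorithm (Golub & Van Loan's formulation), not the adaptive lattice filter the paper develops:

```python
import numpy as np

def levinson_solve(r, b):
    """Solve T x = b where T is the symmetric positive definite
    Toeplitz matrix with first column r, in O(n^2) operations
    (a dense solve would cost O(n^3))."""
    n = len(b)
    t = np.asarray(r, float) / r[0]    # normalize diagonal to 1
    b = np.asarray(b, float) / r[0]
    x = np.zeros(n)
    y = np.zeros(n)                    # backward (Durbin) vector
    x[0] = b[0]
    if n == 1:
        return x
    y[0] = -t[1]
    alpha, beta = -t[1], 1.0
    for k in range(1, n):
        beta = (1.0 - alpha * alpha) * beta
        mu = (b[k] - t[1:k+1] @ x[k-1::-1]) / beta
        x[:k] += mu * y[k-1::-1]
        x[k] = mu
        if k < n - 1:
            alpha = -(t[k+1] + t[1:k+1] @ y[k-1::-1]) / beta
            y[:k] += alpha * y[k-1::-1]
            y[k] = alpha
    return x

# Small diagonally dominant (hence PD) Toeplitz example.
r = np.array([3.0, 1.0, 0.5, 0.25])
b = np.array([1.0, 2.0, 0.0, -1.0])
T = np.array([[r[abs(i - j)] for j in range(4)] for i in range(4)])
x_lev = levinson_solve(r, b)
assert np.allclose(T @ x_lev, b)
```

Displacement-rank methods generalize exactly this kind of fast recursion to matrices that are "close to Toeplitz" in a precise sense.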
Regression Approaches to Small Sample Inverse Covariance Matrix Estimation for Hyperspectral Image Classification
"... A key component in most parametric classifiers is the estimation of an inverse covariance matrix. In hyperspectral images the number of bands can be in the hundreds, leading to covariance matrices having tens of thousands of elements. Lately, the use of general linear regression models in ..."
Cited by 3 (1 self)