Results 1 – 10 of 577
Necessary and sufficient conditions on sparsity pattern recovery, 2009
"... The paper considers the problem of detecting the sparsity pattern of a k-sparse vector in R^n from m random noisy measurements. A new necessary condition on the number of measurements for asymptotically reliable detection with maximum likelihood (ML) estimation and Gaussian measurement matrices is ..."
Cited by 106 (12 self)
Incomplete Cholesky Factorization with Sparsity Pattern Modification, 1993
"... This paper proposes, analyzes, and numerically tests methods to assure the existence of incomplete Cholesky (IC) factorization preconditioners, based solely on the target sparsity pattern for the triangular factor R. If the sparsity pattern has a simple property (called property C+), then the IC fac ..."
Cited by 1 (0 self)
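To make the prescribed-pattern idea in the entry above concrete, here is a minimal dense IC(0) sketch: an incomplete Cholesky factor restricted to the sparsity pattern of the lower triangle of A. This is an illustrative textbook variant, not the paper's modification scheme, and it assumes a symmetric positive definite A for which the restricted pivots stay positive:

```python
import numpy as np

def ic0(A):
    """Incomplete Cholesky IC(0): compute a lower-triangular L with the
    same sparsity pattern as the lower triangle of A, so that L @ L.T
    approximates A. Entries outside the pattern are simply dropped.
    Assumes A is symmetric positive definite and that the dropped
    entries do not drive a pivot nonpositive (the failure mode the
    paper's pattern-modification methods are designed to avoid)."""
    n = A.shape[0]
    pattern = np.tril(A) != 0
    L = np.zeros_like(A, dtype=float)
    for j in range(n):
        # Pivot: diagonal entry minus contributions from earlier columns.
        L[j, j] = np.sqrt(A[j, j] - np.dot(L[j, :j], L[j, :j]))
        for i in range(j + 1, n):
            if pattern[i, j]:  # keep only entries inside the target pattern
                L[i, j] = (A[i, j] - np.dot(L[i, :j], L[j, :j])) / L[j, j]
    return L
```

For a pattern with no fill-in (e.g. a tridiagonal matrix), nothing is dropped and IC(0) coincides with the exact Cholesky factor.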
Detecting Jacobian Sparsity Patterns by Bayesian Probing, Math. Prog., 2000
"... Many numerical methods require the evaluation of Jacobians for vector functions given as evaluation procedures. If known, the sparsity pattern of these first derivative matrices can be used to evaluate, store, and manipulate them more efficiently. Especially on discretizations of differential eq ..."
Cited by 5 (0 self)
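The probing idea behind the entry above can be sketched with a simplified, non-Bayesian variant: perturb the input along random 0/1 directions and rule out Jacobian entries in rows whose output does not move. Function names and tolerances here are illustrative assumptions, and the scheme can miss dependencies that cancel numerically:

```python
import numpy as np

def probe_sparsity(g, x, n_probes=100, h=1e-6, tol=1e-8, rng=None):
    """Estimate the sparsity pattern of the Jacobian of g at x by
    randomized probing (a simplified elimination sketch, not the
    paper's Bayesian scheme). Start from a full candidate pattern;
    for each random 0/1 probe direction s, every row i whose
    directional difference is numerically zero cannot depend on any
    column j with s_j = 1, so those entries (i, j) are eliminated."""
    n = x.size
    rng = np.random.default_rng(rng)
    g0 = np.asarray(g(x), dtype=float)
    pattern = np.ones((g0.size, n), dtype=bool)
    for _ in range(n_probes):
        s = rng.integers(0, 2, size=n).astype(float)
        diff = np.asarray(g(x + h * s), dtype=float) - g0
        quiet_rows = np.abs(diff) <= tol * (1 + np.abs(g0))
        pattern[np.ix_(quiet_rows, s.astype(bool))] = False
    return pattern
```

Each probe costs one extra evaluation of g, so the pattern is narrowed down with far fewer evaluations than probing the n columns one at a time.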
Factoring Matrices with a Tree-Structured Sparsity Pattern
"... Abstract. Let A be a matrix whose sparsity pattern is a tree with maximal degree dmax. We show that if the columns of A are ordered using minimum degree on A + A*, then factoring A using a sparse LU with partial pivoting algorithm generates only O(dmax n) fill, requires only O(dmax n) operations, an ..."
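The low-fill claim in the entry above can be observed numerically with SciPy's SuperLU interface, whose `permc_spec="MMD_AT_PLUS_A"` option is exactly a minimum-degree ordering on A + A^T. The star-shaped test matrix below is an assumption for illustration (a tree with one hub, so dmax = n - 1); with a bad ordering its factorization would fill in densely, while the minimum-degree ordering keeps the factors linear in n:

```python
import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.linalg import splu

def star_tree_matrix(n):
    """Build a diagonally dominant matrix whose sparsity pattern is a
    star tree: node 0 is connected to nodes 1..n-1 (illustrative only)."""
    A = lil_matrix((n, n))
    A.setdiag(float(n))          # strong diagonal keeps pivoting on the diagonal
    for i in range(1, n):
        A[0, i] = A[i, 0] = -1.0
    return A.tocsc()

n = 50
A = star_tree_matrix(n)
# Minimum degree on A + A^T eliminates the degree-1 leaves first,
# so the hub row/column causes no fill-in.
lu = splu(A, permc_spec="MMD_AT_PLUS_A")
fill = lu.L.nnz + lu.U.nnz       # stays O(n) for this tree pattern
```

With the natural ordering (hub eliminated first), the leaves would form a clique and the factors would be nearly dense, which is the contrast the paper's O(dmax n) bound formalizes.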
A Priori Sparsity Patterns for Parallel Sparse Approximate Inverse Preconditioners, 1998
"... Parallel algorithms for computing sparse approximations to the inverse of a sparse matrix either use a prescribed sparsity pattern for the approximate inverse, or attempt to generate a good pattern as part of the algorithm. This paper demonstrates that for PDE problems, the patterns of powers of s ..."
Cited by 69 (6 self)
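The "patterns of powers" idea in the entry above can be sketched in a few lines: prescribe the approximate-inverse pattern as the boolean pattern of B^k, where B keeps only the entries of A above a drop tolerance. The parameters `k` and `drop_tol` here are illustrative assumptions, not values from the paper:

```python
import numpy as np

def power_pattern(A, k=2, drop_tol=0.0):
    """A priori sparsity pattern for a sparse approximate inverse,
    taken as the pattern of B^k where B is the (sparsified) pattern
    of A including the diagonal. Boolean arithmetic only: we track
    which entries *can* be nonzero, not their values."""
    B = (np.abs(A) > drop_tol) | np.eye(A.shape[0], dtype=bool)
    P = B.copy()
    for _ in range(k - 1):
        P = (P.astype(int) @ B.astype(int)) > 0  # boolean matrix product
    return P
```

For a banded A this simply widens the band: the pattern of a tridiagonal matrix squared is pentadiagonal, which matches the intuition that inverse entries decay away from the pattern of A.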
Fast Numerical Determination of Symmetric Sparsity Patterns, 1992
"... We consider a function g : R^n -> R^n for which the Jacobian is symmetric and sparse. Such functions often arise, for instance, in numerical optimization, where g is the gradient of some objective function f, so that the Jacobian of g is the Hessian of f. In many such applications one can generate extremely efficient algorithms by taking advantage of the sparsity structure of the problem if this pattern is known a priori. Unfortunately, determining such sparsity structures by hand is often difficult and prone to error. If one suspects a mistake has been made, or if g is a "black box ..."
Cited by 1 (0 self)
Convex approaches to model wavelet sparsity patterns, in International Conference on Image Processing (ICIP), 2011
"... Statistical dependencies among wavelet coefficients are commonly represented by graphical models such as hidden Markov trees (HMTs). However, in linear inverse problems such as deconvolution, tomography, and compressed sensing, the presence of a sensing or observation matrix produces a linear mixing of the simple Markovian dependency structure. This leads to reconstruction problems that are nonconvex optimizations. Past work has dealt with this issue by resorting to greedy or suboptimal iterative reconstruction methods. In this paper, we propose new modeling approaches based on group-sparsity ..."
Cited by 19 (6 self)
Nearly Sharp Sufficient Conditions on Exact Sparsity Pattern Recovery
"... Abstract—Consider the n-dimensional vector y = Xβ + ε, where β ∈ R^p has only k nonzero entries and ε ∈ R^n is Gaussian noise. This can be viewed as a linear system with sparsity constraints corrupted by noise, where the objective is to estimate the sparsity pattern of β given the observation vector y and the m ..."
Cited by 10 (0 self)
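The recovery problem in the entries above (estimate the support of β from y = Xβ + ε) is often attacked in practice with greedy methods. Here is a minimal orthogonal matching pursuit sketch; it is a standard greedy baseline, not the ML estimator whose sample complexity these papers analyze:

```python
import numpy as np

def omp_support(X, y, k):
    """Estimate the sparsity pattern (support) of beta in y = X @ beta + noise
    via orthogonal matching pursuit: greedily select the column most
    correlated with the current residual, then re-fit by least squares
    on all selected columns. Assumes the sparsity level k is known."""
    residual = y.copy()
    support = []
    for _ in range(k):
        corr = np.abs(X.T @ residual)
        corr[support] = -np.inf                 # never pick a column twice
        support.append(int(np.argmax(corr)))
        coef, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
        residual = y - X[:, support] @ coef
    return sorted(support)
```

With enough measurements relative to k (the regime the necessary and sufficient conditions above quantify), greedy selection recovers the exact support with high probability for Gaussian X.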