Results 1–10 of 44,620
K-SVD: An Algorithm for Designing Overcomplete Dictionaries for Sparse Representation
, 2006
"... In recent years there has been a growing interest in the study of sparse representation of signals. Using an overcomplete dictionary that contains prototype signal atoms, signals are described by sparse linear combinations of these atoms. Applications that use sparse representation are many and inc ..."
Cited by 930 (41 self)
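The representation the abstract describes — a signal as a sparse linear combination of dictionary atoms — can be sketched in a few lines; the dictionary and coefficients below are toy values of our own, not learned by K-SVD:

```python
import numpy as np

# Toy overcomplete dictionary: 4 prototype atoms in R^2 (more atoms than
# signal dimensions). Values are illustrative, not produced by K-SVD.
D = np.array([[1.0, 0.0, 0.707, -0.707],
              [0.0, 1.0, 0.707,  0.707]])

# Sparse coefficient vector: only one of the four atoms is active.
gamma = np.array([0.0, 0.0, 2.0, 0.0])

# The signal is described by a sparse linear combination of the atoms.
s = D @ gamma
```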
Optimally sparse representation in general (nonorthogonal) dictionaries via ℓ¹ minimization
 Proc. Natl. Acad. Sci. USA 100, 2197–2202
, 2002
"... Given a ‘dictionary’ D = {d_k} of vectors d_k, we seek to represent a signal S as a linear combination S = Σ_k γ(k) d_k, with scalar coefficients γ(k). In particular, we aim for the sparsest representation possible. In general, this requires a combinatorial optimization process. Previous work considered ..."
Cited by 626 (37 self)
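The ℓ¹ minimization in the title can be posed as a linear program by splitting γ = u − v with u, v ≥ 0; a minimal sketch (generic LP solver and toy dictionary of our own, not the authors' code):

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(D, s):
    """min ||gamma||_1 subject to D @ gamma = s, via the standard
    LP split gamma = u - v with u, v >= 0 (solver choice is ours)."""
    n, m = D.shape
    c = np.ones(2 * m)                 # sum(u) + sum(v) equals ||gamma||_1
    A_eq = np.hstack([D, -D])          # enforce D @ (u - v) = s
    res = linprog(c, A_eq=A_eq, b_eq=s, bounds=[(0, None)] * (2 * m))
    return res.x[:m] - res.x[m:]

# Toy dictionary: two spikes plus one "diagonal" atom; the signal is
# exactly that third atom, so the sparsest representation uses it alone.
D = np.array([[1.0, 0.0, 2 ** -0.5],
              [0.0, 1.0, 2 ** -0.5]])
s = np.array([2 ** -0.5, 2 ** -0.5])
gamma = basis_pursuit(D, s)
```

Here the ℓ¹ minimizer coincides with the sparsest (single-atom) representation, which is the phenomenon the paper characterizes.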
Greed is Good: Algorithmic Results for Sparse Approximation
, 2004
"... This article presents new results on using a greedy algorithm, orthogonal matching pursuit (OMP), to solve the sparse approximation problem over redundant dictionaries. It provides a sufficient condition under which both OMP and Donoho’s basis pursuit (BP) paradigm can recover the optimal representation of an exactly sparse signal. It leverages this theory to show that both OMP and BP succeed for every sparse input signal from a wide class of dictionaries. These quasi-incoherent dictionaries offer a natural generalization of incoherent dictionaries, and the cumulative coherence function ..."
Cited by 916 (8 self)
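The greedy OMP loop can be sketched compactly (illustrative only, not the article's reference implementation; unit-norm columns assumed):

```python
import numpy as np

def omp(D, s, k):
    """Minimal orthogonal matching pursuit: greedily select the atom most
    correlated with the residual, then refit all selected atoms by least
    squares before updating the residual."""
    residual = s.astype(float).copy()
    support, coeffs = [], np.array([])
    for _ in range(k):
        support.append(int(np.argmax(np.abs(D.T @ residual))))
        coeffs, *_ = np.linalg.lstsq(D[:, support], s, rcond=None)
        residual = s - D[:, support] @ coeffs
    gamma = np.zeros(D.shape[1])
    gamma[support] = coeffs
    return gamma

# Toy redundant dictionary: the 4 canonical spikes plus one extra atom.
extra = np.array([1.0, 1.0, 0.0, 0.0]) / np.sqrt(2)
D = np.hstack([np.eye(4), extra[:, None]])
s = 3.0 * D[:, 0] + 2.0 * D[:, 2]      # exactly 2-sparse signal
gamma = omp(D, s, k=2)
```

The least-squares refit after each selection is what distinguishes *orthogonal* matching pursuit from plain matching pursuit.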
Linear spatial pyramid matching using sparse coding for image classification
 in IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
, 2009
"... Recently SVMs using spatial pyramid matching (SPM) kernel have been highly successful in image classification. Despite their popularity, these nonlinear SVMs have a complexity O(n^2 ∼ n^3) in training and O(n) in testing, where n is the training size, implying that it is nontrivial to scale up the algorithms to handle more than thousands of training images. In this paper we develop an extension of the SPM method, by generalizing vector quantization to sparse coding followed by multi-scale spatial max pooling, and propose a linear SPM kernel based on SIFT sparse codes. This new approach remarkably ..."
Cited by 488 (19 self)
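The spatial max pooling step keeps, for each codebook atom, the strongest sparse-code response within a pyramid cell; a sketch with made-up codes (real codes come from solving a sparse-coding problem over SIFT descriptors):

```python
import numpy as np

# Made-up sparse codes for 6 local descriptors over a 5-atom codebook.
codes = np.array([[0.0, 1.2, 0.0, 0.0, 0.3],
                  [0.5, 0.0, 0.0, 0.0, 0.0],
                  [0.0, 0.4, 0.0, 2.0, 0.0],
                  [0.0, 0.0, 0.0, 0.0, 0.0],
                  [0.1, 0.0, 0.9, 0.0, 0.0],
                  [0.0, 0.0, 0.0, 0.0, 0.7]])

# Max pooling within one pyramid cell: per-atom strongest response.
pooled = np.abs(codes).max(axis=0)
```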
Just Relax: Convex Programming Methods for Identifying Sparse Signals in Noise
, 2006
"... This paper studies a difficult and fundamental problem that arises throughout electrical engineering, applied mathematics, and statistics. Suppose that one forms a short linear combination of elementary signals drawn from a large, fixed collection. Given an observation of the linear combination that ... This paper studies a method called convex relaxation, which attempts to recover the ideal sparse signal by solving a convex program. This approach is powerful because the optimization can be completed in polynomial time with standard scientific software. The paper provides general conditions which ensure ..."
Cited by 496 (2 self)
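One standard convex program of this kind is min ½‖Dγ − s‖² + λ‖γ‖₁; a minimal iterative soft-thresholding sketch (solver choice and toy data are ours, not the paper's):

```python
import numpy as np

def ista(D, s, lam, steps=200):
    """Iterative soft-thresholding for min 0.5*||D@g - s||^2 + lam*||g||_1,
    one common convex relaxation of sparse recovery in noise."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    g = np.zeros(D.shape[1])
    for _ in range(steps):
        z = g - (D.T @ (D @ g - s)) / L    # gradient step on the quadratic term
        g = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return g

# With an orthonormal dictionary the minimizer is soft-thresholding of s:
# the small "noise" coefficient is suppressed, the large one shrinks by lam.
D = np.eye(3)
s = np.array([2.0, 0.05, 0.0])
g = ista(D, s, lam=0.1)
```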
From Sparse Solutions of Systems of Equations to Sparse Modeling of Signals and Images
, 2007
"... A full-rank matrix A ∈ ℝ^{n×m} with n < m generates an underdetermined system of linear equations Ax = b having infinitely many solutions. Suppose we seek the sparsest solution, i.e., the one with the fewest nonzero entries: can it ever be unique? If so, when? As optimization of sparsity is combinatorial in nature, are there efficient methods for finding the sparsest solution? These questions have been answered positively and constructively in recent years, exposing a wide variety of surprising phenomena; in particular, the existence of easily verifiable conditions under which optimally sparse ..."
Cited by 423 (37 self)
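One such easily verifiable condition involves the mutual coherence μ(A): a solution of Ax = b with fewer than (1 + 1/μ(A))/2 nonzeros is the unique sparsest one. A sketch (toy matrix of our own):

```python
import numpy as np

def mutual_coherence(A):
    """Largest absolute inner product between distinct normalized columns."""
    An = A / np.linalg.norm(A, axis=0)
    G = np.abs(An.T @ An)
    np.fill_diagonal(G, 0.0)
    return G.max()

# Full-rank 2x3 matrix (n < m): an underdetermined system Ax = b.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
mu = mutual_coherence(A)
# Any solution with fewer than this many nonzeros is the unique sparsest one.
bound = (1.0 + 1.0 / mu) / 2.0
```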
Inductive Learning Algorithms and Representations for Text Categorization
, 1998
"... Text categorization – the assignment of natural language texts to one or more predefined categories based on their content – is an important component in many information organization and management tasks. We compare the effectiveness of five different automatic learning algorithms for text categorization in terms of learning speed, real-time classification speed, and classification accuracy. We also examine training set size, and alternative document representations. Very accurate text classifiers can be learned automatically from training examples. Linear Support Vector Machines (SVMs ..."
Cited by 641 (8 self)
Uncertainty principles and ideal atomic decomposition
 IEEE Transactions on Information Theory
, 2001
"... Suppose a discrete-time signal S(t), 0 ≤ t < N, is a superposition of atoms taken from a combined time/frequency dictionary made of spike sequences 1_{t=τ} and sinusoids exp{2πiwt/N}/√N. Can one recover, from knowledge of S alone, the precise collection of atoms going to make up S? Because every d ... time/frequency dictionary, then there is only one such highly sparse representation of S, and it can be obtained by solving the convex optimization problem of minimizing the ℓ¹ norm of the coefficients among all decompositions. Here “highly sparse” means that N_t + N_w < √N/2, where N_t is the number of time atoms, N_w ..."
Cited by 588 (19 self)
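The combined time/frequency dictionary, whose spike and sinusoid atoms have mutual coherence 1/√N, can be sketched as (N here is our own toy choice):

```python
import numpy as np

N = 16
t = np.arange(N)
# Sinusoid atoms exp(2*pi*i*w*t/N)/sqrt(N) for w = 0..N-1 (the DFT basis),
# combined with the N spike atoms (the identity) into one dictionary.
F = np.exp(2j * np.pi * np.outer(t, t) / N) / np.sqrt(N)
D = np.hstack([np.eye(N), F])
# Every spike/sinusoid pair has inner product of magnitude 1/sqrt(N).
mu = np.abs(np.eye(N).T @ F).max()
```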
A solution to Plato’s problem: The latent semantic analysis theory of acquisition, induction, and representation of knowledge
 Psychological Review
, 1997
"... How do people know as much as they do with as little information as they get? The problem takes many forms; learning vocabulary from text is an especially dramatic and convenient case for research. A new general theory of acquired similarity and knowledge representation, latent semantic analysis (LS ..."
Cited by 1772 (10 self)
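Latent semantic analysis derives its representation from a truncated SVD of a term-document matrix; a minimal sketch with a made-up count matrix:

```python
import numpy as np

# Made-up term-document count matrix (rows: terms, columns: documents).
X = np.array([[2.0, 0.0, 1.0],
              [1.0, 0.0, 1.0],
              [0.0, 3.0, 0.0],
              [0.0, 2.0, 1.0]])

# LSA keeps only the k strongest singular directions: a low-rank
# "semantic space" in which related terms and documents move closer.
U, sv, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
X_k = U[:, :k] @ np.diag(sv[:k]) @ Vt[:k]
```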
Image denoising by sparse 3D transform-domain collaborative filtering
 IEEE Trans. Image Process.
, 2007
"... We propose a novel image denoising strategy based on an enhanced sparse representation in transform domain. The enhancement of the sparsity is achieved by grouping similar 2D image fragments (e.g., blocks) into 3D data arrays which we call “groups.” Collaborative filtering is a special procedure d ..."
Cited by 422 (32 self)
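The grouping step can be sketched as stacking the 2D fragments most similar to a reference block into a 3D array (block size, stride, and group size below are our own illustrative choices, not the paper's parameters):

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.standard_normal((16, 16))

def blocks(img, size=4, step=4):
    """Collect 2D image fragments on a regular grid (a simplification of
    BM3D's sliding reference blocks)."""
    return np.array([img[i:i + size, j:j + size]
                     for i in range(0, img.shape[0] - size + 1, step)
                     for j in range(0, img.shape[1] - size + 1, step)])

frags = blocks(image)
ref = frags[0]
# Group the 8 fragments closest to the reference into one 3D "group";
# collaborative filtering would then transform and shrink this array jointly.
dists = ((frags - ref) ** 2).sum(axis=(1, 2))
group = frags[np.argsort(dists)[:8]]
```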