CiteSeerX
Results 1 - 10 of 20

Iterative log thresholding

by Dmitry Malioutov, R Aravkin
"... Sparse reconstruction approaches using the re-weighted ℓ1-penalty have been shown, both empirically and theoretically, to provide a significant improvement in recovering sparse signals in comparison to the ℓ1-relaxation. However, numerical optimization of such penalties involves solving problems w ..."
Abstract
…algorithm log-thresholding in analogy to soft thresholding for the ℓ1-penalty. We establish convergence results, and demonstrate that log-thresholding provides more accurate sparse reconstructions compared to both soft and hard thresholding. Furthermore, the approach can be directly extended to opti
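This entry compares log-thresholding against the two baseline operators it generalizes. A minimal NumPy sketch of those baselines (function names are illustrative; the closed-form log-thresholding operator itself is given in the paper and is not reproduced here):

```python
import numpy as np

def soft_threshold(y, t):
    """Proximal operator of the l1 penalty t*|x|: shrink magnitudes by t."""
    return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

def hard_threshold(y, t):
    """Keep entries whose magnitude exceeds t; zero the rest (no shrinkage)."""
    return np.where(np.abs(y) > t, y, 0.0)

y = np.array([3.0, -0.5, 1.5, -2.0])
print(soft_threshold(y, 1.0))   # large entries shrunk by 1, small ones zeroed
print(hard_threshold(y, 1.0))   # large entries kept exactly, small ones zeroed
```

Soft thresholding biases every surviving coefficient toward zero, while hard thresholding keeps survivors unchanged; log-thresholding sits between the two, which is the motivation the snippet above describes.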

EbayesThresh: R programs for Empirical Bayes thresholding

by Iain M. Johnstone, Bernard W. Silverman - J. Statist. Softw.
"... Suppose that a sequence of unknown parameters is observed subject to independent Gaussian noise. The EbayesThresh package in the S language implements a class of Empirical Bayes thresholding methods that can take advantage of possible sparsity in the sequence, to improve the quality of estimation. T ..."
Abstract - Cited by 18 (4 self)
weight, or sparsity parameter, is chosen automatically by marginal maximum likelihood. If estimation is carried out using the posterior median, this is a random thresholding procedure; the estimation can also be carried out using other thresholding rules with the same threshold, and the package provides
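The setting this entry describes — a sparse sequence observed in Gaussian noise, cleaned by thresholding — can be illustrated with a simple fixed-threshold baseline. The sketch below uses the universal threshold sqrt(2 log n) rather than the marginal-maximum-likelihood threshold that EbayesThresh selects automatically, so it only mimics the setup, not the package's procedure:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
theta = np.zeros(n)
theta[:20] = 5.0                      # sparse sequence of unknown parameters
x = theta + rng.normal(size=n)        # observed under unit Gaussian noise

# Universal threshold sqrt(2 log n): a fixed choice; EbayesThresh instead
# chooses the threshold by marginal maximum likelihood.
t = np.sqrt(2 * np.log(n))
theta_hat = np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

mse_raw = np.mean((x - theta) ** 2)
mse_thr = np.mean((theta_hat - theta) ** 2)
print(mse_raw, mse_thr)  # thresholding exploits sparsity to cut the error
```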

EbayesThresh: R and S-Plus programs for Empirical Bayes thresholding

by Iain M. Johnstone, Bernard W. Silverman - J. Statist. Softw. , 2005
"... This report sets out a package of R and S-PLUS routines that implement a class of Empirical Bayes thresholding methods. The prior considered for each parameter in a sequence is a mixture of an atom of probability at zero and a heavy-tailed density. The package allows for the heavy-tailed density to ..."
Abstract - Cited by 10 (1 self)
to be either a Laplace (double exponential) density or else a mixture of normal distributions with tail behavior similar to that of the Cauchy distribution. The mixing weight, or sparsity parameter, is chosen by marginal maximum likelihood. In the case of the Laplace density, the scale parameter may also

Direct convex relaxations of sparse SVM

by Antoni B. Chan, Nuno Vasconcelos, Gert R. G. Lanckriet - in ICML ’07: Proceedings of the 24th international conference on Machine learning
"... Although support vector machines (SVMs) for binary classification give rise to a decision rule that only relies on a subset of the training data points (support vectors), it will in general be based on all available features in the input space. We propose two direct, novel convex relaxations of a no ..."
Abstract - Cited by 26 (0 self)
as applying an adaptive soft-threshold on the SVM hyperplane, while the SDP formulation learns a weighted inner-product (i.e. a kernel) that results in a sparse hyperplane. Experimental results show an increase in sparsity while conserving the generalization performance compared to a standard as well as a

A Novel Weighted Total Difference Based Image Reconstruction Algorithm for Few-View Computed Tomography

by Wei Yu, Li Zeng
"... In practical applications of computed tomography (CT) imaging, due to the risk of high radiation dose imposed on the patients, it is desired that high quality CT images can be accurately reconstructed from limited projection data. While with limited projections, the images reconstructed often suffer ..."
Abstract
, while the conventional total difference (TD) measure simply enforces the gradient sparsity horizontally and vertically. To solve our WTD-based few-view CT reconstruction model, we use the soft-threshold filtering approach. Numerical experiments are performed to validate the efficiency

Jointly Sparse Global SIMPLS Regression

by Tzu-yu Liu, Laura Trinchera, Arthur Tenenhaus, Dennis Wei, Alfred O. Hero
"... Abstract: Partial least squares (PLS) regression combines dimensionality reduction and prediction using a latent variable model. Since partial least squares regression (PLS-R) does not require matrix inversion or diagonalization, it can be applied to problems with large numbers of variables. As pre ..."
Abstract
norm sparsity penalty is the ℓ1 norm of the ℓ2 norm on the weights corresponding to the same variable used over all the PLS components. A novel augmented Lagrangian method is proposed to solve the optimization problem and soft thresholding for sparsity occurs naturally as part of the iterative solution
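An ℓ1-of-ℓ2 penalty of the kind this entry describes leads to a block (row-wise) soft-thresholding step inside the iterative solver. A hedged sketch of that operator, assuming each row of the weight matrix collects one variable's coefficients across all components:

```python
import numpy as np

def block_soft_threshold(W, t):
    """Row-wise soft threshold: proximal operator of t * sum_i ||W[i,:]||_2.
    Each row's l2 norm is shrunk by t; rows whose norm is <= t become zero,
    so a variable is kept or dropped jointly across all components."""
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(1.0 - t / np.maximum(norms, 1e-12), 0.0)
    return scale * W

W = np.array([[3.0, 4.0],    # row norm 5 -> shrunk to norm 4
              [0.3, 0.4]])   # row norm 0.5 <= t -> zeroed entirely
print(block_soft_threshold(W, 1.0))
```

The entrywise soft threshold falls out as the special case of one column, which is why "soft thresholding for sparsity occurs naturally" in such solvers.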

EbayesThresh: R Programs for Empirical Bayes Thresholding

by Iain M. Johnstone, Bernard W. Silverman
"... Suppose that a sequence of unknown parameters is observed subject to independent Gaussian noise. The EbayesThresh package in the S language implements a class of Empirical Bayes thresholding methods that can take advantage of possible sparsity in the sequence, to improve the quality of estimation. T ..."
Abstract
weight, or sparsity parameter, is chosen automatically by marginal maximum likelihood. If estimation is carried out using the posterior median, this is a random thresholding procedure; the estimation can also be carried out using other thresholding rules with the same threshold, and the package provides

Numerical Methods for Sparse Recovery (Radon Series Comp. Appl. Math. xx, 1–110, © de Gruyter 2010)

by Massimo Fornasier
"... Abstract. These lecture notes address the analysis of numerical methods for performing optimizations with linear model constraints and additional sparsity conditions to solutions, i.e., we expect solutions which can be represented as sparse vectors with respect to a prescribed basis. In the first pa ..."
Abstract
part of the manuscript we illustrate the theory of compressed sensing with emphasis on computational aspects. We present the analysis of the homotopy method, the iteratively re-weighted least squares method, and the iterative hard-thresholding. In the second part, starting from the analysis
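One of the methods these notes analyze, iterative hard thresholding, is simple enough to sketch. The following is a generic textbook-style implementation (problem sizes and step-size rule are illustrative, not taken from the notes):

```python
import numpy as np

def iht(A, y, s, steps=300, mu=None):
    """Iterative hard thresholding: x <- H_s(x + mu * A^T (y - A x)),
    where H_s keeps only the s largest-magnitude entries."""
    if mu is None:
        # conservative step size: 1 / ||A||_2^2 (squared spectral norm)
        mu = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        g = x + mu * A.T @ (y - A @ x)
        g[np.argsort(np.abs(g))[:-s]] = 0.0   # zero all but the s largest
        x = g
    return x

rng = np.random.default_rng(1)
A = rng.normal(size=(80, 100)) / np.sqrt(80)   # random sensing matrix
x_true = np.zeros(100)
x_true[[3, 30, 70]] = [2.0, -1.5, 1.0]         # 3-sparse signal
x_hat = iht(A, A @ x_true, s=3)
print(np.linalg.norm(x_hat - x_true))          # recovery error
```

Unlike the soft-thresholding iterations used for ℓ1 penalties, the hard-thresholding projection enforces an explicit sparsity level s at every step.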

Compressive Link Acquisition in Multiuser Communications

by Xiao Li, Andrea Rueetschi, Anna Scaglione, Yonina C. Eldar
"... Abstract—An important receiver operation is to detect the presence of specific preamble signals with unknown delays in the presence of scattering, Doppler effects and carrier offsets. This task, referred to as “link acquisition”, is typically a sequential search over the transmitted signal space. Recen ..."
Abstract
…trade-off in complexity and performance that is possible when using sparse recovery. To do so, we propose a sequential sparsity-aware compressive sampling (C-SA) acquisition scheme, where a compressive multi-channel sampling (CMS) front-end is followed by a sparsity regularized likelihood ratio test (SR-LRT) module

Balanced Sparse Model for Tight Frames in Compressed Sensing Magnetic Resonance Imaging

by Yunsong Liu, Jian-feng Cai, Zhifang Zhan, Di Guo, Jing Ye, Zhong Chen, Xiaobo Qu
"... Compressed sensing has been shown to be promising for accelerating magnetic resonance imaging. In this new technology, magnetic resonance images are usually reconstructed by enforcing their sparsity in sparse image reconstruction models, including both synthesis and analysis models. The synthesis model assu ..."
Abstract
© 2007-2019 The Pennsylvania State University