Results 1–6 of 6
On the performance bound of sparse estimation with sensing matrix perturbation
IEEE Transactions on Signal Processing, 2013
Abstract

Cited by 3 (2 self)
This paper focuses on sparse estimation in the situation where both the sensing matrix and the measurement vector are corrupted by additive Gaussian noise. The performance bound of sparse estimation is analyzed and discussed in depth. Two types of lower bounds, the constrained Cramér-Rao bound (CCRB) and the Hammersley-Chapman-Robbins bound (HCRB), are discussed. It is shown that the situation with sensing matrix perturbation is more complex than the one with only measurement noise. For the CCRB, a closed-form expression is derived; it reveals a gap between the maximal and non-maximal support cases. It is also shown that a gap lies between the CCRB and the MSE of the oracle pseudo-inverse estimator, but that this gap approaches zero asymptotically as the problem dimensions tend to infinity. For the tighter HCRB, despite the difficulty of obtaining a simple expression for a general sensing matrix, a closed-form expression is derived in the unit sensing matrix case for a qualitative study of the performance bound. It is shown that the gap between the maximal and non-maximal cases is eliminated for the HCRB. Numerical simulations verify the theoretical results of the paper.
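For intuition, the oracle pseudo-inverse estimator that the abstract benchmarks against can be sketched numerically: least squares on the true support, computed with the nominal sensing matrix while the observation is generated from the perturbed one. The dimensions, noise levels, and Gaussian matrix model below are assumptions for illustration, not the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 64, 128, 5                 # measurements, dimension, sparsity (assumed)
sigma_e, sigma_mat = 0.05, 0.05      # measurement / matrix noise levels (assumed)

A = rng.standard_normal((m, n)) / np.sqrt(m)   # nominal sensing matrix
support = rng.choice(n, size=k, replace=False)
x = np.zeros(n)
x[support] = rng.standard_normal(k)

E = sigma_mat * rng.standard_normal((m, n))    # sensing-matrix perturbation
e = sigma_e * rng.standard_normal(m)           # additive measurement noise
y = (A + E) @ x + e                            # perturbed observation

# Oracle pseudo-inverse estimator: least squares restricted to the true
# support, using the *nominal* A (the perturbation E is unknown in practice).
x_hat = np.zeros(n)
x_hat[support], *_ = np.linalg.lstsq(A[:, support], y, rcond=None)

mse = np.mean((x_hat - x) ** 2)
```

Even with the true support known, the residual MSE here stays nonzero because of the unmodeled perturbation E, which is the gap the paper's bounds quantify.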
OracleOrder Recovery Performance of Greedy Pursuits With Replacement Against General Perturbations
IEEE Transactions on Signal Processing, 2013
Abstract

Cited by 1 (1 self)
Applying the theory of compressive sensing in practice requires taking various kinds of perturbations into consideration. In this paper, the recovery performance of greedy pursuits with replacement for sparse recovery is analyzed when both the measurement vector and the sensing matrix are contaminated by additive perturbations. Specifically, greedy pursuits with replacement include three algorithms, compressive sampling matching pursuit (CoSaMP), subspace pursuit (SP), and iterative hard thresholding (IHT), in which the support estimate is evaluated and updated in each iteration. Based on the restricted isometry property, a unified form of the error bounds of these recovery algorithms is derived under general perturbations for compressible signals. The results reveal that the recovery performance is stable against both perturbations. In addition, these bounds are compared with that of oracle recovery, i.e., the least-squares solution with the locations of the largest-magnitude entries known a priori. The comparison shows that the error bounds of these algorithms differ only in their coefficients from the lower bound of oracle recovery for certain signals and perturbations, which reveals that oracle-order recovery performance of greedy pursuits with replacement is guaranteed. Numerical simulations verify these conclusions.
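As one concrete member of the greedy-pursuits-with-replacement family, a minimal IHT loop under both measurement and matrix perturbations might look as follows; the problem sizes, noise levels, and step-size choice are assumptions for illustration, not the paper's analysis:

```python
import numpy as np

def iht(A, y, k, iters=200):
    """Iterative hard thresholding: gradient step, then keep the k largest entries."""
    mu = 1.0 / np.linalg.norm(A, 2) ** 2       # conservative step size
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = x + mu * A.T @ (y - A @ x)         # gradient step on 0.5*||y - Ax||^2
        keep = np.argsort(np.abs(x))[-k:]      # indices of k largest magnitudes
        mask = np.zeros_like(x, dtype=bool)
        mask[keep] = True
        x[~mask] = 0.0                         # hard-threshold to a k-sparse vector
    return x

rng = np.random.default_rng(1)
m, n, k = 64, 128, 5                           # assumed problem sizes
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

# Both the sensing matrix and the measurement are perturbed, as in the abstract.
y = (A + 0.02 * rng.standard_normal((m, n))) @ x_true + 0.02 * rng.standard_normal(m)
x_hat = iht(A, y, k)
```

CoSaMP and SP replace the single thresholding step with a candidate-merge and least-squares refit, but share the same support-replacement structure.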
Noise Folding in Completely Perturbed Compressed Sensing
Abstract
This paper first presents a new, generally perturbed compressed sensing (CS) model y = (A + E)(x + w) + e, which incorporates a general nonzero perturbation E into the sensing matrix and a noise w into the signal simultaneously, based on the standard CS model y = Ax + e; this is called noise folding in the completely perturbed CS model. Our construction mainly whitens the newly proposed CS model and explores the restricted isometry property (RIP) and coherence of the new model under certain conditions. Finally, we use OMP in a numerical simulation, which shows that our model is feasible, although the recovered signal is not exact compared with the original signal because of the measurement noise e, the signal noise w, and the perturbation E involved.
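The "folding" of all three disturbances into a single effective noise term on the nominal model can be checked numerically; the symbols A, E, w, e and the noise magnitudes below are illustrative stand-ins, not the paper's notation:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n, k = 64, 128, 5                           # assumed problem sizes
A = rng.standard_normal((m, n)) / np.sqrt(m)   # nominal sensing matrix
E = 0.05 * rng.standard_normal((m, n))         # sensing-matrix perturbation
w = 0.05 * rng.standard_normal(n)              # noise added directly to the signal
e = 0.05 * rng.standard_normal(m)              # measurement noise

x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

y = (A + E) @ (x + w) + e                      # completely perturbed model

# Rewriting against the nominal model y = A x + v "folds" all three
# disturbances into one effective noise term v:
v = E @ x + (A + E) @ w + e
```

The identity y = A x + v holds exactly, since (A + E)(x + w) = Ax + Ex + (A + E)w; the effective noise v now depends on the unknown signal x itself, which is what makes the folded noise harder to handle than plain measurement noise.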
Performance Estimation of Compressed Sensing with Oracle Information
, 2015
Abstract
This article discusses the performance of the oracle receiver in terms of the normalized mean square error in the sparse signal recovery process. The measurement is conducted in a completely perturbed scenario where the system is disturbed simultaneously by a perturbation matrix applied to the sensing matrix, a noise vector added to the measurement, and an input noise added directly to the signal. To obtain concrete results, the entries of the sensing matrix are specified as Bernoulli random variables. The article introduces and proves lower and upper bounds on the mean square error of the reconstructed signal. These theoretical bounds hold with high probability for high-dimensional signals. Numerical results verify the conclusions, showing that both the lower and upper bounds are tight and meaningful estimates of the mean square error in the Bernoulli case. The result is then compared with previous work in the literature on Gaussian sensing matrices. An estimate of the average recovery error is derived as a generalization of the conclusion in [13] for the Gaussian case.
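A rough empirical estimate of the oracle receiver's NMSE under a Bernoulli sensing model can be obtained by Monte Carlo; all parameter values here are assumptions for illustration, not the article's experimental setup:

```python
import numpy as np

rng = np.random.default_rng(3)
m, n, k, trials = 64, 256, 5, 50               # assumed sizes and trial count
nmse = []
for _ in range(trials):
    # Bernoulli (+/-1) sensing matrix, scaled so columns have unit norm.
    A = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)
    S = rng.choice(n, k, replace=False)
    x = np.zeros(n)
    x[S] = rng.standard_normal(k)

    E = 0.02 * rng.standard_normal((m, n))     # perturbation on the sensing matrix
    w = 0.02 * rng.standard_normal(n)          # input noise on the signal
    e = 0.02 * rng.standard_normal(m)          # measurement noise
    y = (A + E) @ (x + w) + e                  # completely perturbed measurement

    # Oracle receiver: least squares on the known support S with the nominal A.
    x_hat = np.zeros(n)
    x_hat[S], *_ = np.linalg.lstsq(A[:, S], y, rcond=None)
    nmse.append(np.sum((x_hat - x) ** 2) / np.sum(x ** 2))

avg_nmse = np.mean(nmse)
```

The averaged NMSE should fall between the article's lower and upper bounds with high probability; here it simply serves as the empirical quantity those bounds are meant to bracket.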