Results 1–9 of 9
Robust Solutions to Least-Squares Problems with Uncertain Data
, 1997
Cited by 149 (13 self)

Abstract:
We consider least-squares problems where the coefficient matrices A, b are unknown but bounded. We minimize the worst-case residual error using (convex) second-order cone programming, yielding an algorithm with complexity similar to one singular value decomposition of A. The method can be interpreted as a Tikhonov regularization procedure, with the advantage that it provides an exact bound on the robustness of the solution, and a rigorous way to compute the regularization parameter. When the perturbation has a known (e.g., Toeplitz) structure, the same problem can be solved in polynomial time using semidefinite programming (SDP). We also consider the case when A, b are rational functions of an unknown-but-bounded perturbation vector. We show how to minimize (via SDP) upper bounds on the optimal worst-case residual. We provide numerical examples, including one from robust identification and one from robust interpolation. Key words: least-squares, uncertainty, robustness, second-order cone...
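The unstructured case described in this abstract has a simple closed form: over all perturbations of A with spectral norm at most ρ, the worst-case residual is ‖Ax − b‖ + ρ‖x‖, which makes the Tikhonov connection concrete. A minimal NumPy/SciPy sketch (the data, sizes, and the bound ρ are illustrative assumptions, not from the paper):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
rho = 0.5  # assumed bound on the spectral norm of the perturbation of A

def worst_case_residual(x):
    # For unstructured perturbations with ||dA|| <= rho, the worst-case
    # residual has the closed form ||Ax - b|| + rho * ||x||.
    return np.linalg.norm(A @ x - b) + rho * np.linalg.norm(x)

x_ls = np.linalg.lstsq(A, b, rcond=None)[0]    # ordinary least-squares solution
x_rob = minimize(worst_case_residual, x_ls).x  # robust (regularized) solution

# The robust solution is shrunk toward zero, like a Tikhonov-regularized one,
# and its worst-case residual is no larger than that of the LS solution.
assert np.linalg.norm(x_rob) <= np.linalg.norm(x_ls)
assert worst_case_residual(x_rob) <= worst_case_residual(x_ls) + 1e-8
```

A general-purpose optimizer stands in here for the paper's second-order cone program, which would compute the same minimizer with guaranteed complexity.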
Regularization by truncated total least squares
 SIAM J. Sci. Comput.
, 1997
Cited by 39 (4 self)
Abstract. The total least squares (TLS) method is a successful method for noise reduction in linear least squares problems in a number of applications. The TLS method is suited to problems in which both the coefficient matrix and the right-hand side are not precisely known. This paper focuses on the use of TLS for solving problems with very ill-conditioned coefficient matrices whose singular values decay gradually (so-called discrete ill-posed problems), where some regularization is necessary to stabilize the computed solution. We filter the solution by truncating the small singular values of the TLS matrix. We express our results in terms of the singular value decomposition (SVD) of the coefficient matrix rather than the augmented matrix. This leads to insight into the filtering properties of the truncated TLS method as compared to regularized least squares solutions. In addition, we propose and test an iterative algorithm based on Lanczos bidiagonalization for computing truncated TLS solutions.
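As a concrete illustration of the truncation idea, here is a small NumPy sketch of the truncated TLS solution computed from the SVD of the augmented matrix [A, b]; the test problem and truncation level are illustrative assumptions, and with k = n (no truncation) on a consistent, noise-free system the formula reduces to ordinary TLS and recovers the exact solution:

```python
import numpy as np

def truncated_tls(A, b, k):
    """Truncated TLS via the SVD of the augmented matrix [A, b] (a sketch)."""
    n = A.shape[1]
    V = np.linalg.svd(np.column_stack([A, b]))[2].T
    V12 = V[:n, k:]   # top block of the trailing right singular vectors
    V22 = V[n:, k:]   # bottom row; assumed nonzero (generic, solvable case)
    # Minimum-norm truncated TLS solution: x_k = -V12 * pinv(V22)
    return (-V12 @ V22.T / (V22 @ V22.T)).ravel()

t = np.linspace(0.0, 1.0, 30)
A = np.vander(t, 4)
x_true = np.array([1.0, -2.0, 0.5, 3.0])
b = A @ x_true                  # consistent, noise-free system
x = truncated_tls(A, b, k=4)    # k = n: ordinary TLS, recovers x_true
assert np.allclose(x, x_true, atol=1e-6)
```

For noisy, ill-posed problems one would choose k below n so that the small singular values carrying mostly noise are filtered out, which is the regularization effect the paper analyzes.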
Determining Rank in the Presence of Error
 IN
, 1993
Cited by 11 (0 self)

Abstract:
The problem of determining rank in the presence of error occurs in a number of applications. The usual approach is to compute a rank-revealing decomposition and make a decision about the rank by examining the small elements of the decomposition. In this paper we look at three commonly used decompositions: the singular value decomposition, the pivoted QR decomposition, and the URV decomposition.
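For the SVD case, the approach amounts to counting singular values that stand above a tolerance tied to the error level; a minimal sketch (the matrix sizes, error level, and tolerance are illustrative assumptions):

```python
import numpy as np

def numerical_rank(A, tol):
    """Estimate rank by counting singular values above an error-level tolerance."""
    s = np.linalg.svd(A, compute_uv=False)
    return int(np.sum(s > tol))

rng = np.random.default_rng(1)
B = rng.standard_normal((50, 3)) @ rng.standard_normal((3, 40))  # exact rank 3
E = 1e-8 * rng.standard_normal((50, 40))                         # small error
assert numerical_rank(B + E, tol=1e-6) == 3
```

The delicate part, which the paper examines, is that the decision can become ambiguous when no clear gap separates the "signal" singular values from those produced by the error.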
Core problems in linear algebraic systems
 SIAM J. Matrix Anal. Appl.
, 2006
Cited by 10 (1 self)

Abstract:
For any linear system Ax ≈ b we define a set of core problems and show that the orthogonal upper bidiagonalization of [b, A] gives such a core problem. In particular we show that these core problems have desirable properties such as minimal dimensions. When a total least squares problem is solved by first finding a core problem, we show the resulting theory is consistent with earlier generalizations, but much simpler and clearer. The approach is important for other related solutions and leads, for example, to an elegant solution to the data least squares problem. The ideas could be useful for solving ill-posed problems.
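The orthogonal upper bidiagonalization of [b, A] referred to above is commonly computed by the Golub–Kahan recurrence; a minimal sketch of its first k steps (the test data and k are illustrative, and the sketch omits the reorthogonalization a robust implementation would need):

```python
import numpy as np

def golub_kahan(A, b, k):
    """First k steps of Golub-Kahan upper bidiagonalization started from b."""
    m, n = A.shape
    U = np.zeros((m, k + 1)); V = np.zeros((n, k))
    alpha = np.zeros(k); beta = np.zeros(k + 1)
    beta[0] = np.linalg.norm(b)
    U[:, 0] = b / beta[0]                      # first left vector is b, normalized
    for i in range(k):
        w = A.T @ U[:, i] - (beta[i] * V[:, i - 1] if i > 0 else 0.0)
        alpha[i] = np.linalg.norm(w); V[:, i] = w / alpha[i]
        w = A @ V[:, i] - alpha[i] * U[:, i]
        beta[i + 1] = np.linalg.norm(w); U[:, i + 1] = w / beta[i + 1]
    return U, V, alpha, beta

rng = np.random.default_rng(3)
A = rng.standard_normal((12, 7)); b = rng.standard_normal(12)
U, V, alpha, beta = golub_kahan(A, b, k=4)
# U and V have orthonormal columns, and b = beta[0] * U[:, 0].
assert np.allclose(U.T @ U, np.eye(5), atol=1e-8)
assert np.allclose(V.T @ V, np.eye(4), atol=1e-8)
```

Run to completion (or to breakdown, where an alpha or beta vanishes), the recurrence exposes the minimal-dimension core subproblem the paper describes.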
Robust Regression and Lasso
Cited by 8 (4 self)

Abstract:
We consider robust least-squares regression with feature-wise disturbance. We show that this formulation leads to tractable convex optimization problems, and we exhibit a particular uncertainty set for which the robust problem is equivalent to ℓ1-regularized regression (Lasso). This provides an interpretation of Lasso from a robust optimization perspective. We generalize this robust formulation to consider more general uncertainty sets, which all lead to tractable convex optimization problems. We therefore provide a new methodology for designing regression algorithms that generalize known formulations. The advantage is that robustness to disturbance is a physical property that can be exploited: in addition to obtaining new formulations, we use it directly to show sparsity properties of Lasso, as well as to prove a general consistency result for robust regression problems, including Lasso, from a unified robustness perspective.
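The equivalence mentioned above can be checked numerically at a fixed x: under perturbations of each column of A bounded by c in Euclidean norm, the worst-case residual equals ‖Ax − b‖₂ + c‖x‖₁, attained by aligning every perturbed column with the residual. A small sketch (the data and the bound c are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((15, 4))
b = rng.standard_normal(15)
x = rng.standard_normal(4)
c = 0.3  # assumed bound on the Euclidean norm of each column perturbation

# Closed form for the worst-case residual under column-wise uncertainty.
closed_form = np.linalg.norm(A @ x - b) + c * np.linalg.norm(x, 1)

# The adversarial perturbation aligns each column with the residual direction.
r = A @ x - b
u = r / np.linalg.norm(r)
D = np.outer(u, c * np.sign(x))
assert abs(np.linalg.norm((A + D) @ x - b) - closed_form) < 1e-10

# No admissible perturbation does better than the closed form.
for _ in range(100):
    E = rng.standard_normal((15, 4))
    E *= c / np.linalg.norm(E, axis=0)   # scale every column to norm c
    assert np.linalg.norm((A + E) @ x - b) <= closed_form + 1e-10
```

Minimizing the closed form over x is exactly the Lasso objective (with the unsquared residual norm), which is the equivalence the paper establishes.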
Models For Robust Estimation And Identification
, 2003
Cited by 1 (1 self)

Abstract:
In this paper, estimation and identification theories will be examined with the goal of determining some new methods of adding robustness. The focus will be upon uncertain estimation problems, namely ones in which the uncertainty multiplies the quantities to be estimated. Mathematically, the problem can be stated as: for system matrices and data matrices that lie in the sets (A + ΔA) and (b + Δb) respectively, find the value of x that minimizes the cost ‖(A + ΔA)x − (b + Δb)‖. The proposed techniques are compared with currently used methods such as Least Squares (LS), Total Least Squares (TLS), and Tikhonov Regularization (TR). Several results are presented and some future directions are suggested.
Worst-Case Maximum Likelihood Estimation in the Linear Model
, 2000
Abstract:
This paper addresses the problem of maximum likelihood parameter estimation in linear models affected by structured deterministic uncertainty in the regression matrix and random Gaussian noise. The proposed estimator maximizes the worst-case (with respect to the deterministic uncertainty) likelihood of the measured sample. The estimate is computed by solving a semidefinite optimization problem (SDP). In the particular case of unstructured uncertainty, this SDP simply requires minimization of a scalar convex function of one variable.

1 Introduction

In this paper we address the problem of estimating an unknown parameter x ∈ ℝⁿ from measurement data y ∈ ℝᵐ, given a priori stochastic information on x and a mixed deterministic-stochastic uncertain linear relation between x and y,

y = C(Δ)x + d,

where d ∈ ℝᵐ is a Gaussian random vector. The literature on variations of this problem is of course vast, going from the early days of Gauss to very recent studies that address the ...