Results 1–3 of 3
Robust Solutions to Least-Squares Problems with Uncertain Data
, 1997
Abstract

Cited by 144 (12 self)
We consider least-squares problems where the coefficient matrices A, b are unknown but bounded. We minimize the worst-case residual error using (convex) second-order cone programming, yielding an algorithm with complexity similar to one singular value decomposition of A. The method can be interpreted as a Tikhonov regularization procedure, with the advantage that it provides an exact bound on the robustness of the solution, and a rigorous way to compute the regularization parameter. When the perturbation has a known (e.g., Toeplitz) structure, the same problem can be solved in polynomial time using semidefinite programming (SDP). We also consider the case when A, b are rational functions of an unknown-but-bounded perturbation vector. We show how to minimize (via SDP) upper bounds on the optimal worst-case residual. We provide numerical examples, including one from robust identification and one from robust interpolation. Key Words. Least-squares, uncertainty, robustness, second-order cone...
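The abstract's connection between the worst-case residual and Tikhonov regularization can be illustrated numerically. The sketch below is not the paper's SVD-based algorithm: it uses the closed form of the worst-case residual for norm-bounded perturbations, sup ||(A+ΔA)x − (b+Δb)|| = ||Ax − b|| + ρ_A||x|| + ρ_b, and picks the regularization parameter by a simple grid sweep. The matrices, bounds, and grid are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
rho_A, rho_b = 0.1, 0.05  # assumed perturbation bounds (illustrative)

def worst_case_residual(x):
    # For ||dA|| <= rho_A, ||db|| <= rho_b, the worst case is attained
    # in closed form: sup ||(A+dA)x - (b+db)|| = ||Ax-b|| + rho_A*||x|| + rho_b.
    return np.linalg.norm(A @ x - b) + rho_A * np.linalg.norm(x) + rho_b

def tikhonov(mu):
    # Tikhonov-regularized solution x(mu) = (A'A + mu I)^{-1} A'b.
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + mu * np.eye(n), A.T @ b)

# The robust solution coincides with a Tikhonov solution for some mu >= 0;
# here we sweep mu on a grid instead of computing it exactly via the SVD.
x_ls = tikhonov(0.0)
best = min((tikhonov(mu) for mu in np.logspace(-6, 2, 200)),
           key=worst_case_residual)
x_rob = best if worst_case_residual(best) < worst_case_residual(x_ls) else x_ls

print(worst_case_residual(x_rob) <= worst_case_residual(x_ls))  # True
print(np.linalg.norm(x_rob) <= np.linalg.norm(x_ls))            # True
```

The regularization effect is visible in the second comparison: the robust solution never has larger norm than the plain least-squares solution.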
... Identification and Model Quality Evaluation
 IEEE Transactions on Automatic Control
, 1997
Abstract

Cited by 6 (1 self)
Set membership H∞ identification is investigated using time-domain data and mixed parametric and nonparametric models, assuming power-bounded measurement errors. The problem of optimally estimating the unknown parameters and evaluating the minimal worst-case identification error, called the radius of information, is solved. For classes of models affine in the parameters, the radius of information is obtained as a function of the H∞ norm of the unmodeled dynamics. A method is given for estimating this norm from the available data and some general a priori information on the unmodeled dynamics, thus allowing the actual evaluation of the radius of information. The radius represents a measure of the "predictive ability" of the considered class of models, and it is then used for comparing the quality of different classes of models and for the order selection of their parametric part. The effectiveness of the proposed procedure is tested on some numerical examples and compared with stan...
On the Worst-Case Divergence of the Least-Squares Algorithm
, 2001
Abstract
In this paper, we provide an H∞-norm lower bound on the worst-case identification error of least-squares estimation when using FIR model structures. This bound increases as a logarithmic function of model complexity and is valid for a wide class of inputs characterized as being quasi-stationary with covariance function falling off sufficiently quickly.
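For context on the estimator this abstract analyzes, the following is a minimal, hypothetical sketch of least-squares FIR identification from input-output data (the estimator, not the paper's H∞ divergence bound). The "true" system, data length, and noise-free setup are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
g_true = np.array([1.0, 0.5, 0.25, 0.125])  # assumed "true" FIR impulse response
N, n = 200, len(g_true)
u = rng.standard_normal(N)                  # white input (quasi-stationary)
y = np.convolve(u, g_true)[:N]              # noise-free output y[t] = sum_k g[k] u[t-k]

# Regressor (Toeplitz) matrix: row t holds u[t], u[t-1], ..., u[t-n+1],
# with zeros before the start of the record.
Phi = np.column_stack(
    [np.concatenate([np.zeros(k), u[:N - k]]) for k in range(n)]
)

# Least-squares FIR estimate; with noise-free data it recovers g_true exactly.
g_hat = np.linalg.lstsq(Phi, y, rcond=None)[0]

print(np.allclose(g_hat, g_true, atol=1e-8))  # True
```

The paper's point is about what happens in the worst case as the model order n grows; this sketch only shows the benign noise-free case where the estimate is exact.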