Results 1–10 of 149
ROBUST REGRESSION COMPUTATION USING ITERATIVELY REWEIGHTED LEAST SQUARES
"... Abstract. Several variants of Newton's method are used to obtain estimates of solution vectors and residual vectors for the linear model Ax = b, b = b_true + e, using an iteratively reweighted least squares criterion, which tends to diminish the influence of outliers compared with the standard least squares ..."
for sparse well-conditioned and ill-conditioned problems. Key words: iteratively reweighted least squares, robust regression. AMS(MOS) subject classifications: 62J05, 65F20. 1. Introduction. Consider
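The entry above describes the standard IRLS scheme for robust regression. A minimal sketch is below; it assumes Huber-type weights with threshold `delta`, which is one common choice and not necessarily the exact variant analyzed in the paper:

```python
import numpy as np

def irls(A, b, delta=1.0, iters=50):
    """Robust regression via iteratively reweighted least squares.

    Repeatedly solves a weighted least squares problem, where the
    weights downweight observations with large residuals (Huber-type:
    weight 1 for |r| <= delta, delta/|r| beyond that).
    """
    x = np.linalg.lstsq(A, b, rcond=None)[0]  # ordinary LS start
    for _ in range(iters):
        r = b - A @ x
        ar = np.maximum(np.abs(r), 1e-12)     # guard against r == 0
        w = np.minimum(1.0, delta / ar)       # Huber weights
        sw = np.sqrt(w)
        # weighted LS: scale rows of A and b by sqrt(w)
        x = np.linalg.lstsq(sw[:, None] * A, sw * b, rcond=None)[0]
    return x
```

On clean data with a single gross outlier, the outlier's weight shrinks toward `delta/|r|`, so its influence on the fit is bounded, unlike ordinary least squares.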
LOW-RANK MATRIX RECOVERY VIA ITERATIVELY REWEIGHTED LEAST SQUARES MINIMIZATION
"... Abstract. We present and analyze an efficient implementation of an iteratively reweighted least squares algorithm for recovering a matrix from a small number of linear measurements. The algorithm is designed for the simultaneous promotion of both a minimal nuclear norm and an approximatively low-ran ..."
Cited by 18 (4 self)
cases, for instance for the matrix completion problem, our version of this algorithm can take advantage of the Woodbury matrix identity, which allows one to expedite the solution of the least squares problems required at each iteration. We present numerical experiments which confirm the robustness
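The Woodbury matrix identity mentioned above lets one invert a low-rank update A + UV of an easily invertible matrix A using only a small k × k inverse. A quick numerical check of the identity (purely illustrative; the matrices here are arbitrary, not the paper's implementation):

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 6, 2                      # k << n: a rank-k update
A = 10.0 * np.eye(n)             # easy-to-invert base matrix
U = rng.standard_normal((n, k))
V = rng.standard_normal((k, n))

# Woodbury: (A + U V)^{-1} = A^{-1} - A^{-1} U (I_k + V A^{-1} U)^{-1} V A^{-1}
Ainv = np.linalg.inv(A)
small = np.linalg.inv(np.eye(k) + V @ Ainv @ U)   # only a k x k inverse
woodbury = Ainv - Ainv @ U @ small @ V @ Ainv

direct = np.linalg.inv(A + U @ V)                 # agrees to machine precision
```

In the IRLS setting the payoff is that each reweighted solve reuses a cheap factorization of the base matrix instead of refactoring the full n × n system.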
2D ITERATIVELY REWEIGHTED LEAST SQUARES LATTICE ALGORITHM AND ITS APPLICATION TO DEFECT DETECTION IN TEXTURED IMAGES
"... Abstract. In this paper, a 2D iteratively reweighted least squares lattice algorithm, which is robust to outliers, is introduced and applied to the defect detection problem in textured images. First, the philosophy of using different optimization functions that result in weighted least squares ..."
Stochastic Gradient Boosting
Computational Statistics and Data Analysis, 1999
"... Gradient boosting constructs additive regression models by sequentially fitting a simple parameterized function (base learner) to current "pseudo"-residuals by least squares at each iteration. The pseudo-residuals are the gradient of the loss functional being minimized, with respect to ..."
Cited by 285 (1 self)
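The abstract above describes fitting a base learner to pseudo-residuals, i.e. the negative gradient of the loss. A minimal sketch for squared-error loss, where the pseudo-residuals reduce to the ordinary residuals; the decision-stump base learner and the specific parameter values are illustrative choices, not the paper's:

```python
import numpy as np

def fit_stump(x, r):
    """Least-squares decision stump on 1-D inputs: best threshold split."""
    best = None
    for t in np.unique(x):
        left, right = r[x <= t], r[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        pred = np.where(x <= t, left.mean(), right.mean())
        sse = ((r - pred) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, t, left.mean(), right.mean())
    _, t, lo, hi = best
    return lambda z: np.where(z <= t, lo, hi)

def gradient_boost(x, y, n_rounds=100, lr=0.1):
    """Gradient boosting with squared-error loss: at each round the
    pseudo-residuals r = y - f are the negative gradient of (y - f)^2 / 2."""
    base = y.mean()
    f = np.full_like(y, base)
    learners = []
    for _ in range(n_rounds):
        r = y - f                 # pseudo-residuals for squared error
        h = fit_stump(x, r)       # base learner fit by least squares
        f = f + lr * h(x)         # shrunken additive update
        learners.append(h)
    return lambda z: base + lr * sum(h(z) for h in learners)
```

For other losses only the pseudo-residual line changes (e.g. sign(y - f) for absolute error); the least-squares fit of the base learner stays the same.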
Preconditioning for Iterative Methods in Robust Linear Regression
2000
"... In this paper, we consider solving the robust linear regression problem with an inexact Newton method combined with a preconditioned conjugate gradient least squares algorithm. The efficiency of this approach for solving large and sparse problems depends on the preconditioner. Preconditioners based ..."
on low-rank updates or downdates of an existing matrix factorization are presented. Numerical results are given to demonstrate the effectiveness of these preconditioners. Key words: Robust regression, Iteratively reweighted least squares, Newton's method, Conjugate gradient least squares method
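Several entries in this listing build on the conjugate gradient least squares (CGLS) method. A textbook sketch without preconditioning follows; the preconditioner, which is the actual contribution of these papers, is omitted here:

```python
import numpy as np

def cgls(A, b, iters=100, tol=1e-12):
    """Conjugate gradient least squares: solves min ||Ax - b||_2 by
    applying CG to the normal equations A^T A x = A^T b without ever
    forming A^T A explicitly (suitable for large sparse A)."""
    x = np.zeros(A.shape[1])
    r = b - A @ x            # residual in data space
    s = A.T @ r              # residual of the normal equations
    p = s.copy()
    gamma = s @ s
    for _ in range(iters):
        q = A @ p
        alpha = gamma / (q @ q)
        x += alpha * p
        r -= alpha * q
        s = A.T @ r
        gamma_new = s @ s
        if gamma_new < tol:  # ||A^T r|| small enough: converged
            break
        p = s + (gamma_new / gamma) * p
        gamma = gamma_new
    return x
```

Inside an IRLS or inexact-Newton loop, `A` would be the row-weighted design matrix of the current iteration; a good preconditioner keeps the inner CGLS iteration count low as the weights change.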
Smoothed Low Rank and Sparse Matrix Recovery by Iteratively Reweighted Least Squares Minimization
"... This work presents a general framework for solving low rank and/or sparse matrix minimization problems, which may involve multiple nonsmooth terms. The Iteratively Reweighted Least Squares (IRLS) method is a fast solver, which smooths the objective function and minimizes it by alternately updat ..."
Cited by 1 (0 self)
On the Properties of Preconditioners for Robust Linear Regression
2000
"... In this paper, we consider solving the robust linear regression problem y = Ax + ε by an inexact Newton method and an iteratively reweighted least squares method. We show that each of these methods can be combined with the preconditioned conjugate gradient least squares algorithm to solve large, spar ..."
Cited by 4 (3 self)
Real-time visual tracking of complex structures
IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2002
"... This paper presents a novel framework for three-dimensional model-based tracking. Graphical rendering technology is combined with constrained active contour tracking to create a robust wireframe tracking system. It operates in real time at video frame rate (25 Hz) on standard hardware. It is based ..."
Cited by 226 (7 self)
the motion computation problem into simple geometric terms so that tracking becomes a simple optimization problem solved by means of iterative reweighted least squares. A visual servoing system constructed using this framework is presented together with results showing the accuracy of the tracker. The system
A New Function for Robust Linear Regression: An Iterative Approach
16th IMACS WORLD CONGRESS 2000 on Scientific Computation, Applied Mathematics and Simulation, 2000
"... In this paper, we consider solving the robust linear regression problem. We show that the IRLS and Newton methods can each be combined with the preconditioned conjugate gradient least squares method to solve large, sparse, rectangular systems of linear algebraic equations efficiently. We define a new functi ..."
Cited by 3 (2 self)
the effectiveness of preconditioners based on this function. Key words: Robust regression, Iteratively reweighted least squares, Newton's method, New weighting function, Conjugate gradient least squares method, Preconditioner. AMS subject classifications: 62J05, 65D10, 65F10, 65F20. 1. Introduction. Consider
Convex Total Least Squares
"... Abstract. We study the total least squares (TLS) problem that generalizes least squares regression by allowing measurement errors in both dependent and independent variables. TLS is widely used in applied fields including computer vision, system identification, and econometrics. The special case when ..."
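For context on the TLS problem the last entry generalizes: the classical (non-convex-relaxation) solution comes from the SVD of the augmented matrix [A | b], the standard Golub–Van Loan construction rather than this paper's convex formulation:

```python
import numpy as np

def tls(A, b):
    """Classical total least squares: permits errors in both A and b.

    Take the right singular vector of [A | b] associated with the
    smallest singular value; when its last component is nonzero,
    the TLS solution is x = -v[:n] / v[n].
    """
    n = A.shape[1]
    C = np.column_stack([A, b])
    _, _, Vt = np.linalg.svd(C)
    v = Vt[-1]                    # singular vector for smallest sigma
    return -v[:n] / v[n]
```

On a consistent system (b exactly in the range of A) the smallest singular value is zero and TLS reproduces the ordinary solution; with noise in both A and b it minimizes the Frobenius norm of the joint correction.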