## Robust Regression with Projection Based M-estimators (2003)

### Download Links

- [www.caip.rutgers.edu]
- [lear.inrialpes.fr]
- DBLP

### Other Repositories/Bibliography

Venue: International Conference on Computer Vision

Citations: 33 (7 self)

### BibTeX

```bibtex
@INPROCEEDINGS{Chen03robustregression,
  author    = {Haifeng Chen and Peter Meer},
  title     = {Robust Regression with Projection Based M-estimators},
  booktitle = {International Conference on Computer Vision},
  year      = {2003},
  pages     = {878--885}
}
```

### Abstract

The robust regression techniques in the RANSAC family are popular today in computer vision, but their performance depends on a user supplied threshold. We eliminate this drawback of RANSAC by reformulating another robust method, the M-estimator, as a projection pursuit optimization problem. The projection based pbM-estimator automatically derives the threshold from univariate kernel density estimates. Nevertheless, the performance of the pbM-estimator equals or exceeds that of RANSAC techniques tuned to the optimal threshold, a value which is never available in practice. Experiments were performed both with synthetic and real data in the affine motion and fundamental matrix estimation tasks.
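The mechanism the abstract describes can be sketched in a few lines: project the data onto a candidate unit normal, estimate the one-dimensional density of the projections with a kernel, and score the direction by the height of the dominant mode, whose location supplies the intercept. This is an illustrative reconstruction, not the paper's exact algorithm; the rule-of-thumb bandwidth and all names here (`projection_index`, `theta_true`, ...) are assumptions.

```python
import numpy as np

def projection_index(theta, X, grid_size=256):
    """Score a unit direction theta by the mode height of the KDE of X @ theta."""
    z = X @ theta
    h = 1.06 * np.std(z) * len(z) ** (-0.2)       # rule-of-thumb bandwidth (an assumption)
    grid = np.linspace(z.min(), z.max(), grid_size)
    u = (grid[:, None] - z[None, :]) / h
    density = np.exp(-0.5 * u**2).sum(axis=1) / (len(z) * h * np.sqrt(2 * np.pi))
    return density.max(), grid[density.argmax()]  # (mode height, intercept estimate)

# Toy data: inliers near the line x2 = 0.5*x1 + 1, plus gross outliers.
rng = np.random.default_rng(0)
t = rng.uniform(-5, 5, 80)
inliers = np.column_stack([t, 0.5 * t + 1 + rng.normal(0, 0.1, 80)])
X = np.vstack([inliers, rng.uniform(-5, 5, (40, 2))])

theta_true = np.array([-0.5, 1.0]) / np.hypot(-0.5, 1.0)  # normal of the true line
theta_bad = np.array([1.0, 0.0])
# The true normal concentrates the inlier projections, so its mode is higher;
# no user-supplied inlier threshold is involved in the score.
```

Because the score is a density mode rather than an inlier count inside a fixed band, the scale of the inlier noise never has to be supplied by the user.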

### Citations

2696 | Fischler, Bolles (1981): Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography
Context: ...re of the noise corrupting the inliers (such as standard deviation or range), is the most frequently used additional assumption. The robust regression technique most popular in computer vision RANSAC [4] and its improved versions MSAC and MLESAC [15], [16], impose an upper bound on the scale, and the parameter estimates are found by maximizing the number of points (inliers) which can be placed within...
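The user-supplied scale bound this context describes, and which the pbM-estimator removes, is easy to see in a minimal RANSAC line fit. This is an illustrative sketch under assumed names; `t` is exactly the threshold in question.

```python
import numpy as np

def ransac_line(x, y, t, n_trials=500, rng=None):
    """Fit y = a*x + b by RANSAC: maximize the inlier count within the bound t."""
    if rng is None:
        rng = np.random.default_rng(0)
    best_inliers, best_model = np.zeros(len(x), bool), None
    for _ in range(n_trials):
        i, j = rng.choice(len(x), 2, replace=False)   # minimal subset: two points
        if x[i] == x[j]:
            continue
        a = (y[j] - y[i]) / (x[j] - x[i])
        b = y[i] - a * x[i]
        inliers = np.abs(y - (a * x + b)) < t         # points within the scale bound
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (a, b)
    return best_model, best_inliers

rng = np.random.default_rng(1)
x = rng.uniform(-5, 5, 100)
y = 2.0 * x - 1.0 + rng.normal(0, 0.05, 100)
y[:30] = rng.uniform(-10, 10, 30)                     # 30% gross outliers
(a, b), inl = ransac_line(x, y, t=0.2)
```

Choosing `t` too small rejects valid inliers; too large admits outliers. That sensitivity is the drawback the paper's abstract targets.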

1308 | Rousseeuw, Leroy (1987): Robust Regression and Outlier Detection
Context: ...nd the parameter estimates are found by maximizing the number of points (inliers) which can be placed within this bound. The additional assumption behind the least median of squares (LMedS) estimator [12] and similar techniques is equivalent. A lower bound is imposed on the required percentage of inliers in the data, and the parameter estimates are found by minimizing the scale of data subsets of this...

465 | Wand, Jones (1995): Kernel Smoothing
Context: ...de explicit. The most popular approach for density estimation is the kernel density estimator, also known as the Parzen window method in pattern recognition. See [3, Sec.4.3] for an introduction, and [17] for a more complete discussion. The density estimate at a given location, computed with the kernels scaled to the bandwidth, is given in (6). The cardinal observati...
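Equation (6) in the snippet is the standard univariate kernel density estimate; with a Gaussian kernel (an assumption here, the paper's kernel choice is not shown in the context) it reads f_hat(x) = (1/(n*h)) * sum_i K((x - x_i)/h):

```python
import numpy as np

def kde_gaussian(x, samples, h):
    """Kernel density estimate at x: average of Gaussian kernels of bandwidth h."""
    u = (x - samples) / h
    return np.mean(np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)) / h

samples = np.random.default_rng(0).normal(0.0, 1.0, 5000)
est = kde_gaussian(0.0, samples, h=0.3)
# est approximates the N(0,1) density at 0 (slightly smoothed by the bandwidth)
```

The bandwidth `h` controls the smoothing; the pbM-estimator's point is that it can be set from the data rather than supplied by the user.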

340 | Zhang (1998): Determining the epipolar geometry and its uncertainty: a review
Context: ...The points detected by the pbM-estimator were used to obtain an unbiased fundamental matrix estimate with a nonlinear method in the package [18] discussed in [19]. The average error computed by the program relative to the ground truth was 2.4 pixels. No ground truth is available for the castle sequence (Figure 6a). To have a reference fundamental matrix, first...

266 | Torr, Zisserman: MLESAC: a new robust estimator with application to estimating image geometry
Context: ...dard deviation or range), is the most frequently used additional assumption. The robust regression technique most popular in computer vision RANSAC [4] and its improved versions MSAC and MLESAC [15], [16], impose an upper bound on the scale, and the parameter estimates are found by maximizing the number of points (inliers) which can be placed within this bound. The additional assumption behind the lea...

230 | Torr, Murray (1997): The Development and Comparison of Robust Methods for Estimating the Fundamental Matrix
Context: ...s standard deviation or range), is the most frequently used additional assumption. The robust regression technique most popular in computer vision RANSAC [4] and its improved versions MSAC and MLESAC [15], [16], impose an upper bound on the scale, and the parameter estimates are found by maximizing the number of points (inliers) which can be placed within this bound. The additional assumption behind t...

85 | Huber (1985): Projection pursuit (with discussion)
Context: ...teresting" low-dimensional projections of multidimensional data. The informative value of a projection is measured with a projection index, such as the quantity inside the brackets in (7). The papers [5], [6] are surveys of all related topics. 2.1. Adaptive Mode Search: An adaptive sampling strategy is employed for a computationally efficient search for the mode of the density in (8). This is illustrated...

83 | Leedan, Meer (2000): Heteroscedastic regression in computer vision: Problems with bilinear constraint
Context: ...ative to (1). It can be shown that the noise process associated with the "measurements" is point dependent, i.e., heteroscedastic, and therefore the TLS estimator is no longer the optimal technique [8]. The role of the pbM-estimator is thus limited to discriminating inliers from outliers. Furthermore, the rank two constraint of the fundamental matrix is also not taken into account. First, two far apart frames from the po...

73 | Lewis, Torczon, et al. (2000): Direct search methods: then and now
Context: ...osen the simplex based direct search technique proposed in 1965 by Nelder and Mead [11, Sec.10.4]. Recently significant progress was reported in the literature for this class of search methods, e.g., [9], but based on our experience there is no need to use more sophisticated (and more computationally intensive) techniques. The vector is a unit vector. The search therefore has to be restricted to th...
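Because the search is over a unit vector, one common trick (assumed here; the context does not quote the paper's exact parameterization of the unit sphere) is to optimize spherical angles so the constraint holds by construction, after which the Nelder-Mead simplex applies directly. A toy objective stands in for the projection index:

```python
import numpy as np
from scipy.optimize import minimize

target = np.array([0.6, 0.8])   # toy "best" direction (illustrative)

def neg_score(phi):
    # Angle parameterization keeps theta on the unit circle by construction.
    theta = np.array([np.cos(phi[0]), np.sin(phi[0])])
    return -(theta @ target)    # stand-in for the (negated) projection index

res = minimize(neg_score, x0=[0.0], method="Nelder-Mead")
theta_hat = np.array([np.cos(res.x[0]), np.sin(res.x[0])])
# theta_hat converges to the direction aligned with target
```

Nelder-Mead needs no gradients, which matches the context's remark that the criterion is nonlinear and the search derivative-free.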

57 | Tordoff, Murray (2002): Guided sampling and consensus for motion estimation
Context: ...data is high dimensional. A possible solution is guided sampling, in which additional information about the probability of a data point being an inlier is integrated into the sampling process, e.g., [14]. It is very important to recognize that while probabilistic sampling is a computational tool with no relation to the optimization criterion, guided sampling is a robust procedure since it also exploi...
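Guided sampling as described here amounts to weighting the draw of each minimal subset by a per-point inlier prior instead of drawing uniformly. The `prior` values below are made up for illustration (e.g., they could come from feature-match scores):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
# Hypothetical per-point inlier probabilities: the first 70 points look reliable.
prior = np.where(np.arange(n) < 70, 0.9, 0.1)
weights = prior / prior.sum()

# Guided sampling: draw the minimal subset with probability proportional
# to the inlier prior, so likely inliers are picked far more often.
subset = rng.choice(n, size=2, replace=False, p=weights)
```

As the context notes, this is more than a speed-up: the prior is extra information exploited by the estimator, unlike plain uniform random sampling.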

45 | Jones, Sibson (1987): What is projection pursuit? (with discussion)
Context: ...sting" low-dimensional projections of multidimensional data. The informative value of a projection is measured with a projection index, such as the quantity inside the brackets in (7). The papers [5], [6] are surveys of all related topics. 2.1. Adaptive Mode Search: An adaptive sampling strategy is employed for a computationally efficient search for the mode of the density in (8). This is illustrated thro...

28 | Chen, Meer (2002): Robust computer vision through kernel density estimation
Context: ...ision applications: affine motion and fundamental matrix estimation. 2. M-estimate Computation Using Projection Pursuit: The principle behind the method discussed in this section was first proposed in [1], as part of a different approach limited to low-dimensional data. See also Section 5. Here we provide the implementation for arbitrary dimensional data, and introduce a new robust regression techniqu...

21 | Chen, Meer, et al. (2001): Robust regression for data with multiple structures
Context: ...se the algebraic and geometric distance of a point from the hyperplane have the same value (2), where the orthogonal projection of the point on the plane is used. We have shown in [2] that all the major robust regression techniques used in computer vision can be expressed as M-estimators. Taking into account (2), the M-estimator of the EIV model parameters is...

14 | Li (1996): Robustizing robust M-estimation using deterministic annealing
Context: ...ver, often there is not enough a priori knowledge to reliably define additional information. Embedding the robust estimator into a second optimization process over the range of possible bounds, e.g., [10], is not a general enough solution. Indeed, whenever the employed assumptions are not valid, the robust regression may yield erroneous results, which in turn can corrupt the comparison across differen...

6 | Kang, Cohen, et al. (2002): Robust affine motion estimation in joint image space using tensor voting
Context: ...ation between them (13) can be decoupled into two three-dimensional problems [7], each obeying the EIV model (1). Thus, the noisy measurements of corresponding points are distributed around two planes in two different 3D spaces. The parameters of the affine transformations are es...


1 | Zhang: Software package for epipolar geometry computation. Available at www-sop.inria.fr/robotvis/personnel/zzhang/software-FMatrix.html
Context: ...The points detected by the pbM-estimator were used to obtain an unbiased fundamental matrix estimate with a nonlinear method in the package [18] discussed in [19]. The average error computed by the program relative to the ground truth was 2.4 pixels. No ground truth is available for the castle sequence (Figure 6a). To have a reference fundame...

1 | Ruppert, Simpson (1990): Comment on "Unmasking Multivariate Outliers and Leverage Points"
Context: ...arameters of the model (1) are computed analytically. From it the value of the projection index is obtained. To refine the estimate, a local search around the neighborhood is then performed. In [13], a similar idea of refinement, the use of line search, was advocated to stabilize the performance of LMedS. The optimization criterion (7) is nonlinear and nondifferentiable and therefore only search...