Results 1–4 of 4
Minimum-Volume Enclosing Ellipsoids and Core Sets
Journal of Optimization Theory and Applications, 2005
Cited by 25 (4 self)
Abstract: We study the problem of computing a (1 + ε)-approximation to the minimum-volume enclosing ellipsoid of a given point set S = {p^1, p^2, ..., p^n} ⊆ R^d. Based on a simple, initial volume approximation method, we propose a modification of Khachiyan's first-order algorithm. Our analysis leads to a slightly improved complexity bound of O(nd^3/ε) operations for ε ∈ (0, 1). As a byproduct, our algorithm returns a core set X ⊆ S with the property that the minimum-volume enclosing ellipsoid of X provides a good approximation to that of S.
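Khachiyan's first-order algorithm mentioned in the abstract updates a weight vector over the points with a Frank-Wolfe-style step on the lifted (d+1)-dimensional problem. Below is a minimal NumPy sketch of the basic, unmodified iteration, not the authors' improved variant; the function and parameter names (`mvee`, `eps`, `max_iter`) are illustrative choices:

```python
import numpy as np

def mvee(points, eps=0.01, max_iter=10000):
    """Khachiyan's first-order iteration for a (1 + eps)-approximate
    minimum-volume enclosing ellipsoid {x : (x-c)^T A (x-c) <= 1}."""
    P = np.asarray(points, dtype=float)       # n points as rows, shape (n, d)
    n, d = P.shape
    Q = np.hstack([P, np.ones((n, 1))])       # lift points to R^(d+1)
    u = np.full(n, 1.0 / n)                   # weights: uniform start
    for _ in range(max_iter):
        M = Q.T @ (u[:, None] * Q)            # weighted moment matrix
        kappa = np.sum((Q @ np.linalg.inv(M)) * Q, axis=1)
        j = int(np.argmax(kappa))             # most violating point
        if kappa[j] <= (1 + eps) * (d + 1):   # approximate optimality reached
            break
        beta = (kappa[j] - d - 1) / ((d + 1) * (kappa[j] - 1))
        u *= 1 - beta                         # shift weight toward point j
        u[j] += beta
    c = u @ P                                 # ellipsoid center
    A = np.linalg.inv(P.T @ (u[:, None] * P) - np.outer(c, c)) / d
    return A, c
```

At the stopping criterion every point p satisfies (p - c)^T A (p - c) <= 1 + ε(d+1)/d, so all points lie in the ellipsoid up to a (1 + O(ε)) inflation; the points carrying nonzero weight at termination are exactly the core-set candidates the abstract describes.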
A Practical Approximation Algorithm for the LMS Line Estimator
1997
Cited by 15 (5 self)
Abstract: The problem of fitting a straight line to a finite collection of points in the plane is an important problem in statistical estimation. Robust estimators are particularly important because of their lack of sensitivity to outlying data points. The basic measure of the robustness of an estimator is its breakdown point, that is, the fraction (up to 50%) of outlying data points that can corrupt the estimator. Rousseeuw's least median-of-squares (LMS) regression (line) estimator [11] is among the best-known 50% breakdown-point estimators. The best exact algorithms known for this problem run in O(n^2) time, where n is the number of data points. Because of this high running time, many practitioners prefer to use a simple O(n log n) Monte Carlo algorithm, which is quite efficient but provides no guarantees of accuracy (even probabilistic) unless the data set satisfies certain assumptions. In this paper, we present two algorithms in an attempt to close the gap between theory and practice. ...
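The simple Monte Carlo scheme the abstract contrasts against is, in its most common form, repeated pair sampling: fit a line through two random data points and keep the line whose median squared residual is smallest. A hedged sketch (the function name and trial count are illustrative, and this naive version spends O(n) per trial rather than reproducing the exact O(n log n) algorithm):

```python
import numpy as np

def lms_line_monte_carlo(x, y, trials=200, rng=None):
    """Monte Carlo sketch of the LMS line estimator: fit a line through
    random point pairs and keep the best median-squared-residual score."""
    rng = np.random.default_rng(rng)
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    best = (np.inf, 0.0, 0.0)                # (median residual^2, slope, intercept)
    for _ in range(trials):
        i, j = rng.choice(n, size=2, replace=False)
        if x[i] == x[j]:
            continue                         # this sketch skips vertical pairs
        a = (y[j] - y[i]) / (x[j] - x[i])    # slope through the sampled pair
        b = y[i] - a * x[i]                  # intercept through the sampled pair
        med = np.median((y - (a * x + b)) ** 2)
        if med < best[0]:
            best = (med, a, b)
    return best[1], best[2]                  # (slope, intercept)
```

With more than 50% inliers, a single trial that draws two inliers already scores near zero, which is why a few hundred trials usually succeed in practice; the abstract's point is that no accuracy guarantee holds without such distributional assumptions.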
Computing the least median of squares estimator in time O(n^d)
Proceedings of ICCSA 2005, LNCS 3480, 2005
Cited by 5 (1 self)
Abstract: In modern statistics, the robust estimation of parameters of a regression hyperplane is a central problem, i.e., an estimation that is not or only slightly affected by outliers in the data. In this paper we consider the least median of squares (LMS) estimator. For n points in d dimensions we describe a randomized algorithm for LMS running in O(n^d) time and O(n) space, for d fixed, and in time O(d^3 · (2n)^d) and O(dn) space, for arbitrary d.
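For intuition in the smallest interesting case d = 2, an exact LMS line can be found by brute force: the width of the narrowest strip covering h points is piecewise linear in the slope, so its minimum occurs at a slope defined by some pair of data points. The sketch below is my own O(n^3 log n) illustration, not the paper's randomized O(n^d)-time algorithm; the coverage count h = floor(n/2) + 1 is one common convention for the LMS median:

```python
import numpy as np

def lms_line_exact(x, y):
    """Naive exact LMS line fit in the plane: try every pairwise slope,
    and for each slope find the narrowest strip covering h points by a
    sliding window over sorted residuals.  O(n^2) slopes x O(n log n)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    h = n // 2 + 1                            # points the 50%-breakdown fit covers
    best_w, best_a, best_b = np.inf, 0.0, float(np.median(y))
    cand = [(y[j] - y[i]) / (x[j] - x[i])     # candidate slopes from point pairs
            for i in range(n) for j in range(i + 1, n) if x[i] != x[j]]
    for a in cand:
        r = np.sort(y - a * x)                # residuals against slope a
        widths = r[h - 1:] - r[:n - h + 1]    # width of each window of h residuals
        k = int(np.argmin(widths))
        if widths[k] < best_w:
            best_w = float(widths[k])
            best_a = float(a)
            best_b = float((r[k] + r[k + h - 1]) / 2)  # center the strip
    return best_a, best_b, (best_w / 2) ** 2  # slope, intercept, LMS value
```

The cubic-plus running time makes this practical only for small n, which is exactly the gap between naive enumeration and the O(n^d)-time (here O(n^2)) algorithms the abstract summarizes.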
A note on Approximate Minimum Volume Enclosing Ellipsoid of Ellipsoids
Abstract: We study the problem of computing the Minimum Volume Enclosing Ellipsoid (MVEE) containing a given set of ellipsoids S = {E1, E2, ..., En} ⊆ R^d. We show how to efficiently compute a small subset X ⊆ S of size at most α = |X| = O(d^2) whose minimum volume ...