## Multivariate Regression Depth (2000)

Citations: 9 (1 self)

### BibTeX

```bibtex
@misc{Bern00multivariateregression,
  author = {Marshall Bern and David Eppstein},
  title  = {Multivariate Regression Depth},
  year   = {2000}
}
```

### Abstract

The regression depth of a hyperplane with respect to a set of n points in R^d is the minimum number of points the hyperplane must pass through in a rotation to vertical. We generalize hyperplane regression depth to k-flats for any k between 0 and d − 1. The k = 0 case gives the classical notion of center points. We prove that for any k and d, deep k-flats exist; that is, for any set of n points there always exists a k-flat with depth at least a constant fraction of n. As a consequence, we derive a linear-time (1 + ε)-approximation algorithm for the deepest flat.

1. INTRODUCTION. Linear regression asks for an affine subspace (a flat) that fits a set of data points. The most familiar case assumes d − 1 independent or explanatory variables and one dependent or response variable, and fits a hyperplane to explain the dependent variable as a linear function of the independent variables. Quite often, however, there may be more than one dependent variable, and the multivariate regression p...
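To make the definition above concrete, here is a brute-force sketch of the planar case (illustrative only; the function name and the O(n²) approach are not from the paper). It uses the standard planar characterization due to Rousseeuw and Hubert: the depth of a nonvertical line y = ax + b is the minimum, over vertical split positions t, of the number of points on or above the line on one side of t plus the number on or below it on the other side.

```python
def regression_depth(a, b, points):
    """Regression depth of the line y = a*x + b with respect to 2-D points.

    Planar characterization: the depth is the minimum, over vertical split
    positions t, of the points the line must sweep past to become vertical
    at t -- points on or above the line on one side of t plus points on or
    below it on the other side (zero-residual points count on both sides).
    """
    xs = sorted(p[0] for p in points)
    # Candidate splits: left of all points, between consecutive x's, right of all.
    candidates = ([xs[0] - 1]
                  + [(xs[i] + xs[i + 1]) / 2 for i in range(len(xs) - 1)]
                  + [xs[-1] + 1])
    best = len(points)
    for t in candidates:
        left_above = sum(1 for x, y in points if x < t and y >= a * x + b)
        left_below = sum(1 for x, y in points if x < t and y <= a * x + b)
        right_above = sum(1 for x, y in points if x > t and y >= a * x + b)
        right_below = sum(1 for x, y in points if x > t and y <= a * x + b)
        best = min(best, left_above + right_below, left_below + right_above)
    return best
```

As a sanity check, a line through all the points has depth n, while a line lying strictly above all the points can be translated to vertical without touching any of them and so has depth 0.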

### Citations

434 | Least median of squares regression
- Rousseeuw
- 1984
Citation Context: ...regression is easily solved by treating each dependent variable separately, but this is not correct for other common forms of regression such as least absolute deviation [8] or least median of squares [12]. Rousseeuw and Hubert [14] introduced the notion of regression depth as a robust criterion for linear regression. The regression depth of a hyperplane H fitting a set of n points is the minimum number...

74 | A general approach to d-dimensional geometric queries
- Yao, Yao
- 1985
Citation Context: ...partitioning the point set vertically into equal thirds, and making a ham sandwich cut of the leftmost and rightmost 2n/3 points; (b) subdivision by three coincident lines into equal sixths. Lemma 1 ([9, 19]) Let d be a constant, and assume we are given a set of n points in R^d and a parameter p. Then we can partition the points into p subsets, with at most 2n/p points in each subset, such that any hyperplane...

60 | A generalization of Radon’s Theorem
- Tverberg
- 1981
Citation Context: ...of a point t is the maximum cardinality of any Tverberg partition for which the common intersection contains t. Note that the Tverberg depth is a lower bound on the location depth. Tverberg’s theorem [17, 18] is that there always exists a point with Tverberg depth ⌈n/(d+1)⌉ (a Tverberg point); this result generalizes both the existence of center points (since any Tverberg point must be a center point) and...
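The location depth mentioned in this snippet (Tukey depth) can be computed by brute force in the plane; the sketch below is illustrative, not from the paper, and `location_depth` is a hypothetical name. The minimizing closed halfplane has its boundary through the query point t, and the count of points in the halfplane changes only at boundary directions perpendicular to some p − t, so it suffices to evaluate the count midway between consecutive critical directions.

```python
import math

def location_depth(t, points):
    """Tukey (location) depth of point t in the plane: the minimum number of
    points in any closed halfplane containing t."""
    tx, ty = t
    vecs = [(x - tx, y - ty) for x, y in points if (x, y) != (tx, ty)]
    at_t = len(points) - len(vecs)          # points coinciding with t itself
    if not vecs:
        return len(points)
    # Critical directions: both perpendiculars to each vector p - t.
    crit = sorted({(math.atan2(vy, vx) + q * math.pi / 2) % (2 * math.pi)
                   for vx, vy in vecs for q in (1, 3)})
    best = len(points)
    for i in range(len(crit)):
        a = crit[i]
        b = crit[(i + 1) % len(crit)] + (2 * math.pi if i + 1 == len(crit) else 0)
        u = (a + b) / 2                      # direction strictly between criticals
        ux, uy = math.cos(u), math.sin(u)
        best = min(best, sum(1 for vx, vy in vecs if ux * vx + uy * vy >= 0))
    return best + at_t
```

For the four corners of a unit square, the center has depth 2, matching the center-point bound ⌈n/(d+1)⌉ = ⌈4/3⌉ = 2.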

51 | Regression depth
- Rousseeuw, Hubert
- 1999
Citation Context: ...by treating each dependent variable separately, but this is not correct for other common forms of regression such as least absolute deviation [8] or least median of squares [12]. Rousseeuw and Hubert [14] introduced the notion of regression depth as a robust criterion for linear regression. The regression depth of a hyperplane H fitting a set of n points is the minimum number of points whose removal m...

38 | Efficient partition trees
- Matoušek
- 1992
Citation Context: ...partitioning the point set vertically into equal thirds, and making a ham sandwich cut of the leftmost and rightmost 2n/3 points; (b) subdivision by three coincident lines into equal sixths. Lemma 1 ([9, 19]) Let d be a constant, and assume we are given a set of n points in R^d and a parameter p. Then we can partition the points into p subsets, with at most 2n/p points in each subset, such that any hyperplane...

37 | Computing location depth and regression depth in higher dimensions
- Rousseeuw, Struyf
- 1998
Citation Context: ...Teng [2] solved the conjecture using an argument based on Brouwer’s fixed-point theorem and a close connection between regression depth and center points. On the algorithmic front, Rousseeuw and Struyf [15] gave algorithms for testing the regression depth of a hyperplane. Their time bounds are exponential in the dimension, unsurprising since the problem is NP-complete for unbounded dimension [2]. For the...

32 | Mengen konvexer Körper, die einen gemeinsamen Punkt enthalten
- Radon
- 1921
Citation Context: ...always exists a point with Tverberg depth ⌈n/(d+1)⌉ (a Tverberg point); this result generalizes both the existence of center points (since any Tverberg point must be a center point) and Radon’s theorem [11] that any d + 2 points have a Tverberg partition into two subsets. Another way of expressing Tverberg’s theorem is that for any point set we can find both a partition into ⌈n/(d + 1)⌉ subsets, and a p...
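Radon’s theorem in the plane (d = 2, four points) can be illustrated with a short sketch (a hypothetical helper, not from the paper, assuming the first three points are not collinear). An affine dependence a₁p₁ + … + a₄p₄ = 0 with a₁ + … + a₄ = 0 is found by fixing a₄ = 1 and solving a 3×3 system via Cramer’s rule; splitting indices by the sign of aᵢ gives the two subsets, and the normalized positive-weight combination lies in both convex hulls.

```python
def radon_partition(points):
    """Split 4 planar points into two index sets whose convex hulls intersect
    (Radon's theorem), returning (positive indices, negative indices, common point).
    Assumes the first three points are not collinear (so the 3x3 system below
    is nonsingular)."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = points

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    # Fix a4 = 1; solve a1*p1 + a2*p2 + a3*p3 = -p4 and a1 + a2 + a3 = -1.
    M = [[x1, x2, x3], [y1, y2, y3], [1, 1, 1]]
    rhs = [-x4, -y4, -1]
    D = det3(M)                      # nonzero when p1, p2, p3 are not collinear
    cols = [[[rhs[r] if c == j else M[r][c] for c in range(3)]
             for r in range(3)] for j in range(3)]
    a = [det3(cols[j]) / D for j in range(3)] + [1.0]
    pos = [i for i in range(4) if a[i] > 0]
    neg = [i for i in range(4) if a[i] < 0]
    w = sum(a[i] for i in pos)
    common = (sum(a[i] * points[i][0] for i in pos) / w,
              sum(a[i] * points[i][1] for i in pos) / w)
    return pos, neg, common
```

For four points in convex position the partition pairs the two diagonals; for a point inside a triangle it separates that point from the other three.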

25 | A theorem on general measure
- Rado
- 1946
Citation Context: ...n choose the d + 1 subsets required by Lemma 1 to be the ones of size a. Open Problem 1 Prove tighter bounds on P(d) for d ≥ 2. 5 Deep k-Flats It is previously known that deep k-flats exist for k = 0 [10] and k = d − 1 [2, 16]. In this section we show that such flats exist for all other values of k. We first need one more result, a common generalization of centerpoints and the ham sandwich theorem: Lemma...

17 | Existence of equilibrium with incomplete markets
- Husseini, Lasry, et al.
- 1990
Citation Context: ...a topological sphere, and there can be continuous equivariant non-surjective functions from this space to itself. Nevertheless there might be a way of using generalizations of the Borsuk-Ulam theorem [5] or a modification of our Brouwer fixed point argument to show that the deep k-flat function must be surjective, perhaps using the additional property that a deep k-flat cannot be incident to V_{d−k−1}. ...

15 | Depth in an arrangement of hyperplanes
- Rousseeuw, Hubert
- 1999
Citation Context: ...least ⌈n/(d + 1)⌉ (a center point). Rousseeuw and Hubert provided a construction called the catline [4] for computing a regression line for a planar point set with depth at least ⌈n/3⌉, and conjectured [13] that in higher dimensions as well there should always exist a regression hyperplane of depth ⌈n/(d + 1)⌉. Steiger and Wenger [16] proved that a deep regression hyperplane always exists, but with a much...

14 | Regression depth and center points
- Amenta, Bern, et al.
Citation Context: ...regression hyperplane of depth ⌈n/(d + 1)⌉. Steiger and Wenger [16] proved that a deep regression hyperplane always exists, but with a much smaller fraction than 1/(d + 1). Amenta, Bern, Eppstein, and Teng [2] solved the conjecture using an argument based on Brouwer’s fixed-point theorem and a close connection between regression depth and center points. On the algorithmic front, Rousseeuw and Struyf [15] gave...

13 | Bounding the piercing number
- Alon, Kalai
- 1995
Citation Context: ...given a set of n points in R^d and a parameter p. Then we can partition the points into p subsets, with at most 2n/p points in each subset, such that any hyperplane cuts o(p) of the subsets. Lemma 2 ([1]) Let p ≥ q > d be constants. Then there is a constant C(p, q, d) with the following property: If F is any family of point sets in R^d, such that any p-tuple of sets in F contains a transversal subfamily...

12 | On depth and deep points: a calculus - Mizera - 2002

11 | Efficient Algorithms for Maximum Regression Depth
- Kreveld, Mitchell, et al.
- 1999
Citation Context: ...problem is NP-complete for unbounded dimension [2]. For the planar case, Van Kreveld, Mitchell, Rousseeuw, Sharir, Snoeyink, and Speckmann gave an O(n log² n) algorithm for computing a deepest line [6]. Langerman and Steiger [7] later improved this to an optimal O(n log n) time bound. 3 Definitions Although regression is naturally an affine rather than projective concept, our constructions and defi...

10 | The catline for deep regression
- Hubert, Rousseeuw
- 1998
Citation Context: ...of a single-point estimator. It has long been known that there exists a point of location depth at least ⌈n/(d + 1)⌉ (a center point). Rousseeuw and Hubert provided a construction called the catline [4] for computing a regression line for a planar point set with depth at least ⌈n/3⌉, and conjectured [13] that in higher dimensions as well there should always exist a regression hyperplane of depth ⌈n/...

5 | Continuity of halfspace depth contours and maximum depth estimators: diagnostics of depth-related methods - Mizera, Volauf - 2002

3 | An O(n log n) algorithm for the hyperplane median
- Langerman, Steiger
- 2000
Citation Context: ...unbounded dimension [2]. For the planar case, Van Kreveld, Mitchell, Rousseeuw, Sharir, Snoeyink, and Speckmann gave an O(n log² n) algorithm for computing a deepest line [6]. Langerman and Steiger [7] later improved this to an optimal O(n log n) time bound. 3 Definitions Although regression is naturally an affine rather than projective concept, our constructions and definitions live most gracefully...

3 | Hyperplane depth and nested simplices
- Steiger, Wenger
- 1998
Citation Context: ...line for a planar point set with depth at least ⌈n/3⌉, and conjectured [13] that in higher dimensions as well there should always exist a regression hyperplane of depth ⌈n/(d + 1)⌉. Steiger and Wenger [16] proved that a deep regression hyperplane always exists, but with a much smaller fraction than 1/(d + 1). Amenta, Bern, Eppstein, and Teng [2] solved the conjecture using an argument based on Brouwe...

2 | A generalization of the sandwich theorem - Dol’nikov - 1992

2 | An extension of the ham sandwich theorem
- Živaljević, Vrećica
- 1990
Citation Context: ...section we show that such flats exist for all other values of k. We first need one more result, a common generalization of centerpoints and the ham sandwich theorem: Lemma 3 (The Center Transversal Theorem [3, 20]) Let k + 1 point sets be given in R^d, each containing at least m points, where 0 ≤ k < d. Then there exists a k-flat F such that any closed halfspace containing F contains at least ⌈m/(d − k + 1)⌉ ...
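For k = d − 1 in the plane, the lemma quoted here reduces to the classical ham sandwich theorem: a line each of whose two closed halfplanes contains at least half of each set. A brute-force sketch (not the paper's method; the function name is illustrative) searches among lines through one point of each set, which suffices for finite sets in general position.

```python
from itertools import product

def ham_sandwich_line(A, B):
    """Find a line simultaneously bisecting two planar point sets, i.e. each
    closed halfplane it bounds contains at least ceil(|S|/2) points of each
    set S.  Searches lines through one point of A and one of B."""
    def bisects(p, q, pts):
        # Signed cross products give closed-halfplane membership for the
        # line through p and q (points on the line count for both sides).
        cross = [(q[0] - p[0]) * (y - p[1]) - (q[1] - p[1]) * (x - p[0])
                 for x, y in pts]
        need = -(-len(pts) // 2)             # ceil(len/2)
        return (sum(1 for c in cross if c >= 0) >= need
                and sum(1 for c in cross if c <= 0) >= need)
    for p, q in product(A, B):
        if p != q and bisects(p, q, A) and bisects(p, q, B):
            return p, q
    return None
```

For two columns of points the search returns a line cutting both columns in half, e.g. a vertical line through their middle points.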

1 | Robustness of least distances estimate in multivariate linear models
- Liu
- 1992
Citation Context: ...s. Multivariate least-squares regression is easily solved by treating each dependent variable separately, but this is not correct for other common forms of regression such as least absolute deviation [8] or least median of squares [12]. Rousseeuw and Hubert [14] introduced the notion of regression depth as a robust criterion for linear regression. The regression depth of a hyperplane H fitting a set...