Results 1–10 of 59
Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers
Machine Learning, 1995
Abstract

Cited by 83 (1 self)
Abstract. The Vapnik-Chervonenkis (VC) dimension is an important combinatorial tool in the analysis of learning problems in the PAC framework. For polynomial learnability, we seek upper bounds on the VC dimension that are polynomial in the syntactic complexity of concepts. Such upper bounds are automatic for discrete concept classes, but hitherto little has been known about what general conditions guarantee polynomial bounds on VC dimension for classes in which concepts and examples are represented by tuples of real numbers. In this paper, we show that for two general kinds of concept class the VC dimension is polynomially bounded in the number of real numbers used to define a problem instance. One is classes where the criterion for membership of an instance in a concept can be expressed as a formula (in the first-order theory of the reals) with fixed quantification depth and exponentially bounded length, whose atomic predicates are polynomial inequalities of exponentially bounded degree. The other is classes where containment of an instance in a concept is testable in polynomial time, assuming we may compute standard arithmetic operations on reals exactly in constant time. Our results show that in the continuous case, as in the discrete, the real barrier to efficient learning in the Occam sense is complexity-theoretic and not information-theoretic. We present examples to show how these results apply to concept classes defined by geometrical figures and neural nets, and derive polynomial bounds on the VC dimension for these classes. Keywords: Concept learning, information theory, Vapnik-Chervonenkis dimension, Milnor's theorem
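To make the shattering definition behind the VC dimension concrete, here is a small brute-force sketch (illustrative, not from the paper; all function names are my own) for one of the simplest real-parameterized classes, closed intervals [a, b] on the line:

```python
from itertools import combinations

def interval_labelings(points):
    """All labelings of `points` realizable by a closed interval [a, b]."""
    labelings = set()
    # Candidate endpoints: the sample points themselves suffice here.
    for a in points:
        for b in points:
            labelings.add(tuple(a <= p <= b for p in points))
    labelings.add(tuple(False for _ in points))  # the empty concept
    return labelings

def is_shattered(points):
    """True if intervals realize all 2^|points| labelings of `points`."""
    return len(interval_labelings(points)) == 2 ** len(points)

def vc_dim_intervals(sample, max_d=4):
    """Largest d such that some d-subset of `sample` is shattered."""
    best = 0
    for d in range(1, max_d + 1):
        if any(is_shattered(s) for s in combinations(sample, d)):
            best = d
    return best

print(vc_dim_intervals([0.0, 1.0, 2.0, 3.0]))  # prints 2
```

Two points are shattered (an interval can pick out any subset of them), but no three are, since an interval cannot realize the labeling (in, out, in); hence the VC dimension 2. The paper's results bound such dimensions in terms of the number of real parameters defining a concept.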
Linear decision trees: volume estimates and topological bounds
Proc. 24th ACM Symp. on Theory of Computing, 1992
Abstract

Cited by 47 (5 self)
We describe two methods for estimating the size and depth of decision trees where a linear test is performed at each node. Both methods are applied to the question of deciding, by a linear decision tree, whether given n real numbers, some k of them are equal. We show that the minimum depth of a linear decision tree for this problem is Θ(n log(n/k)). The upper bound is easy; the lower bound can be established for k = O(n^{1/4−ε}) by a volume argument; for the whole range, however, our proof is more complicated and it involves the use of some topology as well as the theory of Möbius functions.
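The k-equal problem has a simple comparison-based upper bound via sorting: sort the n reals, then scan for a run of k equal values. This sketch (illustrative, not from the paper) uses O(n log n) comparisons; the paper's Θ(n log(n/k)) bound is sharper for large k:

```python
def has_k_equal(xs, k):
    """Decide whether some k of the given reals are equal: sort, then
    scan adjacent pairs for a run of length k. O(n log n) comparisons."""
    ys = sorted(xs)
    run = 1  # length of the current run of equal values
    for a, b in zip(ys, ys[1:]):
        run = run + 1 if a == b else 1
        if run >= k:
            return True
    return run >= k  # covers k <= 1 and a run ending at the last element

print(has_k_equal([5.0, 3.0, 5.0, 2.0, 5.0], 3))  # prints True
print(has_k_equal([1.0, 2.0, 3.0], 2))            # prints False
```

Since equal elements become adjacent after sorting, the scan finds a k-fold repetition exactly when one exists; the paper's contribution is showing that no linear decision tree can do asymptotically better than Θ(n log(n/k)).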
New Lower Bounds for Hopcroft's Problem
1996
Abstract

Cited by 33 (6 self)
We establish new lower bounds on the complexity of the following basic geometric problem, attributed to John Hopcroft: Given a set of n points and m hyperplanes in R^d, is any point contained in any hyperplane? We define a general class of partitioning algorithms, and show that in the worst case, for all m and n, any such algorithm requires time Ω(n log m + n^{2/3} m^{2/3} + m log n) in two dimensions, or Ω(n log m + n^{5/6} m^{1/2} + n^{1/2} m^{5/6} + m log n) in three or more dimensions. We obtain slightly higher bounds for the counting version of Hopcroft's problem in four or more dimensions. Our planar lower bound is within a factor of 2^{O(log* (n+m))} of the best known upper bound, due to Matoušek. Previously, the best known lower bound, in any dimension, was Ω(n log m + m log n). We develop our lower bounds in two stages. First we define a combinatorial representation of the relative order type of a set of points and hyperplanes, called a monochromatic cover, and derive low...
Better Lower Bounds on Detecting Affine and Spherical Degeneracies
1995
Abstract

Cited by 26 (9 self)
We show that in the worst case, Ω(n^d) sidedness queries are required to determine whether a set of n points in R^d is affinely degenerate, i.e., whether it contains d+1 points on a common hyperplane. This matches known upper bounds. We give a straightforward adversary argument, based on the explicit construction of a point set containing Ω(n^d) "collapsible" simplices, any one of which can be made degenerate without changing the orientation of any other simplex. As an immediate corollary, we have an Ω(n^d) lower bound on the number of sidedness queries required to determine the order type of a set of n points in R^d. Using similar techniques, we also show that Ω(n^{d+1}) in-sphere queries are required to decide the existence of spherical degeneracies in a set of n points in R^d. 1 Introduction A fundamental problem in computational geometry is determining whether a given set of points is in "general position." A simple example of ...
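A sidedness query in the plane is just the sign of an orientation determinant. The naive degeneracy test below (illustrative only, not the paper's adversary construction) uses O(n^3) such queries for d = 2, while the paper's lower bound says Ω(n^2) queries are unavoidable:

```python
from itertools import combinations

def sidedness(p, q, r):
    """Sign of the orientation determinant of points p, q, r in the plane:
    +1 for a left (counterclockwise) turn, -1 for a right turn, 0 if collinear."""
    det = (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])
    return (det > 0) - (det < 0)

def affinely_degenerate(points):
    """Naive test for affine degeneracy in d = 2 (three collinear points),
    issuing one sidedness query per triple: O(n^3) queries in total."""
    return any(sidedness(p, q, r) == 0 for p, q, r in combinations(points, 3))

print(affinely_degenerate([(0, 0), (1, 0), (0, 1), (2, 2)]))  # prints False
print(affinely_degenerate([(0, 0), (1, 1), (2, 2), (3, 5)]))  # prints True
```

With exact rational or integer coordinates this test is robust; with floating-point inputs the zero test is subject to round-off, which is one reason such predicates are usually evaluated with exact arithmetic in computational geometry.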
A Lower Bound for Randomized Algebraic Decision Trees
Proc. 28th ACM STOC, 1996
Abstract

Cited by 23 (12 self)
We prove the first nontrivial (and superlinear) lower bounds on the depth of randomized algebraic decision trees (with two-sided error) for problems that are finite unions of hyperplanes and intersections of halfspaces, solving a long-standing open problem. As an application, among other things, we derive, for the first time, an Ω(n^2) randomized lower bound for the Knapsack Problem, and an Ω(n log n) randomized lower bound for the Element Distinctness Problem, which were previously known only for deterministic algebraic decision trees. It is worth noting that for languages that are finite unions of hyperplanes, our proof method also yields a new elementary lower bound technique for deterministic algebraic decision trees, without making use of Milnor's bound on the Betti numbers of algebraic varieties.
On Showing Lower Bounds for External-Memory Computational Geometry Problems
Abstract

Cited by 23 (4 self)
In this paper we consider lower bounds for external-memory computational geometry problems. We find that it is not quite clear which model of computation to use when considering such problems. As an attempt at providing a model, we define the external-memory Turing machine model, and we derive lower bounds for a number of problems, including the element distinctness problem, in this model. For these lower bounds we make the standard assumption that records are indivisible. Waiving the indivisibility assumption, we show how to beat the lower bound for element distinctness. As an alternative model, we briefly discuss an external-memory version of the algebraic computation tree. 1. Introduction The Input/Output (or just I/O) communication between fast internal memory and slower external storage is the bottleneck in many large-scale computations. The significance of this bottleneck is increasing as internal computation gets faster, and as parallel computation gains popularity. Currently,...