Results 1–10 of 796
Results on Learnability and the Vapnik-Chervonenkis Dimension
1991
"... We consider the problem of learning a concept from examples in the distribution-free model by Valiant. (An essentially equivalent model, if one ignores issues of computational difficulty, was studied by Vapnik and Chervonenkis.) We introduce the notion of dynamic sampling, wherein the number of examples examined may increase with the complexity of the target concept. This method is used to establish the learnability of various concept classes with an infinite Vapnik-Chervonenkis dimension. We also discuss an important variation on the problem of learning from examples, called approximating from ..."
Cited by 33 (0 self)
Sample compression, learnability, and the Vapnik-Chervonenkis dimension
Machine Learning, 1995
"... Within the framework of PAC learning, we explore the learnability of concepts from samples using the paradigm of sample compression schemes. A sample compression scheme of size k for a concept class C ⊆ 2^X consists of a compression function and a reconstruction function. The compression function r ..."
Cited by 83 (5 self)
"... class is PAC-learnable if and only if the Vapnik-Chervonenkis (VC) dimension of the class i..."
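The compression/reconstruction paradigm in the snippet above can be illustrated with the standard textbook example of a size-2 compression scheme for closed intervals on the real line. The function names below are illustrative, not taken from the paper.

```python
# Sketch of a sample compression scheme of size 2 for the concept class of
# closed intervals on the real line. A labeled sample is a list of
# (point, label) pairs; `compress` and `reconstruct` are illustrative names.

def compress(sample):
    """Keep only the leftmost and rightmost positively labeled points."""
    positives = [x for x, label in sample if label]
    if not positives:
        return []  # empty compression set -> reconstruct the empty concept
    return [min(positives), max(positives)]

def reconstruct(compression_set):
    """Rebuild a hypothesis (an indicator function) from the kept points."""
    if not compression_set:
        return lambda x: False
    lo, hi = min(compression_set), max(compression_set)
    return lambda x: lo <= x <= hi

# Any sample consistent with some interval is classified correctly
# by the hypothesis reconstructed from at most two retained points:
sample = [(0.5, False), (1.0, True), (2.0, True), (3.5, False)]
h = reconstruct(compress(sample))
assert all(h(x) == label for x, label in sample)
```

The compression set never exceeds two points, so the scheme has size k = 2 no matter how large the sample is.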
Bounding sample size with the Vapnik-Chervonenkis dimension
Discrete Applied Mathematics, 1993
"... A proof that a concept is learnable provided the Vapnik-Chervonenkis dimension is finite is given. The proof is more explicit than previous proofs and introduces two new parameters which allow bounds on the sample size obtained to be improved by a factor of approximately 4 log₂(e). Keywords: learnin ..."
Cited by 27 (2 self)
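For context on the kind of bound being improved, the following sketch computes the classic sufficient sample size for distribution-free PAC learning in the commonly quoted Blumer et al. form; the function name is an assumption for illustration, not from the paper.

```python
import math

# Classic sufficient sample size for PAC learning a class of VC dimension d
# to accuracy eps with confidence 1 - delta (commonly quoted form):
#     m >= max( (4/eps) * log2(2/delta), (8*d/eps) * log2(13/eps) )

def pac_sample_bound(vc_dim, eps, delta):
    term_conf = (4.0 / eps) * math.log2(2.0 / delta)          # confidence term
    term_dim = (8.0 * vc_dim / eps) * math.log2(13.0 / eps)   # VC-dimension term
    return math.ceil(max(term_conf, term_dim))

m = pac_sample_bound(vc_dim=3, eps=0.1, delta=0.05)
```

The bound grows linearly in the VC dimension and in 1/eps (up to log factors), which is why polynomial upper bounds on the VC dimension translate into polynomial sample complexity.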
On metric entropy, Vapnik-Chervonenkis dimension, and learnability for a class of distributions
1989
"... In [23], Valiant proposed a formal framework for distribution-free concept learning which has generated a great deal of interest. A fundamental result regarding this framework was proved by Blumer et al. [6] characterizing those concept classes which are learnable in terms of their Vapnik-Chervonenkis ..."
Cited by 12 (3 self)
Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers
Machine Learning, 1995
"... The Vapnik-Chervonenkis (VC) dimension is an important combinatorial tool in the analysis of learning problems in the PAC framework. For polynomial learnability, we seek upper bounds on the VC dimension that are polynomial in the syntactic complexity of concepts. Such upper bounds are au ..."
Cited by 92 (1 self)
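On tiny finite instances, the VC dimension discussed above can be checked by brute force: enumerate subsets of the domain and test whether every labeling is realized. The names below are illustrative, not from the paper.

```python
from itertools import combinations

def shatters(concept_class, points):
    """True if every labeling of `points` is realized by some concept."""
    labelings = {tuple(c(x) for x in points) for c in concept_class}
    return len(labelings) == 2 ** len(points)

def vc_dimension(concept_class, domain, max_d=4):
    """Largest d (up to max_d) such that some d-element subset of `domain`
    is shattered. Brute force: only feasible for tiny finite settings."""
    d = 0
    for k in range(1, max_d + 1):
        if any(shatters(concept_class, subset)
               for subset in combinations(domain, k)):
            d = k
    return d

# Threshold concepts {x : x >= t} have VC dimension 1: any single point can
# be labeled both ways, but no pair can be labeled (+, -) left to right.
thresholds = [lambda x, t=t: x >= t for t in range(6)]
assert vc_dimension(thresholds, range(5)) == 1
```

Parameterized classes like the ones in the paper above are infinite, so this exhaustive check does not apply directly; it only illustrates the shattering definition.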
Set Systems of Bounded Vapnik-Chervonenkis Dimension and a Relation to Arrangements
1991
Combinatorial Variability of Vapnik-Chervonenkis Classes with Applications to Sample Compression Schemes
Discrete Applied Mathematics, 1998
"... We define embeddings between concept classes that are meant to reflect certain aspects of their combinatorial structure. Furthermore, we introduce a notion of universal concept classes: classes into which any member of a given family of classes can be embedded. These universal classes play a role ..."
Cited by 22 (0 self)
"... given VC-dimension can be grouped into subfamilies. We use these parameters to investigate the existence of embeddings and the scope of universality of classes. We view the formulation of these parameters and the related questions that they raise as a significant component in this work. A second theme ..."
Scale-sensitive Dimensions, Uniform Convergence, and Learnability
1997
"... Learnability in Valiant's PAC learning model has been shown to be strongly related to the existence of uniform laws of large numbers. These laws define a distribution-free convergence property of means to expectations uniformly over classes of random variables. Classes of real-valued functions ..."
Cited by 240 (2 self)
"... Giné, and Zinn's previous characterization as a corollary. Furthermore, it is the first based on a simple combinatorial quantity generalizing the Vapnik-Chervonenkis dimension. We apply this result to obtain the weakest combinatorial condition known to imply PAC learnability in the statistical regression ..."
Wrapper Induction for Information Extraction
1997
"... The Internet presents numerous sources of useful information: telephone directories, product catalogs, stock quotes, weather forecasts, etc. Recently, many systems have been built that automatically gather and manipulate such information on a user's behalf. However, these resources are usually formatted for use by people (e.g., the relevant content is embedded in HTML pages), so extracting their content is difficult. Wrappers are often used for this purpose. A wrapper is a procedure for extracting a particular resource's content. Unfortunately, hand-coding wrappers is tedious. We introduce wrapper induction, a technique for automatically constructing wrappers. Our techniques can be described in terms of three main contributions. First, we pose the problem of wrapper construction as one of inductive learn..."
Cited by 612 (30 self)
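A toy illustration of the wrapper idea described above: a simple left-right delimiter wrapper that extracts every occurrence of content between two delimiter strings. The delimiters and the sample page below are hypothetical, not taken from the paper.

```python
# Minimal left-right (LR) delimiter wrapper: extract every substring that
# lies between a left delimiter and the next right delimiter. Wrapper
# induction would learn the delimiters from labeled pages; here they are
# supplied by hand for illustration.

def lr_extract(page, left, right):
    results, pos = [], 0
    while True:
        start = page.find(left, pos)
        if start == -1:
            break
        start += len(left)
        end = page.find(right, start)
        if end == -1:
            break
        results.append(page[start:end])
        pos = end + len(right)
    return results

html = "<b>Congo</b> <i>242</i> <b>Egypt</b> <i>20</i>"
assert lr_extract(html, "<b>", "</b>") == ["Congo", "Egypt"]
assert lr_extract(html, "<i>", "</i>") == ["242", "20"]
```

The induction problem is then: given pages with their desired extractions marked, find delimiter strings for which `lr_extract` reproduces the markings.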