Results 1–10 of 2,390
Adaptation, performance and Vapnik-Chervonenkis dimension of straight line programs
In EuroGP, 2009
"... Abstract. We discuss here an empirical comparison between model selection methods based on Linear Genetic Programming. Two statistical methods are compared: model selection based on Empirical Risk Minimization (ERM) and model selection based on Structural Risk Minimization (SRM). For this purpose we h ..."
Cited by 3 (3 self)
have identified the main components which determine the capacity of some linear structures as classifiers, showing an upper bound for the Vapnik-Chervonenkis (VC) dimension of classes of programs representing linear code defined by arithmetic computations and sign tests. This upper bound is used
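The ERM-versus-SRM comparison described in this entry can be sketched in a few lines: ERM picks the model with the lowest training error alone, while SRM adds a capacity penalty that grows with VC dimension. The candidate names and the penalty form below are illustrative assumptions, not the paper's actual procedure.

```python
import math

def srm_select(candidates, n, delta=0.05):
    """Structural Risk Minimization sketch (illustrative, not the paper's
    exact method): among model classes, pick the one minimizing empirical
    risk plus a VC-dimension-based capacity penalty."""
    def penalty(d):
        # One standard textbook form of the VC confidence term (an assumption).
        return math.sqrt((d * (math.log(2 * n / d) + 1) + math.log(4 / delta)) / n)
    return min(candidates, key=lambda c: c["emp_risk"] + penalty(c["vc_dim"]))

# Hypothetical candidates: richer classes fit the training set better
# but pay a larger capacity penalty.
candidates = [
    {"name": "small",  "vc_dim": 5,   "emp_risk": 0.20},
    {"name": "medium", "vc_dim": 50,  "emp_risk": 0.10},
    {"name": "large",  "vc_dim": 500, "emp_risk": 0.08},
]
print(srm_select(candidates, n=2000)["name"])  # SRM prefers "small" here;
# plain ERM would have picked "large" on training error alone.
```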
Using Vapnik-Chervonenkis Dimension to Analyze the Testing Complexity of Program Segments
1994
"... We examine the complexity of testing different program constructs. We do this by defining a measure of testing complexity known as VCP-dimension, which is similar to the Vapnik-Chervonenkis dimension, and applying it to classes of programs, where all programs in a class share the same syntactic stru ..."
Cited by 3 (0 self)
Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers
Machine Learning, 1995
"... Abstract. The Vapnik-Chervonenkis (VC) dimension is an important combinatorial tool in the analysis of learning problems in the PAC framework. For polynomial learnability, we seek upper bounds on the VC dimension that are polynomial in the syntactic complexity of concepts. Such upper bounds are au ..."
Cited by 92 (1 self)
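The combinatorial notion behind these bounds — a class's VC dimension is the largest point set it can shatter, i.e. label in all possible ways — can be checked directly for a small class. A minimal sketch, using 1-D threshold classifiers as the example family (not drawn from the paper above):

```python
def shatters(points, classifiers):
    """True if the classifier family achieves every binary labeling of `points`."""
    achieved = {tuple(c(x) for x in points) for c in classifiers}
    return len(achieved) == 2 ** len(points)

# 1-D threshold classifiers h_t(x) = 1 if x >= t else 0, sampled on a grid.
thresholds = [t / 10 for t in range(-20, 21)]
classifiers = [lambda x, t=t: int(x >= t) for t in thresholds]

# A single point is shattered (both labelings are achievable) ...
print(shatters([0.5], classifiers))        # True
# ... but two points are not: with x1 < x2 the labeling (1, 0) is impossible,
# so the VC dimension of 1-D thresholds is 1.
print(shatters([0.3, 0.7], classifiers))   # False
```

The `t=t` default argument pins each threshold at lambda-creation time; without it every lambda would close over the final loop value.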
Exact Rates in Vapnik-Chervonenkis Bounds
"... Vapnik-Chervonenkis bounds on speeds of uniform convergence of empirical means to their expectations have been continuously improved over the years since the precursory work in [26]. The result obtained by Talagrand in 1994 [21] seems to provide the final word as far as universal bounds are concerne ..."
Vapnik-Chervonenkis entropy of the spherical perceptron
1996
"... Perceptron learning of randomly labeled patterns is analyzed using a Gibbs distribution on the set of realizable labelings of the patterns. The entropy of this distribution is an extension of the Vapnik-Chervonenkis (VC) entropy, reducing to it exactly in the limit of infinite temperature. The close r ..."
relationship between the VC and Gardner entropies can be seen within the replica formalism. There has been recent progress towards understanding the relationship between the statistical physics and Vapnik-Chervonenkis (VC) approaches to learning theory [1, 2, 3, 4]. The two approaches can be unified in a
Vapnik-Chervonenkis Dimension of Recurrent Neural Networks
1997
"... Most of the work on the Vapnik-Chervonenkis dimension of neural networks has been focused on feedforward networks. However, recurrent networks are also widely used in learning applications, in particular when time is a relevant parameter. This paper provides lower and upper bounds for the VC dimensi ..."
Cited by 24 (6 self)
The Vapnik-Chervonenkis dimension of convex n-gon classifiers
Hungarian Electronic Journal of Sciences, 2007
"... In statistical learning theory, the Vapnik-Chervonenkis dimension is an important property of classifier families. With the help of this combinatorial concept it is possible to bound the error probability of a classifier, based on its performance on the training set. Convex polygon classifiers are R ..."
Cited by 1 (0 self)
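The error-bounding use of the VC dimension mentioned in this snippet can be made concrete. Below is one standard textbook form of the VC confidence term (an assumption here, not necessarily the exact bound derived in the paper), evaluated for convex n-gon classifiers, whose VC dimension is commonly cited as 2n + 1:

```python
import math

def vc_confidence(n_samples, d, delta=0.05):
    """One textbook form of the VC generalization bound's confidence term:
    with probability >= 1 - delta,
    true_error <= training_error + this term (an illustrative assumption)."""
    return math.sqrt(
        (d * (math.log(2 * n_samples / d) + 1) + math.log(4 / delta)) / n_samples
    )

# The gap between training and true error shrinks with more samples but
# grows with polygon complexity (d = 2n + 1 for convex n-gons).
for n_gon in (3, 6, 12):
    d = 2 * n_gon + 1
    print(n_gon, round(vc_confidence(10_000, d), 3))
```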
The Vapnik-Chervonenkis Dimension: Information versus Complexity in Learning
"... When feasible, learning is a very attractive alternative to explicit programming. This is particularly true in areas where the problems do not lend themselves to systematic programming, such as pattern recognition in natural environments. The feasibility of learning an unknown function from examples ..."
, they are conceptually and technically different. In the common language of learning, the information question is that of generalization and the complexity question is that of scaling. The work of Vapnik and Chervonenkis (1971) provides the key tools for dealing with the information issue. In this review, we develop
Set Systems of Bounded Vapnik-Chervonenkis Dimension and a Relation to Arrangements
1991
Lower Bounds on the Vapnik-Chervonenkis Dimension of Multilayer Threshold Networks
In Proceedings of the Sixth Workshop on Computational Learning Theory, 1993
"... We consider the problem of learning in multilayer feedforward networks of linear threshold units. We show that the Vapnik-Chervonenkis dimension of the class of functions that can be computed by a two-layer threshold network with real inputs is at least proportional to the number of weights in the ..."
Cited by 8 (1 self)