Results 1 - 10 of 1,185
Survey on Independent Component Analysis
- NEURAL COMPUTING SURVEYS, 1999
"... A common problem encountered in such disciplines as statistics, data analysis, signal processing, and neural network research, is nding a suitable representation of multivariate data. For computational and conceptual simplicity, such a representation is often sought as a linear transformation of the ..."
Cited by 2309 (104 self)
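The entry above frames the problem as finding a linear transformation of multivariate data. A minimal sketch of that blind source separation setup, using scikit-learn's FastICA purely as an illustrative estimator (the sources and mixing matrix below are made up for the example; the survey itself covers many estimation principles):

    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    t = np.linspace(0, 8, 2000)
    s = np.c_[np.sin(3 * t), np.sign(np.sin(5 * t))]   # two independent source signals
    A = np.array([[1.0, 0.5], [0.4, 1.2]])             # unknown mixing matrix
    x = s @ A.T                                         # observed multivariate data, x = A s

    ica = FastICA(n_components=2, random_state=0)
    s_est = ica.fit_transform(x)                        # recovered sources, up to scale and order
    print(s_est.shape, ica.mixing_.shape)               # (2000, 2) and the estimated mixing (2, 2)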
Boosting the margin: A new explanation for the effectiveness of voting methods
- IN PROCEEDINGS INTERNATIONAL CONFERENCE ON MACHINE LEARNING, 1997
"... One of the surprising recurring phenomena observed in experiments with boosting is that the test error of the generated classifier usually does not increase as its size becomes very large, and often is observed to decrease even after the training error reaches zero. In this paper, we show that this ..."
Cited by 897 (52 self)
"... that techniques used in the analysis of Vapnik’s support vector classifiers and of neural networks with small weights can be applied to voting methods to relate the margin distribution to the test error. We also show theoretically and experimentally that boosting is especially effective at increasing the margins ..."
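The margin referred to here is the normalized weighted vote: for base classifiers h_t with weights a_t and labels y in {-1, +1}, margin(x, y) = y * sum_t a_t h_t(x) / sum_t a_t, so a large positive value means a confident, correct vote. A small sketch of computing that margin distribution for an AdaBoost ensemble in scikit-learn; this is an illustrative stand-in, not the paper's exact experimental setup:

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.tree import DecisionTreeClassifier

    X, y01 = make_classification(n_samples=500, random_state=0)
    y = 2 * y01 - 1                                   # relabel to {-1, +1}

    clf = AdaBoostClassifier(DecisionTreeClassifier(max_depth=1),
                             n_estimators=100, random_state=0).fit(X, y)

    votes = np.array([w * est.predict(X)              # a_t * h_t(x) for each boosting round t
                      for est, w in zip(clf.estimators_, clf.estimator_weights_)])
    margins = y * votes.sum(axis=0) / clf.estimator_weights_.sum()
    print("fraction of training margins above 0.5:", np.mean(margins > 0.5))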
Issues in Evolutionary Robotics
1992
"... In this paper we propose and justify a methodology for the development of the control systems, or `cognitive architectures', of autonomous mobile robots. We argue that the design by hand of such control systems becomes prohibitively difficult as complexity increases. We discuss an alternative a ..."
Cited by 272 (33 self)
Mechanisms of visual attention in the human cortex.
- Annual Review of Neuroscience, 2000
"... Abstract A typical scene contains many different objects that, because of the limited processing capacity of the visual system, compete for neural representation. The competition among multiple objects in visual cortex can be biased by both bottom-up sensory-driven mechanisms and top-down influence ..."
Cited by 246 (0 self)
A New Evolutionary System for Evolving Artificial Neural Networks
- IEEE TRANSACTIONS ON NEURAL NETWORKS, 1996
"... This paper presents a new evolutionary system, i.e., EPNet, for evolving artificial neural networks (ANNs). The evolutionary algorithm used in EPNet is based on Fogel's evolutionary programming (EP) [1], [2], [3]. Unlike most previous studies on evolving ANNs, this paper puts its emphasis on ev ..."
Cited by 202 (35 self)
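A compact sketch of the kind of mutation-only evolutionary-programming loop that EPNet builds on: a population of small one-hidden-layer networks improved by Gaussian weight perturbation plus occasional node addition or deletion, with truncation selection. The toy regression task, population size, and mutation rates are illustrative choices, not the paper's settings.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(200, 1))
    y = np.sin(3 * X[:, 0])                            # toy regression target

    def predict(net, X):
        W1, b1, W2, b2 = net
        return np.tanh(X @ W1 + b1) @ W2 + b2

    def fitness(net):                                  # negative MSE, higher is better
        return -np.mean((predict(net, X) - y) ** 2)

    def random_net(hidden):
        return [rng.normal(0, 0.5, (1, hidden)), np.zeros(hidden),
                rng.normal(0, 0.5, hidden), 0.0]

    def mutate(net):
        W1, b1, W2, b2 = [np.copy(p) if isinstance(p, np.ndarray) else p for p in net]
        if rng.random() < 0.2 and W1.shape[1] > 1:     # structural mutation: delete a hidden node
            k = rng.integers(W1.shape[1])
            W1 = np.delete(W1, k, axis=1); b1 = np.delete(b1, k); W2 = np.delete(W2, k)
        elif rng.random() < 0.2:                        # structural mutation: add a hidden node
            W1 = np.hstack([W1, rng.normal(0, 0.5, (1, 1))])
            b1 = np.append(b1, 0.0); W2 = np.append(W2, rng.normal(0, 0.5))
        return [W1 + rng.normal(0, 0.05, W1.shape),     # parametric mutation of all weights
                b1 + rng.normal(0, 0.05, b1.shape),
                W2 + rng.normal(0, 0.05, W2.shape),
                b2 + rng.normal(0, 0.05)]

    pop = [random_net(hidden=4) for _ in range(20)]
    for _ in range(200):                                # (mu + mu) truncation selection
        pop += [mutate(net) for net in pop]
        pop = sorted(pop, key=fitness, reverse=True)[:20]
    print("best fitness (negative MSE):", fitness(pop[0]))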
Addressing the Curse of Imbalanced Training Sets: One-Sided Selection
- In Proceedings of the Fourteenth International Conference on Machine Learning, 1997
"... Adding examples of the majority class to the training set can have a detrimental effect on the learner's behavior: noisy or otherwise unreliable examples from the majority class can overwhelm the minority class. The paper discusses criteria to evaluate the utility of classifiers induced f ..."
Cited by 234 (1 self)
"... as well as from researchers in artificial neural networks, AI, and ML. A typical scenario assumes the e..."
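A minimal illustration of the one-sided undersampling idea: keep every minority example and drop majority examples that sit on the class border, i.e. whose nearest neighbour belongs to the minority class. This is a simplified stand-in for the paper's full procedure (which combines a condensed nearest-neighbour pass with Tomek-link removal); the synthetic data are placeholders.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.neighbors import NearestNeighbors

    X, y = make_classification(n_samples=1000, weights=[0.9, 0.1],
                               n_features=5, random_state=0)    # class 1 is the minority

    nn = NearestNeighbors(n_neighbors=2).fit(X)
    neighbour = nn.kneighbors(X, return_distance=False)[:, 1]   # nearest point other than itself

    is_majority = (y == 0)
    borderline = is_majority & (y[neighbour] == 1)               # majority points bordering the minority
    keep = ~borderline
    X_res, y_res = X[keep], y[keep]
    print("kept", int(keep.sum()), "of", len(y), "examples;",
          "majority examples removed:", int(borderline.sum()))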
Explanation-Based Neural Network Learning for Robot Control
- Advances in Neural Information Processing Systems 5, 1993
"... How can artificial neural nets generalize better from fewer examples? In order to generalize successfully, neural network learning methods typically require large training data sets. We introduce a neural network learning method that generalizes rationally from many fewer data points, relying instea ..."
Cited by 120 (22 self)
Ensembling Neural Networks: Many Could Be Better Than All
2002
"... Neural network ensemble is a learning paradigm where many neural networks are jointly used to solve a problem. In this paper, the relationship between the ensemble and its component neural networks is analyzed from the context of both regression and classification, which reveals that it may be bette ..."
Cited by 128 (22 self)
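A small sketch of the "many could be better than all" observation: average only a selected subset of the trained networks rather than the whole ensemble. Greedy forward selection on a validation set stands in here for the paper's GA-based weighting scheme (GASEN); the models, data, and ensemble size are illustrative.

    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor

    X, y = make_regression(n_samples=600, n_features=10, noise=10.0, random_state=0)
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

    nets = [MLPRegressor(hidden_layer_sizes=(20,), max_iter=500, random_state=s).fit(X_tr, y_tr)
            for s in range(10)]
    preds = np.array([n.predict(X_val) for n in nets])           # shape (n_nets, n_val)

    def mse(subset):
        return np.mean((preds[list(subset)].mean(axis=0) - y_val) ** 2)

    chosen = []
    while len(chosen) < len(nets):                               # greedy forward selection
        best = min((i for i in range(len(nets)) if i not in chosen),
                   key=lambda i: mse(chosen + [i]))
        if chosen and mse(chosen + [best]) >= mse(chosen):
            break
        chosen.append(best)

    print("all %d nets: %.1f   selected %d nets: %.1f"
          % (len(nets), mse(range(len(nets))), len(chosen), mse(chosen)))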
Comparative Experiments on Disambiguating Word Senses: An Illustration of the Role of Bias in Machine Learning
1996
"... This paper describes an experimental comparison of seven different learning algorithms on the problem of learning to disambiguate the meaning of a word from context. The algorithms tested include statistical, neural-network, decision-tree, rule-based, and case-based classification techniques. The sp ..."
Cited by 126 (2 self)
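In the spirit of the comparison described above, a minimal cross-validation loop over a few classifier families (naive Bayes, a small neural network, a decision tree, and a nearest-neighbour "case-based" learner). The synthetic data and model choices are placeholders, not the paper's word-sense corpora or its exact seven algorithms.

    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.naive_bayes import GaussianNB
    from sklearn.neural_network import MLPClassifier
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.neighbors import KNeighborsClassifier

    X, y = make_classification(n_samples=400, n_features=20, random_state=0)
    models = {"naive Bayes": GaussianNB(),
              "neural net": MLPClassifier(max_iter=1000, random_state=0),
              "decision tree": DecisionTreeClassifier(random_state=0),
              "case-based (k-NN)": KNeighborsClassifier()}
    for name, model in models.items():
        scores = cross_val_score(model, X, y, cv=5)              # 5-fold accuracy per learner
        print("%-18s accuracy: %.3f" % (name, scores.mean()))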
Iterated local search
- Handbook of Metaheuristics, volume 57 of International Series in Operations Research and Management Science, 2002
"... Iterated Local Search has many of the desirable features of a metaheuristic: it is simple, easy to implement, robust, and highly effective. The essential idea of Iterated Local Search lies in focusing the search not on the full space of solutions but on a smaller subspace defined by the solutions th ..."
Cited by 172 (15 self)
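A bare-bones Iterated Local Search loop matching the description above: perturb the current local optimum, re-run the local search, and keep the result only if the acceptance criterion (here, strict improvement) is satisfied. The bit-flip objective, perturbation strength, and iteration count are made up for the illustration.

    import random

    random.seed(0)
    N = 40
    weights = [random.uniform(-1, 1) for _ in range(N)]

    def cost(x):                               # toy objective over bit strings
        return sum(w for w, b in zip(weights, x) if b)

    def local_search(x):                       # first-improvement bit-flip hill climbing
        x = x[:]
        improved = True
        while improved:
            improved = False
            for i in range(N):
                candidate = x[:]
                candidate[i] ^= 1
                if cost(candidate) < cost(x):
                    x = candidate
                    improved = True
        return x

    def perturb(x, k=4):                       # flip k random bits to escape the local optimum
        x = x[:]
        for i in random.sample(range(N), k):
            x[i] ^= 1
        return x

    current = local_search([random.randint(0, 1) for _ in range(N)])
    for _ in range(100):
        candidate = local_search(perturb(current))
        if cost(candidate) < cost(current):    # acceptance criterion: accept improvements only
            current = candidate
    print("best cost found:", cost(current))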