Results 1 - 3 of 3
Regression Modeling in BackPropagation and Projection Pursuit Learning, 1994
Abstract

Cited by 69 (1 self)
We studied and compared two types of connectionist learning methods for model-free regression problems in this paper. One is the popular backpropagation learning (BPL), well known in the artificial neural networks literature; the other is projection pursuit learning (PPL), which has emerged in recent years in the statistical estimation literature. Both BPL and PPL are based on projections of the data in directions determined from interconnection weights. However, unlike the fixed nonlinear activations (usually sigmoidal) used for the hidden neurons in BPL, PPL systematically approximates the unknown nonlinear activations. Moreover, BPL estimates all the weights simultaneously at each iteration, while PPL estimates the weights cyclically (neuron-by-neuron and layer-by-layer) at each iteration. Although BPL and PPL have comparable training speed when based on a Gauss-Newton optimization algorithm, PPL proves more parsimonious in that it requires fewer hi...
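The feature that distinguishes PPL from BPL in this abstract — a trainable activation re-estimated from the data, with direction and activation updated cyclically — can be pictured with a minimal sketch. This is not the paper's implementation: the single hidden unit, the polynomial fit standing in for the nonparametric activation estimate, and the one-step Gauss-Newton direction update are all simplifying assumptions.

```python
import numpy as np

def fit_ppl_unit(X, r, n_cycles=10, degree=5):
    """Fit one projection-pursuit unit r ~ g(X @ w).

    Unlike BPL's fixed sigmoid, the activation g is re-estimated from
    the data (here: a polynomial fit), and w and g are updated
    cyclically rather than simultaneously."""
    # initialize the projection direction from a linear least-squares fit
    w, *_ = np.linalg.lstsq(X, r, rcond=None)
    w /= np.linalg.norm(w)
    for _ in range(n_cycles):
        z = X @ w
        g = np.poly1d(np.polyfit(z, r, degree))   # estimate g given w
        # one Gauss-Newton step on w given g
        J = g.deriv()(z)[:, None] * X             # Jacobian of g(X @ w) w.r.t. w
        dw, *_ = np.linalg.lstsq(J, r - g(z), rcond=None)
        w = w + dw
        w /= np.linalg.norm(w)                    # keep the direction unit-length
    return w, np.poly1d(np.polyfit(X @ w, r, degree))

# toy model-free regression: the target depends on a single projection
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = np.sin(X @ np.array([0.6, 0.8, 0.0]))
w, g = fit_ppl_unit(X, y)
print(np.mean((g(X @ w) - y) ** 2))               # training MSE, far below np.var(y)
```

A BPL-style unit would instead fix g as a sigmoid and update all weights at once by gradient descent; here the refit of g at every cycle is what lets the unit absorb an unknown nonlinearity with a single neuron.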
What's Wrong with A Cascaded Correlation Learning Network: A Projection Pursuit Learning Perspective
Abstract

Cited by 7 (0 self)
Cascaded correlation is a popular supervised learning architecture that dynamically grows layers of hidden neurons with fixed nonlinear activations (e.g., sigmoids), so that the network topology (size, depth) can be efficiently determined. Similar to a cascaded correlation learning network (CCLN), a projection pursuit learning network (PPLN) also dynamically grows the hidden neurons. Unlike a CCLN, where cascaded connections from the existing hidden units to the new candidate hidden unit are required to establish high-order nonlinearity in approximating the residual error, a PPLN approximates the high-order nonlinearity by using (more flexible) trainable nonlinear nodal activation functions. Moreover, the maximum correlation training criterion used in a CCLN results in a poorer estimate of hidden weights when compared with the minimum mean squared error criterion used in a PPLN. The CCLN is thus excluded for most regression applications where smooth interpolation of functional values is ...
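The "maximum correlation training criterion" the abstract contrasts with minimum mean squared error can be sketched as follows: a candidate unit's weights are trained to maximize the magnitude of the covariance between the unit's output and the residual error, not to reduce that error directly. This is a simplified single-output sketch, assuming a tanh activation and plain gradient ascent; the helper name `train_candidate` and the toy data are illustrative.

```python
import numpy as np

def train_candidate(H, residual, lr=0.1, epochs=200):
    """Train one cascade-correlation candidate unit.

    H holds the unit's inputs (the original inputs plus any existing
    hidden outputs -- the cascaded connections). The fixed tanh unit's
    weights are trained to maximize |covariance| with the residual
    error, rather than to minimize squared error."""
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.1, size=H.shape[1])
    e = residual - residual.mean()
    for _ in range(epochs):
        v = np.tanh(H @ w)                        # fixed sigmoid-like activation
        s = np.sign(np.sum((v - v.mean()) * e)) or 1.0
        # gradient ascent on |cov(v, e)|; d(tanh)/dx = 1 - tanh^2
        w += lr * s * ((e * (1 - v ** 2)) @ H) / len(e)
    return w

# toy residual that is a linear function of the inputs
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))
residual = X[:, 0] - 0.5 * X[:, 1]
w = train_candidate(X, residual)
print(np.corrcoef(np.tanh(X @ w), residual)[0, 1])
```

The criterion fixes only the correlation, not the scale or offset of the unit's response, which is one way to see why the abstract calls the resulting hidden-weight estimates poorer than minimum-MSE estimates for smooth regression.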
Cooperative Modular Neural Network Classifiers
Neurocomputing J., 1996
Abstract

Cited by 3 (1 self)
The current generation of non-modular neural network classifiers is unable to cope with classification problems that have a wide range of overlap among classes. This is due to the high coupling among the networks' hidden nodes. Modular neural network structures attempt to reduce this limitation via a divide-and-conquer approach. However, current modular designs do not offer a general alternative to the non-modular approach because they do not provide a reasonable balance between subtask simplification and decision-making efficiency. While the task-decomposition algorithm attempts to produce subtasks that are as simple as they can be, the modules are expected to give the multi-module decision-making strategy enough information to make an accurate global decision.
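The tension the abstract describes — simple per-class subtasks versus an informative global decision — can be pictured with a minimal combiner sketch. This is a hypothetical max-score decision stage over class-subset modules, not the paper's actual strategy; the prototype-distance scoring and the names `make_module` and `modular_predict` are illustrative assumptions.

```python
import numpy as np

# hypothetical per-class prototypes; each module sees only its own classes
protos = {0: np.array([0.0, 0.0]), 1: np.array([1.0, 0.0]),
          2: np.array([0.0, 1.0]), 3: np.array([1.0, 1.0])}

def make_module(classes):
    """A toy module for one decomposed subtask: it scores only its
    subset of classes (here by negative distance to a prototype)."""
    return (classes,
            lambda x, c=tuple(classes): np.array(
                [-np.linalg.norm(x - protos[k]) for k in c]))

def modular_predict(x, modules):
    """Global decision stage: each module reports scores for its own
    classes, and the combiner picks the best-scoring class overall."""
    best_class, best_score = None, -np.inf
    for classes, score_fn in modules:
        scores = score_fn(x)
        i = int(np.argmax(scores))
        if scores[i] > best_score:
            best_class, best_score = classes[i], scores[i]
    return best_class

modules = [make_module([0, 1]), make_module([2, 3])]
print(modular_predict(np.array([0.1, 0.9]), modules))  # → 2
```

The balance problem is visible even here: if each module's scores are not calibrated against the others', the combiner cannot take an accurate global decision, however simple the individual subtasks are.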