Results 1–4 of 4
Regression Modeling in BackPropagation and Projection Pursuit Learning
, 1994
"... We studied and compared two types of connectionist learning methods for modelfree regression problems in this paper. One is the popular backpropagation learning (BPL) well known in the artificial neural networks literature; the other is the projection pursuit learning (PPL) emerged in recent years ..."
Abstract

Cited by 66 (1 self)
We studied and compared two types of connectionist learning methods for model-free regression problems in this paper. One is the popular backpropagation learning (BPL), well known in the artificial neural networks literature; the other is projection pursuit learning (PPL), which emerged in recent years in the statistical estimation literature. Both BPL and PPL are based on projections of the data in directions determined from the interconnection weights. However, unlike the fixed nonlinear activations (usually sigmoidal) used for the hidden neurons in BPL, PPL systematically approximates the unknown nonlinear activations. Moreover, BPL estimates all the weights simultaneously at each iteration, while PPL estimates the weights cyclically (neuron-by-neuron and layer-by-layer) at each iteration. Although BPL and PPL have comparable training speed when based on a Gauss-Newton optimization algorithm, PPL proves more parsimonious in that it requires fewer hi...
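The abstract above describes PPL's key difference from backpropagation: each hidden neuron's nonlinear activation is itself fitted to the data, and weights are updated cyclically rather than all at once. A minimal sketch of one such neuron update is below, using a cubic polynomial as the trainable activation and plain gradient descent on the projection direction; these are simplifying assumptions for illustration, not the paper's exact (Gauss-Newton, smoother-based) procedure.

```python
# Hedged sketch of one projection pursuit learning (PPL) neuron update:
# alternate between (a) fitting a flexible 1-D activation to the residual
# along the current projection, and (b) nudging the projection direction.
# The cubic-polynomial activation and plain gradient step are assumptions
# made for brevity; the paper uses more sophisticated fitting.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                 # inputs
y = np.sin(X @ np.array([1.0, -0.5, 0.2]))    # toy regression target

def ppl_step(X, residual, degree=3, iters=20, lr=0.1):
    w = rng.normal(size=X.shape[1])           # projection direction
    w /= np.linalg.norm(w)
    for _ in range(iters):
        z = X @ w                             # project the data
        coeffs = np.polyfit(z, residual, degree)   # fit trainable activation
        pred = np.polyval(coeffs, z)
        # chain rule: d(error)/dw goes through the fitted activation
        dfdz = np.polyval(np.polyder(coeffs), z)
        grad = X.T @ ((pred - residual) * dfdz) / len(z)
        w -= lr * grad
        w /= np.linalg.norm(w)
    # refit the activation at the final direction
    coeffs = np.polyfit(X @ w, residual, degree)
    return w, coeffs

w, coeffs = ppl_step(X, y)
fit = np.polyval(coeffs, X @ w)
print("residual MSE:", np.mean((y - fit) ** 2))
```

In a full PPL network this update would cycle over all hidden neurons, each one refitting its activation and direction against the current residual.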
Finite Precision Error Analysis of Neural Network Hardware Implementations
 IEEE Trans. on Computers
, 1993
"... this paper, and can be referred to [3]. 11 The operations in the forward retrieving of an Llayer perceptron can be formulated as a forward affine transformation interleaved with a nonlinear scalar activation function: x l+1;j = f( ..."
Abstract

Cited by 36 (0 self)
this paper, and can be referred to [3]. The operations in the forward retrieval of an L-layer perceptron can be formulated as a forward affine transformation interleaved with a nonlinear scalar activation function: x_{l+1,j} = f(
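The (truncated) formulation above describes the standard layered forward pass: an affine transform followed by an elementwise scalar activation, repeated per layer. A minimal sketch, with random placeholder weights and a sigmoid activation assumed for illustration:

```python
# Sketch of the forward retrieval formulated above: each layer computes
# x_{l+1} = f(W_l x_l + b_l), an affine transform interleaved with a
# scalar (elementwise) nonlinearity f. Weights are random placeholders.
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def forward(x, weights, biases, f=sigmoid):
    for W, b in zip(weights, biases):
        x = f(W @ x + b)        # x_{l+1,j} = f(sum_i W[j,i] x_l[i] + b[j])
    return x

rng = np.random.default_rng(1)
dims = [4, 5, 3, 2]             # layer widths of a 3-layer perceptron
weights = [rng.normal(size=(dims[i + 1], dims[i])) for i in range(3)]
biases = [rng.normal(size=dims[i + 1]) for i in range(3)]
out = forward(rng.normal(size=4), weights, biases)
print(out.shape)
```

In a finite-precision hardware analysis like the paper's, each multiply, accumulate, and activation evaluation in this loop would be quantized, and the rounding errors propagated through the layers.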
What's Wrong with A Cascaded Correlation Learning Network: A Projection Pursuit Learning Perspective
"... Cascaded correlation is a popular supervised learning architecture that dynamically grows layers of hidden neurons of fixed nonlinear activations (e.g., sigmoids), so that the network topology (size, depth) can be efficiently determined. Similar to a cascaded correlation learning network (CCLN), a p ..."
Abstract

Cited by 7 (0 self)
Cascaded correlation is a popular supervised learning architecture that dynamically grows layers of hidden neurons with fixed nonlinear activations (e.g., sigmoids), so that the network topology (size, depth) can be efficiently determined. Similar to a cascaded correlation learning network (CCLN), a projection pursuit learning network (PPLN) also dynamically grows its hidden neurons. Unlike a CCLN, where cascaded connections from the existing hidden units to the new candidate hidden unit are required to establish high-order nonlinearity in approximating the residual error, a PPLN approximates the high-order nonlinearity by using (more flexible) trainable nonlinear nodal activation functions. Moreover, the maximum correlation training criterion used in a CCLN results in a poorer estimate of hidden weights when compared with the minimum mean squared error criterion used in a PPLN. The CCLN is thus excluded for most regression applications where smooth interpolation of functional values is ...
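The growth strategy shared by CCLN and PPLN, as the abstract describes it, is to add hidden units one at a time, each fitted against the current residual error. A minimal sketch of that idea is below; the fixed-tanh units on random projections and the plain least-squares output fit are simplifications for illustration, not either paper's training procedure.

```python
# Hedged sketch of dynamic network growth: add hidden units one at a time,
# each fitted to the current residual by a minimum-MSE (least squares)
# criterion, as in a PPLN. Random projections and fixed tanh activations
# are simplifying assumptions made here for brevity.
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 2))
y = X[:, 0] ** 2 - X[:, 1]               # toy regression target

def grow_network(X, y, n_units=10):
    residual = y.copy()
    units = []
    for _ in range(n_units):
        w = rng.normal(size=X.shape[1])  # candidate projection direction
        h = np.tanh(X @ w)               # fixed nonlinear activation
        # minimum-MSE output weights for this unit (slope + bias)
        A = np.stack([h, np.ones(len(h))], axis=1)
        coef, *_ = np.linalg.lstsq(A, residual, rcond=None)
        residual = residual - A @ coef   # shrink the residual error
        units.append((w, coef))
    return units, residual

units, residual = grow_network(X, y)
print("final residual MSE:", np.mean(residual ** 2))
```

Each added unit can only reduce (never increase) the mean squared residual, since the least-squares fit includes the zero solution; a CCLN would instead pick the candidate unit maximizing correlation with the residual, the criterion the abstract argues gives poorer weight estimates.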
3D Heart Modeling and Motion Estimation Based on Continuous Distance Transform Neural Networks and Affine Transform
 Journal of VLSI Signal Processing Systems for Signal, Image, and Video Technology
, 1998
"... In this paper, we apply the previously proposed continuous distance transform neural network (CDTNN) to represent 3D endocardial (inner) and epicardial (outer) contours and quantitatively estimate the motion of left ventricles of human hearts from ultrasound images acquired using transesophageal ec ..."
Abstract

Cited by 1 (0 self)
In this paper, we apply the previously proposed continuous distance transform neural network (CDTNN) to represent 3-D endocardial (inner) and epicardial (outer) contours and quantitatively estimate the motion of left ventricles of human hearts from ultrasound images acquired using transesophageal echocardiography. The CDTNN has many of the good properties of conventional distance transforms, which make it suitable for 3-D object representation and deformation estimation. We have successfully represented the 3-D epicardia and endocardia of left ventricles using CDTNNs trained with as few as 7.5% of the manually traced data. The mean absolute errors in testing for one patient over the 27 testing planes were 1.4 ± 1.2 mm for the endocardium vs. 1.3 ± 1.0 mm for the epicardium at end diastole, and 1.4 ± 1.2 mm for the endocardium vs. 1.2 ± 1.0 mm for the epicardium at end systole. The absolute error measured compares favorably with the human interobserver variability reported f...