Results 1 - 6 of 6
Regression Modeling in BackPropagation and Projection Pursuit Learning
, 1994
Abstract

Cited by 65 (1 self)
We studied and compared two types of connectionist learning methods for model-free regression problems in this paper. One is the popular backpropagation learning (BPL), well known in the artificial neural networks literature; the other is the projection pursuit learning (PPL), which has emerged in recent years in the statistical estimation literature. Both the BPL and the PPL are based on projections of the data in directions determined from interconnection weights. However, unlike the use of fixed nonlinear activations (usually sigmoidal) for the hidden neurons in BPL, the PPL systematically approximates the unknown nonlinear activations. Moreover, the BPL estimates all the weights simultaneously at each iteration, while the PPL estimates the weights cyclically (neuron-by-neuron and layer-by-layer) at each iteration. Although the BPL and the PPL have comparable training speed when based on a Gauss-Newton optimization algorithm, the PPL proves more parsimonious in that the PPL requires fewer hi...
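The contrast drawn above between simultaneous (BPL-style) and cyclic (PPL-style) weight updates can be sketched on a toy single-hidden-unit regression. Everything below — the data, learning rate, and function names — is our own illustration, not material from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: the target depends on a single projection of x (illustrative only).
X = rng.normal(size=(200, 3))
true_dir = np.array([0.8, -0.6, 0.0])
y = np.tanh(X @ true_dir)

def model(X, w, beta):
    """One hidden unit with a fixed tanh activation: yhat = beta * tanh(X @ w)."""
    return beta * np.tanh(X @ w)

def mse(X, y, w, beta):
    return float(np.mean((y - model(X, w, beta)) ** 2))

def bpl_step(X, y, w, beta, lr=0.1):
    """Backpropagation-style step: update ALL weights simultaneously."""
    h = np.tanh(X @ w)
    r = beta * h - y                                     # residuals
    grad_beta = 2 * np.mean(r * h)
    grad_w = 2 * (X.T @ (r * beta * (1 - h**2))) / len(y)
    return w - lr * grad_w, beta - lr * grad_beta        # joint update

def ppl_step(X, y, w, beta, lr=0.1):
    """Projection-pursuit-style step: update weights cyclically."""
    # 1) update the projection direction w with beta held fixed
    h = np.tanh(X @ w)
    r = beta * h - y
    w = w - lr * 2 * (X.T @ (r * beta * (1 - h**2))) / len(y)
    # 2) then refit the output weight beta by least squares, w held fixed
    h = np.tanh(X @ w)
    beta = float(h @ y) / float(h @ h)
    return w, beta

w1, b1 = rng.normal(size=3) * 0.1, 0.5
w2, b2 = w1.copy(), b1
for _ in range(200):
    w1, b1 = bpl_step(X, y, w1, b1)
    w2, b2 = ppl_step(X, y, w2, b2)
print(round(mse(X, y, w1, b1), 4), round(mse(X, y, w2, b2), 4))
```

Both schemes drive the error down on this toy problem; the point is only the shape of the update loop, not the relative performance claims, which the paper establishes with a Gauss-Newton optimizer rather than plain gradient steps.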
Synthesis of Tactical Plans for Robotic Excavation
, 1995
Abstract

Cited by 21 (4 self)
This thesis describes an approach to synthesizing plans for robotic excavators. Excavation tasks range from loading a pile of soil to cutting a geometrically described volume of earth — for a trench or foundation footing. The excavation task can be stated in terms familiar to researchers in robotics and artificial intelligence: transform the world from its current state to another state. Two important characteristics, however, distinguish the excavation domain. First, soil is deformable, and hence a complete state-space description of terrain to be excavated requires a high-dimensional representation. But for the state-based representations that traditional planners use, very large spaces are simply infeasible. Second, the response of soil varies immensely, and in general it is not possible to analytically describe the mechanics of soil motion and its interaction with tools. A soil's shear strength depends not only on its physical and chemical makeup, but also on factors such as the compaction experienced in the past. Thus a robotic excavator is forced to deal with approximate and incomplete models of its actions and their results.
What's Wrong with A Cascaded Correlation Learning Network: A Projection Pursuit Learning Perspective
Abstract

Cited by 7 (0 self)
Cascaded correlation is a popular supervised learning architecture that dynamically grows layers of hidden neurons of fixed nonlinear activations (e.g., sigmoids), so that the network topology (size, depth) can be efficiently determined. Similar to a cascaded correlation learning network (CCLN), a projection pursuit learning network (PPLN) also dynamically grows the hidden neurons. Unlike a CCLN, where cascaded connections from the existing hidden units to the new candidate hidden unit are required to establish high-order nonlinearity in approximating the residual error, a PPLN approximates the high-order nonlinearity by using (more flexible) trainable nonlinear nodal activation functions. Moreover, the maximum correlation training criterion used in a CCLN results in a poorer estimate of hidden weights when compared with the minimum mean squared error criterion used in a PPLN. The CCLN is thus excluded for most regression applications where smooth interpolation of functional values is ...
Logistic Response Projection Pursuit
, 1993
Abstract

Cited by 5 (2 self)
A highly flexible nonparametric regression model for predicting a response y given covariates x is the projection pursuit regression (PPR) model y = h(x) = β₀ + Σⱼ βⱼ fⱼ(αⱼᵀx), where the fⱼ are general smooth functions with mean zero and norm one, and Σ_{k=1}^d α²_{kj} = 1. With a binary response y, the common approach to fitting a PPR model is to fit y to minimize average squared error without explicitly considering the binary nature of the response. We develop an alternative logistic response projection pursuit model, in which y is taken to be binomial(p), where log(p / (1 − p)) = h(x). This may be fit by minimizing either binomial deviance or average squared error. We compare the logistic response models to the linear model on simulated data. In addition, we develop a generalized projection pursuit framework for exponential family models. We also present a smoothing spline based PPR algorithm, and compare it to supersmoother and polynomial based PPR algorithms...
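A minimal sketch of evaluating the PPR predictor h(x) and its logistic-response variant. The directions αⱼ, coefficients βⱼ, and ridge functions fⱼ below are illustrative stand-ins we chose by hand; in the actual method the fⱼ are estimated from data:

```python
import numpy as np

# Illustrative PPR predictor: h(x) = beta0 + sum_j beta_j * f_j(alpha_j^T x),
# with unit-norm directions alpha_j. The ridge functions f_j here are fixed
# toy choices standing in for estimated smooth functions.
alphas = np.array([[1.0, 0.0], [0.6, 0.8]])   # rows: unit-norm alpha_j
betas = np.array([0.5, -1.2])
beta0 = 0.3
ridge_fns = [np.sin, np.tanh]                 # stand-ins for the smooth f_j

def ppr_predict(x):
    z = alphas @ x                            # projections alpha_j^T x
    return beta0 + sum(b * f(zj) for b, f, zj in zip(betas, ridge_fns, z))

def logistic_ppr_prob(x):
    # Logistic-response variant: log(p / (1 - p)) = h(x), i.e. p = sigmoid(h(x)).
    return 1.0 / (1.0 + np.exp(-ppr_predict(x)))

x = np.array([0.2, -0.5])
print(ppr_predict(x), logistic_ppr_prob(x))
```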
Initialization of Directions in Projection Pursuit Learning
Abstract
The Projection Pursuit Learner is a multi-class classifier that resembles a two-layer neural network in which the sigmoid activation functions of the hidden neurons have been replaced by an interpolating polynomial. This modification increases the flexibility of the model but also makes it more inclined to get stuck in a local minimum during gradient-based training. This problem can be alleviated to a certain extent by replacing the random initialization of the parameters with proper heuristics. In this paper we propose to initialize the projection directions by means of feature space transformation methods such as independent component analysis (ICA), principal component analysis (PCA), linear discriminant analysis (LDA) and springy discriminant analysis (SDA). We find that with this refinement the number of processing units can be reduced by 10–40%.
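A sketch of one of the heuristics mentioned above: initializing the projection directions from the leading principal components of the training data instead of at random. The function name and toy data are our own illustration (LDA, ICA, or SDA would slot into the same place):

```python
import numpy as np

def pca_init_directions(X, n_directions):
    """Return n_directions unit-norm projection directions (rows) taken from
    the leading eigenvectors of the sample covariance of X."""
    Xc = X - X.mean(axis=0)                    # center the data
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)     # eigh: ascending eigenvalues
    order = np.argsort(eigvals)[::-1]          # sort descending by variance
    return eigvecs[:, order[:n_directions]].T

# Toy data whose variance is concentrated along the first coordinate axis.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 5)) * np.array([3.0, 1.0, 0.5, 0.2, 0.1])
W = pca_init_directions(X, 2)
print(W.shape)
```

Because the eigenvectors of a symmetric covariance matrix are orthonormal, the returned rows are already unit-norm, matching the usual constraint on projection directions.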
Interpretable Projection Pursuit
, 1989
Abstract
The goal of this thesis is to modify projection pursuit by trading accuracy for interpretability. The modification produces a more parsimonious and understandable model without sacrificing the structure which projection pursuit seeks. The method retains the nonlinear versatility of projection pursuit while clarifying the results. Following an introduction which outlines the dissertation, the first and second chapters contain the technique as applied to exploratory projection pursuit and projection pursuit regression, respectively. The interpretability of a description is measured as the simplicity of the coefficients which define its linear projections. Several interpretability indices for a set of vectors are defined based on the ideas of rotation in factor analysis and entropy. The two methods require slightly different indices due to their contrary goals. A roughness penalty weighting approach is used to search for a more parsimonious...