Results 1–10 of 22
Centroidal Voronoi tessellations: Applications and algorithms
SIAM Review, 1999
Cited by 241 (25 self)
A centroidal Voronoi tessellation is a Voronoi tessellation whose generating points are the centroids (centers of mass) of the corresponding Voronoi regions. We give some applications of such tessellations to problems in image compression, quadrature, finite difference methods, distribution of resources, cellular biology, statistics, and the territorial behavior of animals. We discuss methods for computing these tessellations, provide some analyses concerning both the tessellations and the methods for their determination, and, finally, present the results of some numerical experiments.
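One classical computation method covered by this survey is Lloyd's algorithm: alternately assign points to their nearest generator and move each generator to the centroid of its Voronoi region. A minimal Monte Carlo sketch for a uniform density on the unit square (the generator count, sample size, and iteration count below are illustrative choices, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
gens = rng.random((8, 2))        # initial generating points
pts = rng.random((20000, 2))     # samples approximating the uniform density

for _ in range(30):
    # assign each sample to its nearest generator (its Voronoi region)
    d = ((pts[:, None, :] - gens[None, :, :]) ** 2).sum(axis=2)
    owner = d.argmin(axis=1)
    # move each generator to the centroid of its region
    for k in range(len(gens)):
        mask = owner == k
        if mask.any():
            gens[k] = pts[mask].mean(axis=0)

print(gens)
```

At a fixed point of this iteration the generators coincide with the centroids of their regions, which is precisely the defining property of a centroidal Voronoi tessellation.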
Regression Modeling in Back-Propagation and Projection Pursuit Learning
1994
Cited by 66 (1 self)
In this paper we study and compare two types of connectionist learning methods for model-free regression problems. One is the popular back-propagation learning (BPL), well known in the artificial neural networks literature; the other is projection pursuit learning (PPL), which emerged in recent years in the statistical estimation literature. Both BPL and PPL are based on projections of the data in directions determined from interconnection weights. However, unlike the fixed nonlinear activations (usually sigmoidal) used for the hidden neurons in BPL, PPL systematically approximates the unknown nonlinear activations. Moreover, BPL estimates all the weights simultaneously at each iteration, while PPL estimates the weights cyclically (neuron-by-neuron and layer-by-layer) at each iteration. Although BPL and PPL have comparable training speed when based on a Gauss-Newton optimization algorithm, PPL proves more parsimonious in that it requires a fewer hi...
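The key contrast described above, a data-estimated activation rather than a fixed sigmoid, can be illustrated with a toy sketch. For simplicity this uses crude random search over directions and a polynomial smoother; the paper's PPL instead uses Gauss-Newton optimization and cyclic neuron-by-neuron estimation, so treat this purely as an illustration of the idea:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 3))
true_w = np.array([1.0, -1.0, 0.5]); true_w /= np.linalg.norm(true_w)
y = np.sin(2 * X @ true_w)              # a single unknown "ridge" function

best = (np.inf, None, None)
for _ in range(200):
    w = rng.normal(size=3); w /= np.linalg.norm(w)   # candidate direction
    z = X @ w                                         # 1-D projection
    g = np.polynomial.Polynomial.fit(z, y, deg=7)     # estimated activation
    mse = np.mean((y - g(z)) ** 2)
    if mse < best[0]:
        best = (mse, w, g)

print(best[0])  # small relative to var(y) when a good direction is found
```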
Nonparametric Multivariate Density Estimation: A Comparative Study
IEEE Trans. Signal Processing, 1994
Cited by 44 (1 self)
This paper algorithmically and empirically studies two major types of nonparametric multivariate density estimation techniques, where no assumption is made that the data are drawn from any known parametric family of distributions. The first type is the popular kernel method (and several of its variants), which uses locally tuned radial basis (e.g., Gaussian) functions to interpolate the multidimensional density; the second type is based on an exploratory projection pursuit technique, which interprets the multidimensional density through the construction of several one-dimensional densities along highly "interesting" projections of the multidimensional data. Performance evaluations using training data from mixture Gaussian and mixture Cauchy densities are presented. The results show that the curse of dimensionality and the sensitivity of control parameters have a much more adverse impact on the kernel density estimators than on the projection pursuit density estimators.
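A minimal sketch of the first type: a Gaussian kernel density estimator with a single bandwidth h, the kind of control parameter to which the paper finds kernel estimators sensitive. The function name and parameter values here are illustrative, not the paper's:

```python
import numpy as np

def kde(query, data, h):
    """Gaussian product-kernel density estimate at the query points."""
    d = data.shape[1]
    diff = query[:, None, :] - data[None, :, :]
    sq = (diff ** 2).sum(axis=2) / h**2
    norm = (2 * np.pi) ** (d / 2) * h**d * len(data)
    return np.exp(-0.5 * sq).sum(axis=1) / norm

rng = np.random.default_rng(0)
data = rng.normal(size=(1000, 2))       # standard 2-D normal sample
q = np.zeros((1, 2))
est = kde(q, data, h=0.2)
print(est)  # an estimate near the true density 1/(2*pi) ~ 0.159 at the mode
```

In higher dimensions the same estimator needs far more data for comparable accuracy, which is the curse of dimensionality the study quantifies.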
A review of dimension reduction techniques
1997
Cited by 31 (4 self)
The problem of dimension reduction is introduced as a way to overcome the curse of dimensionality when dealing with vector data in high-dimensional spaces and as a modelling tool for such data. It is defined as the search for a low-dimensional manifold that embeds the high-dimensional data. A classification of dimension reduction problems is proposed. A survey of several techniques for dimension reduction is given, including principal component analysis, projection pursuit and projection pursuit regression, principal curves, and methods based on topologically continuous maps, such as Kohonen’s maps or the generative topographic mapping. Neural network implementations of several of these techniques are also reviewed, such as the projection pursuit learning network and the BCM neuron with an objective function. Several appendices complement the mathematical treatment of the main text.
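The simplest technique in the survey, principal component analysis, already illustrates the manifold view: project onto the top-k directions of variance. A sketch, assuming for illustration data lying near a 2-D plane embedded in 5-D:

```python
import numpy as np

def pca(X, k):
    """Project X onto its top-k principal directions via SVD."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

rng = np.random.default_rng(0)
Z = rng.normal(size=(200, 2))                     # latent 2-D coordinates
A = rng.normal(size=(2, 5))                       # embedding into 5-D
X = Z @ A + 0.01 * rng.normal(size=(200, 5))      # small off-manifold noise
Y = pca(X, 2)
print(Y.shape)  # (200, 2)
```

Because the manifold here is linear, two components recover essentially all the variance; the nonlinear methods in the survey (principal curves, topographic maps) target the cases where PCA cannot.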
Projection pursuit for exploratory supervised classification
Journal of Computational and Graphical Statistics, 2004
Cited by 15 (4 self)
In high-dimensional data, one often seeks a few interesting low-dimensional projections that reveal important aspects of the data. Projection pursuit is a procedure for searching high-dimensional data for interesting low-dimensional projections via the optimization of a criterion function called the projection pursuit index. Very few projection pursuit indices incorporate class or group information in the calculation, so few can be adequately applied to supervised classification problems. We introduce new indices derived from linear discriminant analysis that can be used for exploratory supervised classification.
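An index in this spirit, though not necessarily the paper's exact definition, compares between-group spread to total spread of the projected data, exactly the ratio that linear discriminant analysis maximizes. A hypothetical 1-D sketch:

```python
import numpy as np

def lda_index(a, X, labels):
    """Between-group share of total spread along direction a; in [0, 1]."""
    z = X @ a / np.linalg.norm(a)
    grand = z.mean()
    between = within = 0.0
    for g in np.unique(labels):
        zg = z[labels == g]
        between += len(zg) * (zg.mean() - grand) ** 2
        within += ((zg - zg.mean()) ** 2).sum()
    return between / (between + within)

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (100, 2)),
               rng.normal([4, 0], 1, (100, 2))])   # two classes split in x
labels = np.repeat([0, 1], 100)
hi = lda_index(np.array([1.0, 0.0]), X, labels)    # separating direction
lo = lda_index(np.array([0.0, 1.0]), X, labels)    # uninformative direction
print(hi, lo)
```

Optimizing such an index over projection directions is what turns projection pursuit into an exploratory classification tool.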
High Dimensional Feature Reduction Via Projection Pursuit
1995
Cited by 14 (3 self)
(No abstract available; the indexed excerpt contains only the thesis table of contents.)
Three-Dimensional Projection Pursuit
J. Royal Statistical Society, Series C, 1995
Cited by 10 (0 self)
This article discusses various aspects of projection pursuit into three dimensions. The aim of projection pursuit is to find interesting linear combinations of variables in a multivariate data set. The precise definition of "interesting" is given later, but clusters and other forms of nonlinear structure are interesting. One- and two-dimensional projection pursuit have been dealt with extensively in the literature, and some excellent software implementations are available. The benefit of projection into three dimensions is that more complex structures can be identified than with lower-dimensional projections. Projection pursuit into three dimensions is particularly attractive for two further perceptual reasons. Firstly, colours naturally correspond to 3-vectors, for example through the RGB representation. Secondly, point clouds and other objects in three dimensions can be investigated on computer screens, for example through spinning 3D plots, which are immediately comprehensible because of our 3D intuition. These reasons are important when applying 3D projection pursuit to multispectral images (colour) and multivariate data sets (intuition). Section 2 briefly describes projection pursuit and includes details on projection indices and the process of sphering. Section 3 explains that we have chosen to extend Jones and Sibson's (1987) well-known moments index into three dimensions because of its computational efficiency. The formulae for the moments index were analytically computed by the computer algebra package REDUCE (see Section 3.3). Section 3 also addresses the differentiation and optimization of the moments index, examines how outliers can be treated to provide better projection solutions, and discusses how optimal projections can be rotated to give solutions that a...
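The sphering step mentioned above (centre the data and transform it to identity covariance before any index is evaluated) and a one-dimensional moment-type index can be sketched as follows. The moments_index formula is the familiar 1-D Jones-Sibson form, shown only to indicate the kind of quantity the paper extends to three dimensions:

```python
import numpy as np

def sphere(X):
    """Centre X and transform it to identity sample covariance."""
    Xc = X - X.mean(axis=0)
    vals, vecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    return Xc @ vecs / np.sqrt(vals)

def moments_index(z):
    """1-D moment index from skewness k3 and excess kurtosis k4."""
    k3 = np.mean(z ** 3)
    k4 = np.mean(z ** 4) - 3
    return (k3 ** 2 + k4 ** 2 / 4) / 12

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4)) @ rng.normal(size=(4, 4))  # correlated data
S = sphere(X)
print(np.cov(S, rowvar=False).round(6))   # ~ identity after sphering
print(moments_index(S[:, 0]))             # ~ 0 for Gaussian projections
```

Because Gaussian projections score near zero, maximizing such an index over directions seeks the non-Gaussian (clustered or otherwise structured) views.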
Projection Pursuit for Discrete Data
1983
Cited by 4 (2 self)
This paper develops projection pursuit for discrete data using the discrete Radon transform. Discrete projection pursuit is presented as an exploratory method for finding informative low-dimensional views of data such as binary vectors, rankings, phylogenetic trees, or graphs. We show that for most data sets, most projections are close to uniform; thus, informative summaries are ones deviating from uniformity. Syllabic data from several of Plato’s great works are used to illustrate the methods. Along with some basic distribution theory, an automated procedure for computing informative projections is introduced.
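The deviation-from-uniformity idea can be illustrated loosely for binary vectors. The parity projection below is a hypothetical stand-in, not the paper's discrete Radon transform: project onto the parity of a coordinate subset and measure how far the projected distribution is from uniform.

```python
import numpy as np

def parity_projection_tv(data, subset):
    """Total-variation distance of a parity projection from uniform."""
    vals = data[:, subset].sum(axis=1) % 2
    freq = np.bincount(vals, minlength=2) / len(data)
    return np.abs(freq - 0.5).sum() / 2

rng = np.random.default_rng(0)
iid = rng.integers(0, 2, size=(2000, 8))   # unstructured binary data
dep = iid.copy()
dep[:, 1] = dep[:, 0]                      # coordinates 0 and 1 tied
tv_iid = parity_projection_tv(iid, [0, 1])
tv_dep = parity_projection_tv(dep, [0, 1])
print(tv_iid, tv_dep)  # near 0 for iid data; exactly 0.5 for the tied pair
```

The unstructured data projects to (nearly) uniform, as the theory predicts for most projections; the dependent coordinates produce a maximally non-uniform, hence informative, projection.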
Experiences with bivariate projection pursuit indices
1995
Cited by 3 (2 self)
We investigate the behaviour of several popular projection pursuit indices for exploratory data analysis. Two indices, the Friedman-Tukey and the entropy index, are based on kernel density estimation, while the Legendre, Hermite, and natural Hermite indices are obtained by orthogonal series expansions. Special emphasis is placed on the case of two-dimensional projections. We apply the indices to two different datasets and discuss the choice of the bandwidth and the order of the expansions.
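A kernel-based index of the Friedman-Tukey type for a two-dimensional projection can be sketched as the average estimated density over the projected points, with a Gaussian kernel. The exact index definitions in the paper may differ; the bandwidth h is the tuning choice the authors discuss:

```python
import numpy as np

def ft_index(Z, h):
    """Average kernel density over 2-D projected points Z (n, 2)."""
    diff = Z[:, None, :] - Z[None, :, :]
    sq = (diff ** 2).sum(axis=2) / (2 * h ** 2)
    K = np.exp(-sq) / (2 * np.pi * h ** 2)
    return K.mean()

rng = np.random.default_rng(0)
diffuse = rng.normal(size=(300, 2))                 # no structure
clustered = np.vstack([rng.normal(-2, 0.3, (150, 2)),
                       rng.normal(+2, 0.3, (150, 2))])  # two tight clusters
c = ft_index(clustered, h=0.5)
d = ft_index(diffuse, h=0.5)
print(c > d)  # True: clustered projections score higher
```

Smaller bandwidths sharpen the index's response to fine structure but increase its variance, which is the trade-off examined in the paper.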