Results 11 - 20 of 208
A Comprehensive Survey of Fitness Approximation in Evolutionary Computation
, 2003
Abstract

Cited by 96 (6 self)
Evolutionary algorithms (EAs) have received increasing interest in both academia and industry. One main difficulty in applying EAs to real-world applications is that they usually need a large number of fitness evaluations before a satisfactory result can be obtained. However, fitness evaluations are not always straightforward in many real-world applications: either an explicit fitness function does not exist, or evaluating the fitness is computationally very expensive. In both cases, it is necessary to estimate the fitness function by constructing an approximate model. In this paper, a comprehensive survey of research on fitness approximation in evolutionary computation is presented. The main issues, such as approximation levels, approximate model management schemes, and model construction techniques, are reviewed. To conclude, open questions and interesting issues in the field are discussed.
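The model-management idea the survey covers can be illustrated with a minimal sketch. This is not the survey's own algorithm: the nearest-neighbour surrogate, the sphere objective, and all parameter values are illustrative assumptions. Each generation, offspring are ranked on the cheap surrogate and only the surrogate-best candidate receives an expensive true evaluation.

```python
import random

def true_fitness(x):
    """Stand-in for an expensive black-box objective (here: the sphere function)."""
    return sum(xi * xi for xi in x)

def surrogate(x, archive):
    """Nearest-neighbour approximate model built from archived true evaluations."""
    nearest = min(archive, key=lambda rec: sum((a - b) ** 2 for a, b in zip(rec[0], x)))
    return nearest[1]

def surrogate_assisted_es(dim=2, generations=30, offspring=10, seed=0):
    rng = random.Random(seed)
    parent = [rng.uniform(-5, 5) for _ in range(dim)]
    archive = [(list(parent), true_fitness(parent))]  # records of (point, true fitness)
    for _ in range(generations):
        # Gaussian mutation produces a pool of candidate offspring.
        pool = [[p + rng.gauss(0, 0.5) for p in parent] for _ in range(offspring)]
        # Pre-selection: rank the pool on the cheap surrogate ...
        best_guess = min(pool, key=lambda x: surrogate(x, archive))
        # ... and spend the one expensive true evaluation on the surrogate-best candidate.
        f = true_fitness(best_guess)
        if f < min(rec[1] for rec in archive):
            parent = best_guess
        archive.append((best_guess, f))
    return min(rec[1] for rec in archive)

best = surrogate_assisted_es()
print(best)  # best true fitness found with only ~1 true evaluation per generation
```

The point of the sketch is the evaluation budget: the surrogate filters the pool so the true fitness function is called once per generation instead of once per offspring.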
Local Error Estimates for Radial Basis Function Interpolation of Scattered Data
 IMA J. Numer. Anal
, 1992
Abstract

Cited by 93 (20 self)
Introducing a suitable variational formulation for the local error of scattered data interpolation by radial basis functions φ(r), the error can be bounded by a term depending on the Fourier transform of the interpolated function f and a certain "Kriging function", which allows a formulation as an integral involving the Fourier transform of φ. The explicit construction of locally well-behaving admissible coefficient vectors makes the Kriging function bounded by some power of the local density h of data points. This leads to error estimates for interpolation of functions f whose Fourier transform f̂ is "dominated" by the nonnegative Fourier transform Φ̂ of Φ(x) = φ(‖x‖) in the sense that ∫ |f̂|² Φ̂⁻¹ dt < ∞. Approximation orders are arbitrarily high for interpolation with Hardy multiquadrics, inverse multiquadrics, and Gaussian kernels. This was also proven in recent papers by Madych and Nelson, using a reproducing kernel Hilbert space approach and requiring the same h...
EigenSkin: Real Time Large Deformation Character Skinning in Hardware
 In ACM SIGGRAPH Symposium on Computer Animation
, 2002
Abstract

Cited by 86 (4 self)
We present a technique which allows subtle nonlinear quasistatic deformations of articulated characters to be compactly approximated by data-dependent eigenbases which are optimized for real-time rendering on commodity graphics hardware. The method extends the common Skeletal-Subspace Deformation (SSD) technique to provide efficient approximations of the complex deformation behaviours exhibited in simulated, measured, and artist-drawn characters. Instead of storing displacements for key poses (which may be numerous), we precompute principal components of the deformation influences for individual kinematic joints, and so construct error-optimal eigenbases describing each joint's deformation subspace. Pose-dependent deformations are then expressed in terms of these reduced eigenbases, allowing precomputed coefficients of the eigenbasis to be interpolated at run time. Vertex program hardware can then efficiently render nonlinear skin deformations using a small number of eigendisplacements stored in graphics hardware. We refer to the final resulting character skinning construct as the model's EigenSkin. Animation results are presented for a very large nonlinear finite element model of a human hand rendered in real time at minimal cost to the main CPU.
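The core reduction step, projecting per-joint displacement samples onto a principal-component eigenbasis, can be sketched in a few lines. This is a toy illustration, not the paper's pipeline: the single-component power iteration, the 3-D "displacement" data, and the noise level are all assumptions made for the example.

```python
import random

def top_principal_component(samples, iters=200):
    """Power iteration on the covariance of mean-centred samples (rows = observations)."""
    n, d = len(samples), len(samples[0])
    mean = [sum(s[j] for s in samples) / n for j in range(d)]
    centred = [[s[j] - mean[j] for j in range(d)] for s in samples]
    v = [1.0] * d
    for _ in range(iters):
        # w = (X^T X) v, computed as X^T (X v) to avoid forming the covariance matrix.
        proj = [sum(c[j] * v[j] for j in range(d)) for c in centred]
        w = [sum(proj[i] * centred[i][j] for i in range(n)) for j in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return mean, v

# Toy "displacement" data: a vertex moves along one fixed direction as a joint bends,
# plus a little noise, so the deformation subspace is essentially one-dimensional.
rng = random.Random(1)
direction = [0.6, 0.8, 0.0]
poses = [[a * dj + rng.gauss(0, 0.01) for dj in direction] for a in (0.0, 0.5, 1.0, 1.5, 2.0)]
mean, basis = top_principal_component(poses)

# Reconstruct one pose from the single-component eigenbasis: one coefficient per pose.
coeff = sum((poses[3][j] - mean[j]) * basis[j] for j in range(3))
recon = [mean[j] + coeff * basis[j] for j in range(3)]
err = max(abs(recon[j] - poses[3][j]) for j in range(3))
print(err)  # small, since the data is nearly rank one
```

In the paper's setting the same idea runs per joint over many vertices, and only the small per-pose coefficients need interpolating at run time.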
Priors, Stabilizers and Basis Functions: from regularization to radial, tensor and additive splines
, 1993
Abstract

Cited by 78 (14 self)
We had previously shown that regularization principles lead to approximation schemes which are equivalent to networks with one layer of hidden units, called Regularization Networks. In particular we had discussed how standard smoothness functionals lead to a subclass of regularization networks, the well-known Radial Basis Functions approximation schemes. In this paper we show that regularization networks encompass a much broader range of approximation schemes, including many of the popular general additive models and some of the neural networks. In particular we introduce new classes of smoothness functionals that lead to different classes of basis functions. Additive splines as well as some tensor product splines can be obtained from appropriate classes of smoothness functionals. Furthermore, the same extension that leads from Radial Basis Functions (RBF) to Hyper Basis Functions (HBF) also leads from additive models to ridge approximation models, containing as special cases Breiman's hinge functions and some forms of Projection Pursuit Regression. We propose to use the term Generalized Regularization Networks for this broad class of approximation schemes that follow from an extension of regularization. In the probabilistic interpretation of regularization, the different classes of basis functions correspond to different classes of prior probabilities on the approximating function spaces, and therefore to different types of smoothness assumptions. In the final part of the paper, we show the relation between activation functions of the Gaussian and sigmoidal type by considering the simple case of the kernel G(x)=x. In summary,
A model of hippocampal function
, 1994
Abstract

Cited by 68 (7 self)
The firing rate maps of hippocampal place cells recorded in a freely moving rat are viewed as a set of approximate radial basis functions over the (2D) environment of the rat. It is proposed that these firing fields are constructed during exploration from 'sensory inputs' (tuning curve responses to the distance of cues from the rat) and used by cells downstream to construct firing rate maps that approximate any desired surface over the environment. It is shown that, when a rat moves freely in an open field, the phase of firing of a place cell (with respect to the EEG θ rhythm) contains information as to the relative position of its firing field from the rat. A model of hippocampal function is presented in which the firing rate maps of cells downstream of the hippocampus provide a 'population vector' encoding the instantaneous direction of the rat from a previously encountered reward site, enabling navigation to it. A neuronal simulation, involving reinforcement only at the goal location, provides good agreement with single cell recording from the hippocampal region, and can navigate to reward sites in open fields using sensory input from environmental cues. The system requires only brief exploration, performs latent learning, and can return to a goal location after encountering it only once.
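The 'population vector' readout mentioned above has a standard form that is easy to sketch: decode a direction as the firing-rate-weighted vector sum of the cells' preferred directions. This toy example (evenly spaced preferred directions, rectified-cosine tuning, and the goal direction 0.8 rad) is an illustrative assumption, not the paper's simulation.

```python
import math

def population_vector(preferred_dirs, rates):
    """Decode a direction (radians) as the rate-weighted vector sum of preferred directions."""
    x = sum(r * math.cos(d) for d, r in zip(preferred_dirs, rates))
    y = sum(r * math.sin(d) for d, r in zip(preferred_dirs, rates))
    return math.atan2(y, x)

# 16 cells with evenly spaced preferred directions and rectified-cosine tuning
# around an assumed true goal direction of 0.8 radians.
true_dir = 0.8
dirs = [2 * math.pi * k / 16 for k in range(16)]
rates = [max(0.0, math.cos(d - true_dir)) for d in dirs]
decoded = population_vector(dirs, rates)
print(decoded)  # recovers the goal direction, 0.8
```

With evenly spaced preferred directions the rectification terms cancel in the vector sum, so the decode is exact here; with irregular tuning it is only approximate.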
Novel Cluster-Based Probability Model for Texture Synthesis, Classification, and Compression
 In Visual Communications and Image Processing
, 1993
Abstract

Cited by 67 (6 self)
We present a new probabilistic modeling technique for high-dimensional vector sources, and consider its application to the problems of texture synthesis, classification, and compression. Our model combines kernel estimation with clustering, to obtain a semi-parametric probability mass function estimate which summarizes, rather than contains, the training data. Because the model is cluster-based, it is inferable from a limited set of training data, despite the model's high dimensionality. Moreover, its functional form allows recursive implementation that avoids exponential growth in required memory as the number of dimensions increases. Experimental results are presented for each of the three applications considered.

1. INTRODUCTION
In many information processing tasks individual data samples exhibit a great deal of statistical interdependence, and should be treated jointly (e.g., in vectors) rather than separately. For some tasks this requires modeling vectors probabilistically....
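The combination of clustering and kernel estimation can be sketched in one dimension: cluster the data, then place one kernel per cluster centre, weighted by cluster size, so the density summarizes the data rather than storing every sample. The Lloyd-style k-means, the Gaussian kernel, and the bandwidth are illustrative assumptions, not the paper's exact estimator.

```python
import math
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain Lloyd iterations on 1-D points; returns centres and cluster sizes."""
    rng = random.Random(seed)
    centres = rng.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            groups[min(range(k), key=lambda j: (p - centres[j]) ** 2)].append(p)
        centres = [sum(g) / len(g) if g else centres[i] for i, g in enumerate(groups)]
    return centres, [len(g) for g in groups]

def cluster_density(x, centres, weights, h=0.3):
    """Semi-parametric estimate: one Gaussian kernel per cluster centre,
    weighted by how many training samples the cluster summarizes."""
    n = sum(weights)
    return sum(w * math.exp(-0.5 * ((x - c) / h) ** 2) / (h * math.sqrt(2 * math.pi))
               for c, w in zip(centres, weights)) / n

# Two well-separated blobs; the model keeps only 2 centres, not 200 samples.
rng = random.Random(2)
data = [rng.gauss(-2, 0.3) for _ in range(100)] + [rng.gauss(2, 0.3) for _ in range(100)]
centres, weights = kmeans(data, 2)
print(cluster_density(-2.0, centres, weights), cluster_density(0.0, centres, weights))
```

The estimate is high near the blobs and near zero between them, while the stored model is just two centres and two counts.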
Connectionist Probability Estimation in HMM Speech Recognition
 IEEE Transactions on Speech and Audio Processing
, 1992
Abstract

Cited by 62 (16 self)
This report is concerned with integrating connectionist networks into a hidden Markov model (HMM) speech recognition system. This is achieved through a statistical understanding of connectionist networks as probability estimators, first elucidated by Herve Bourlard. We review the basis of HMM speech recognition, and point out the possible benefits of incorporating connectionist networks. We discuss some issues necessary to the construction of a connectionist HMM recognition system, and describe the performance of such a system, including evaluations on the DARPA database, in collaboration with Mike Cohen and Horacio Franco of SRI International. In conclusion, we show that a connectionist component improves a state-of-the-art HMM system.

Part I. INTRODUCTION
Over the past few years, connectionist models have been widely proposed as a potentially powerful approach to speech recognition (e.g. Makino et al. (1983), Huang et al. (1988) and Waibel et al. (1989)). However, whilst connec...
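The standard trick behind this integration, often called scaled likelihoods, is small enough to show directly: a network trained as a classifier estimates posteriors P(state | x), but HMM decoding wants likelihoods P(x | state), so the posteriors are divided by the state priors (Bayes' rule, dropping the state-independent P(x)). The three-state numbers below are made up for illustration.

```python
def scaled_likelihoods(posteriors, priors):
    """Turn network posteriors P(state | x) into scores proportional to P(x | state)
    by dividing out the state priors; the constant factor P(x) cancels in decoding."""
    return [p / q for p, q in zip(posteriors, priors)]

# Hypothetical frame: the network favours state 1, but state 1 is also very common
# in the training data, so after scaling, state 0 gives the best likelihood score.
posteriors = [0.2, 0.5, 0.3]   # network outputs for one frame, sum to 1
priors     = [0.1, 0.6, 0.3]   # relative state frequencies in the training data
print(scaled_likelihoods(posteriors, priors))  # [2.0, 0.833..., 1.0]
```

The example shows why the scaling matters: ranking by raw posteriors and ranking by scaled likelihoods can disagree whenever the priors are unbalanced.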
Creating Surfaces from Scattered Data Using Radial Basis Functions
 in Mathematical Methods for Curves and Surfaces
, 1995
Abstract

Cited by 56 (11 self)
This paper gives an introduction to certain techniques for the construction of geometric objects from scattered data. Special emphasis is put on interpolation methods using compactly supported radial basis functions.

§1. Introduction
We assume a sample of multivariate scattered data to be given as a set X = {x₁, …, x_N} of N pairwise distinct points x₁, …, x_N in ℝ^d, called centers, together with N points y₁, …, y_N in ℝ^D. An interpolating curve, surface, or solid to these data will be the range of a smooth function s : Ω ⊆ ℝ^d → ℝ^D with

    s(x_k) = y_k,  1 ≤ k ≤ N.   (1)

Likewise, an approximating curve, surface, or solid will make the differences s(x_k) − y_k small, for instance in the discrete L² sense, i.e.

    Σ_{k=1}^{N} ‖s(x_k) − y_k‖²₂

should be small. Curves, surfaces, and solids will only differ by their appropriate value of d = 1, 2, or 3. We use the term (geometric) objects to stand for curves, surfaces, or solids. Not...
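The interpolation condition (1) reduces to a square linear system in the RBF coefficients: with s(x) = Σ_k a_k φ(‖x − x_k‖), requiring s(x_k) = y_k gives A a = y with A_{ij} = φ(‖x_i − x_j‖). A minimal 1-D sketch, using a Gaussian kernel and a small hand-rolled solver (both illustrative choices, not the paper's):

```python
import math

def solve(A, b):
    """Gaussian elimination with partial pivoting; fine for small dense systems."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def rbf_interpolant(centres, values, phi):
    """Coefficients a_k of s(x) = sum_k a_k * phi(|x - x_k|) with s(x_k) = y_k."""
    A = [[phi(abs(xi - xj)) for xj in centres] for xi in centres]
    return solve(A, values)

gauss = lambda r: math.exp(-r * r)        # Gaussian kernel (one admissible choice)
xs = [0.0, 0.5, 1.0, 1.5, 2.0]            # pairwise distinct centres, d = 1
ys = [math.sin(x) for x in xs]            # data to interpolate
a = rbf_interpolant(xs, ys, gauss)
s = lambda x: sum(ak * gauss(abs(x - xk)) for ak, xk in zip(a, xs))
print(abs(s(0.5) - math.sin(0.5)))  # ~0: condition (1) holds at the centres
```

Between the centres s is a smooth blend of the kernels; for compactly supported kernels, as in the paper's emphasis, the matrix A becomes sparse.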
Comparative Studies Of Metamodeling Techniques Under Multiple Modeling Criteria
 Structural and Multidisciplinary Optimization
, 2000
Abstract

Cited by 53 (3 self)
Despite the advances in computer capacity, the enormous computational cost of complex engineering simulations makes it impractical to rely exclusively on simulation for the purpose of design optimization. To cut down the cost, surrogate models, also known as metamodels, are constructed from and then used in lieu of the actual simulation models. In this paper, we systematically compare four popular metamodeling techniques (Polynomial Regression, Multivariate Adaptive Regression Splines, Radial Basis Functions, and Kriging) based on multiple performance criteria, using fourteen test problems representing different classes of problems. Our objective in this study is to investigate the advantages and disadvantages of these four metamodeling techniques using multiple modeling criteria and multiple test problems rather than a single measure of merit and a single test problem.

1. Introduction
Simulation-based analysis tools are finding increased use during preliminary design to explore desi...
Scattered Data Interpolation in Three or More Variables
 Mathematical Methods in Computer Aided Geometric Design
, 1989
Abstract

Cited by 47 (0 self)
This is a survey of techniques for the interpolation of scattered data in three or more independent variables. It covers schemes that can be used for any number of variables as well as schemes specifically designed for three variables. Emphasis is on breadth rather than depth, but there are explicit illustrations of different techniques used in the solution of multivariate interpolation problems.

List of Contents
1. Introduction
2. Rendering of Trivariate Functions
3. Tensor Product Schemes
4. Point Schemes
4.1 Shepard's Methods
4.2 Radial Interpolants
4.2.1 Hardy Multiquadrics
4.2.2 Duchon Thin Plate Splines
5. Natural Neighbor Interpolation
6. k-dimensional Triangulations
7. Tetrahedral Schemes
7.1 Polynomial Schemes
7.2 Rational Schemes
8. Simplicial Schemes
8.1 Polynomial Schemes
8.2 Rational Schemes
8.3 A Transfinite Scheme
9. Multivariate Splines
10. Transfinite Hypercubal Methods
11. Derivative Generation
12. Interpolation on the sphere and other surfa...
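Of the point schemes listed above, Shepard's method is the simplest to show concretely: it interpolates by inverse-distance weighting, works in any number of variables, and needs no linear solve. The 2-D data below (f(x, y) = x + y sampled at four corners) is an illustrative assumption, not an example from the survey.

```python
def shepard(x, points, values, p=2):
    """Shepard's global inverse-distance-weighted interpolant in any dimension."""
    weights = []
    for i, q in enumerate(points):
        d2 = sum((a - b) ** 2 for a, b in zip(x, q))
        if d2 == 0.0:
            return values[i]            # exact interpolation at the data sites
        weights.append(d2 ** (-p / 2))  # weight = 1 / distance^p
    total = sum(weights)
    return sum(w * v for w, v in zip(weights, values)) / total

pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
vals = [0.0, 1.0, 1.0, 2.0]             # f(x, y) = x + y at the four corners
print(shepard((0.5, 0.5), pts, vals))   # equidistant from all corners: the mean, 1.0
print(shepard((0.0, 0.0), pts, vals))   # reproduces the data value, 0.0
```

The same weighting idea underlies the local (modified) Shepard variants the survey covers, which restrict each weight to a neighbourhood for better behaviour on large data sets.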