Results 1 - 10 of 5,355
Reconstruction and Representation of 3D Objects with Radial Basis Functions
Computer Graphics (SIGGRAPH ’01 Conf. Proc.), pages 67–76. ACM SIGGRAPH, 2001
"... We use polyharmonic Radial Basis Functions (RBFs) to reconstruct smooth, manifold surfaces from point-cloud data and to repair incomplete meshes. An object's surface is defined implicitly as the zero set of an RBF fitted to the given surface data. Fast methods for fitting and evaluating RBFs ..."
Cited by 505 (1 self)
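The construction this abstract sketches can be written down compactly: on-surface points are interpolated with value 0, points offset along the normals with values ±d, and a biharmonic RBF (phi(r) = r in 3D, plus a low-degree polynomial) is fitted through them, so the surface is recovered as the zero set. A minimal numpy sketch of that idea follows; the dense solve stands in for the paper's fast fitting and evaluation methods, and the offset distance is an arbitrary choice.

    import numpy as np

    def fit_rbf_implicit(points, normals, offset=0.01):
        # On-surface points get value 0; points pushed +/-offset along the
        # normals get values +/-offset, which keeps the fitted function from
        # collapsing to the trivial all-zero solution.
        centers = np.vstack([points,
                             points + offset * normals,
                             points - offset * normals])
        values = np.concatenate([np.zeros(len(points)),
                                 np.full(len(points), offset),
                                 np.full(len(points), -offset)])
        # Biharmonic kernel phi(r) = r in 3D, plus a degree-1 polynomial tail.
        n = len(centers)
        r = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
        P = np.hstack([np.ones((n, 1)), centers])          # [1, x, y, z]
        m = P.shape[1]
        A = np.zeros((n + m, n + m))
        A[:n, :n] = r
        A[:n, n:] = P
        A[n:, :n] = P.T
        rhs = np.concatenate([values, np.zeros(m)])
        sol = np.linalg.solve(A, rhs)                      # dense solve: O(n^3)
        return centers, sol[:n], sol[n:]

    def eval_rbf(x, centers, weights, poly):
        # s(x) = sum_i w_i * |x - c_i| + p(x); the surface is {x : s(x) = 0}.
        r = np.linalg.norm(centers - x, axis=-1)
        return r @ weights + poly[0] + poly[1:] @ x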
Poisson Surface Reconstruction
2006
"... We show that surface reconstruction from oriented points can be cast as a spatial Poisson problem. This Poisson formulation considers all the points at once, without resorting to heuristic spatial partitioning or blending, and is therefore highly resilient to data noise. Unlike radial basis function schemes, our Poisson approach allows a hierarchy of locally supported basis functions, and therefore the solution reduces to a well conditioned sparse linear system. We describe a spatially adaptive multiscale algorithm whose time and space complexities are proportional to the size ..."
Cited by 369 (5 self)
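The Poisson step the abstract describes amounts to solving laplacian(chi) = div(V) for an indicator-like function chi, where V is a vector field built from the oriented point normals; the surface is then an iso-contour of chi. A toy 2-D sketch of that sparse linear solve, assuming a regular grid and a placeholder field V (the paper instead uses an adaptive octree with locally supported basis functions):

    import numpy as np
    from scipy.sparse import diags, identity, kron
    from scipy.sparse.linalg import spsolve

    n, h = 64, 1.0 / 64
    D2 = diags([1.0, -2.0, 1.0], [-1, 0, 1], shape=(n, n)) / h**2
    L = kron(identity(n), D2) + kron(D2, identity(n))   # 5-point Laplacian

    Vx = np.random.randn(n, n)    # placeholder for the smoothed normal field
    Vy = np.random.randn(n, n)
    div = (np.gradient(Vx, h, axis=1) + np.gradient(Vy, h, axis=0)).ravel()

    chi = spsolve(L.tocsc(), div)  # indicator-like function; the
                                   # reconstructed shape is an iso-contour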
A training algorithm for optimal margin classifiers
Proceedings of the 5th Annual ACM Workshop on Computational Learning Theory, 1992
"... A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented. The technique is applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions. The effective number of parameters is adjusted ..."
Cited by 1865 (43 self)
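Margin maximization here is the dual quadratic program: minimize 0.5 a^T (yy^T ∘ K) a - sum(a) subject to a >= 0 and a · y = 0, and swapping the kernel K moves between linear, polynomial, and RBF decision functions, which is the flexibility the abstract emphasizes. A small-scale sketch using scipy's generic SLSQP solver on separable toy data (not the paper's training procedure, which scales much better):

    import numpy as np
    from scipy.optimize import minimize

    def train_max_margin(X, y, kernel=lambda A, B: A @ B.T):
        # Dual QP: min_a 0.5 a^T (yy^T * K) a - sum(a), a >= 0, a . y = 0.
        n = len(y)
        Q = kernel(X, X) * np.outer(y, y)
        res = minimize(lambda a: 0.5 * a @ Q @ a - a.sum(),
                       np.ones(n) / n,
                       bounds=[(0.0, None)] * n,
                       constraints=({'type': 'eq', 'fun': lambda a: a @ y},),
                       method='SLSQP')
        alpha = res.x
        sv = alpha > 1e-6                                # support vectors
        b = np.mean(y[sv] - (alpha * y) @ kernel(X, X[sv]))
        return alpha, b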
Training Support Vector Machines: an Application to Face Detection
1997
"... We investigate the application of Support Vector Machines (SVMs) in computer vision. SVM is a learning technique developed by V. Vapnik and his team (AT&T Bell Labs.) that can be seen as a new method for training polynomial, neural network, or Radial Basis Function classifiers. The decision surfaces ..."
Cited by 727 (1 self)
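Classification with such a machine reduces to the sign of a kernel expansion over the support vectors; in face detection this score is computed for every candidate image window. A sketch, assuming the support vectors, multipliers, and bias come from some trained SVM (e.g. the sketch above) and using the Gaussian RBF kernel as one of the kernel choices the abstract lists:

    import numpy as np

    def rbf_kernel(A, B, gamma=0.5):
        # K[i, j] = exp(-gamma * ||A_i - B_j||^2), the Gaussian RBF kernel.
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-gamma * d2)

    def classify(x_new, X_sv, y_sv, alpha_sv, b, kernel=rbf_kernel):
        # Decision surface: sign( sum_i alpha_i y_i K(x_i, x) + b ).
        # For face detection, evaluated on every candidate image window.
        scores = kernel(np.atleast_2d(x_new), X_sv) @ (alpha_sv * y_sv) + b
        return np.sign(scores)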
Convolution Kernels on Discrete Structures
1999
"... We introduce a new method of constructing kernels on sets whose elements are discrete structures like strings, trees and graphs. The method can be applied iteratively to build a kernel on an infinite set from kernels involving generators of the set. The family of kernels generated generalizes the family of radial basis kernels. It can also be used to define kernels in the form of joint Gibbs probability distributions. Kernels can be built from hidden Markov random fields, generalized regular expressions, pair-HMMs, or ANOVA decompositions. Uses of the method lead to open problems involving ..."
Cited by 506 (0 self)
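A concrete member of this kernel family on strings, in the spirit of the convolution construction (decompose structures into parts and compare the parts), is the k-spectrum kernel, which counts length-k substrings shared by two strings. A short sketch; the choice k = 3 is arbitrary:

    from collections import Counter

    def spectrum_kernel(s, t, k=3):
        # Count length-k substrings shared by s and t (k-spectrum kernel).
        cs = Counter(s[i:i + k] for i in range(len(s) - k + 1))
        ct = Counter(t[i:i + k] for i in range(len(t) - k + 1))
        return sum(cs[w] * ct[w] for w in cs.keys() & ct.keys())

    print(spectrum_kernel("convolution", "evolution"))  # shared 3-mers: 6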
The sources and consequences of embeddedness for the economic performance of organizations: The network effect
American Sociological Review, 1996
"... In this paper, I attempt to advance the concept of embeddedness beyond the level of a programmatic statement by developing a formulation that specifies how embeddedness and network structure affect economic action. On the basis of existing theory and original ethnographies of 23 apparel firms, I develop a systematic scheme that more fully demarcates the unique features, functions, and sources of embeddedness. From this scheme, I derive a set of refutable implications and test their plausibility, using another data set on the network ties of all better dress apparel firms in the New York apparel ..."
Cited by 744 (8 self)
A tutorial on support vector machines for pattern recognition
Data Mining and Knowledge Discovery, 1998
"... The tutorial starts with an overview of the concepts of VC dimension and structural risk minimization. We then describe linear Support Vector Machines (SVMs) for separable and non-separable data, working through a non-trivial example in detail. We describe a mechanical analogy, and discuss when SVM ..."
"... large (even infinite) VC dimension by computing the VC dimension for homogeneous polynomial and Gaussian radial basis function kernels. While very high VC dimension would normally bode ill for generalization performance, and while at present there exists no theory which shows that good generalization ..."
Cited by 3393 (12 self)
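The infinite-VC-dimension claim for the Gaussian kernel can be checked numerically: for distinct points and a narrow enough bandwidth the Gram matrix is nonsingular (nearly the identity), so a kernel expansion can reproduce any labeling of the points, i.e. the points are shattered. A small illustration (the points, bandwidth, and labelings below are arbitrary):

    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.standard_normal((8, 2))
    gamma = 50.0                  # narrow bumps: Gram matrix ~ identity
    K = np.exp(-gamma * ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1))

    for _ in range(5):            # any labeling can be realized exactly
        y = rng.choice([-1.0, 1.0], size=8)
        alpha = np.linalg.solve(K, y)        # f(x_i) = (K @ alpha)_i = y_i
        assert np.all(np.sign(K @ alpha) == y)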
Gradient projection for sparse reconstruction: Application to compressed sensing and other inverse problems
IEEE Journal of Selected Topics in Signal Processing, 2007
"... Many problems in signal processing and statistical inference involve finding sparse solutions to under-determined, or ill-conditioned, linear systems of equations. A standard approach consists in minimizing an objective function which includes a quadratic (squared ℓ2) error term combined with a sparseness-inducing ..."
Cited by 539 (17 self)
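The objective in question is min_x 0.5*||y - Ax||^2 + tau*||x||_1. The gradient-projection idea splits x = u - v with u, v >= 0, which makes the ℓ1 term linear over the nonnegative orthant, so each iteration is a gradient step followed by clipping at zero. A bare-bones fixed-step sketch (the paper's GPSR algorithm chooses step sizes far more carefully, e.g. with Barzilai-Borwein rules):

    import numpy as np

    def gpsr_sketch(A, y, tau, steps=200, lr=None):
        # min_x 0.5*||y - A x||^2 + tau*||x||_1  via the split x = u - v.
        m, n = A.shape
        lr = lr or 1.0 / np.linalg.norm(A, 2) ** 2   # safe fixed step
        u = np.zeros(n)
        v = np.zeros(n)
        for _ in range(steps):
            g = A.T @ (A @ (u - v) - y)              # gradient of 0.5||.||^2
            u = np.maximum(u - lr * (g + tau), 0.0)  # project onto u >= 0
            v = np.maximum(v - lr * (-g + tau), 0.0) # project onto v >= 0
        return u - v

    # toy compressed-sensing instance (sizes and sparsity are arbitrary)
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 100))
    x_true = np.zeros(100)
    x_true[[5, 37, 80]] = [1.0, -2.0, 0.5]
    x_hat = gpsr_sketch(A, A @ x_true, tau=0.1)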
Benchmarking Least Squares Support Vector Machine Classifiers
Neural Processing Letters, 2001
"... In Support Vector Machines (SVMs), the solution of the classification problem is characterized by a (convex) quadratic programming (QP) problem. In a modified version of SVMs, called Least Squares SVM classifiers (LS-SVMs), a least squares cost function is proposed so as to obtain a linear set of equations ..."
"... stage by gradually pruning the support value spectrum and optimizing the hyperparameters during the sparse approximation procedure. In this paper, twenty public domain benchmark datasets are used to evaluate the test set performance of LS-SVM classifiers with linear, polynomial and radial basis function ..."
Cited by 476 (46 self)
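The "linear set of equations" that replaces the QP can be written out directly: with kernel Gram matrix K, targets y, and regularization parameter gamma, one common LS-SVM formulation solves a single (n+1) x (n+1) system for the bias and the support values. A numpy sketch, assuming K is precomputed and gamma is a hyperparameter to be tuned:

    import numpy as np

    def lssvm_train(K, y, gamma=10.0):
        # Solve  [ 0    1^T         ] [b]     = [0]
        #        [ 1    K + I/gamma ] [alpha]   [y]
        # One linear solve replaces the SVM's quadratic program.
        n = len(y)
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = 1.0
        A[1:, 0] = 1.0
        A[1:, 1:] = K + np.eye(n) / gamma
        rhs = np.concatenate([[0.0], y])
        sol = np.linalg.solve(A, rhs)
        return sol[0], sol[1:]        # bias b, support values alpha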
Verbs and Adverbs: Multidimensional Motion Interpolation Using Radial Basis Functions
IEEE Computer Graphics and Applications, 1998
"... This paper describes methods and data structures used to leverage motion sequences of complex linked figures. We present a technique for interpolating between example motions derived from live motion capture or produced through traditional animation tools. These motions can be characterized by emotional ..."
"... them, allowing an animated figure to exhibit a substantial repertoire of expressive behaviors. A combination of radial basis functions and low order polynomials is used to create the interpolation space between example motions. Inverse kinematic constraints are used to augment the interpolations ..."
Cited by 351 (5 self)
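The interpolation machinery named here, radial basis functions plus a low-order polynomial over a parameter ("adverb") space, maps directly onto scipy's RBFInterpolator, where degree=1 appends the linear polynomial term. A sketch with a made-up 2-D adverb space and placeholder example "motions" (real verbs would be full joint-angle trajectories keyed to this space):

    import numpy as np
    from scipy.interpolate import RBFInterpolator

    # Four example motions placed at the corners of a 2-D adverb space,
    # e.g. happiness x energy; poses here are just 2-vectors for brevity.
    adverbs = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
    example_poses = np.array([[0.1, 0.4], [0.9, 0.3], [0.2, 1.2], [1.0, 1.1]])

    # Thin-plate-spline RBFs with degree=1 add the linear polynomial term.
    verb = RBFInterpolator(adverbs, example_poses,
                           kernel='thin_plate_spline', degree=1)
    print(verb(np.array([[0.5, 0.5]])))  # blended pose at a new adverb point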