Results 1-10 of 1,413
A Fast and Elitist Multiobjective Genetic Algorithm: NSGA-II
, 2000
"... Multiobjective evolutionary algorithms which use nondominated sorting and sharing have been mainly criticized for their (i) O(MN³) computational complexity (where M is the number of objectives and N is the population size), (ii) non-elitism approach, and (iii) the need for specifying a sharing param ..."
Cited by 1815 (60 self)
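The complexity bottleneck cited above is what NSGA-II's fast non-dominated sort addresses, bringing sorting down to O(MN²). A minimal Python sketch of that sort (minimization assumed; the names are illustrative, not the paper's code):

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly
    better in at least one (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def fast_nondominated_sort(points):
    """Return fronts as lists of indices; front 0 is the Pareto front.
    Each pair is compared once, giving O(M N^2) for N points, M objectives."""
    n = len(points)
    dominated_by_me = [[] for _ in range(n)]  # S_i: solutions i dominates
    dominating_me = [0] * n                   # n_i: count of dominators of i
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if dominates(points[i], points[j]):
                dominated_by_me[i].append(j)
            elif dominates(points[j], points[i]):
                dominating_me[i] += 1
        if dominating_me[i] == 0:
            fronts[0].append(i)
    k = 0
    while fronts[k]:
        nxt = []
        for i in fronts[k]:
            for j in dominated_by_me[i]:
                dominating_me[j] -= 1
                if dominating_me[j] == 0:   # all dominators already ranked
                    nxt.append(j)
        k += 1
        fronts.append(nxt)
    fronts.pop()  # drop the trailing empty front
    return fronts
```

Each front is peeled off in turn: members of front k+1 are exactly the points whose last remaining dominators sat in front k.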
A review of algebraic multigrid
, 2001
"... Since the early 1990s, there has been a strongly increasing demand for more efficient methods to solve large sparse, unstructured linear systems of equations. For practically relevant problem sizes, classical one-level methods had already reached their limits and new hierarchical algorithms had to b ..."
Cited by 347 (11 self)
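The limit of one-level methods is easy to demonstrate: a pointwise smoother such as damped Jacobi wipes out oscillatory error quickly but barely touches smooth error, which is exactly what coarse levels are for. A pure-Python sketch on the 1D model problem -u'' = 0 (function name and parameters are illustrative):

```python
import math

def jacobi_error_decay(k, n=63, sweeps=10, w=2.0 / 3.0):
    """Relative error norm left after `sweeps` of damped Jacobi applied to
    -u'' = 0 (zero boundary values) when the initial error is the Fourier
    mode sin(k*pi*x) on n interior points. Oscillatory modes (large k) are
    damped fast; smooth modes (small k) barely decay -- the observation
    that makes coarse-grid hierarchies pay off."""
    e0 = [math.sin(k * math.pi * (i + 1) / (n + 1)) for i in range(n)]
    e = e0[:]
    for _ in range(sweeps):
        # simultaneous (Jacobi) update: e <- (1-w) e + w * neighbor average
        e = [(1 - w) * e[i]
             + w * ((e[i - 1] if i > 0 else 0.0)
                    + (e[i + 1] if i < n - 1 else 0.0)) / 2.0
             for i in range(n)]
    norm = lambda v: math.sqrt(sum(x * x for x in v))
    return norm(e) / norm(e0)
```

Ten sweeps shrink a high-frequency mode by orders of magnitude while a smooth mode survives almost untouched; multigrid hands that smooth remainder to a coarser grid, where it looks oscillatory again.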
Hierarchical Bayesian Inference in the Visual Cortex
, 2002
"... this paper, we propose a Bayesian theory of hierarchical cortical computation based both on (a) the mathematical and computational ideas of computer vision and pattern theory and on (b) recent neurophysiological experimental evidence. We have proposed that Grenander's pattern theory coul ..."
Cited by 300 (2 self)
disambiguating low-level representations? Rao and Ballard's predictive coding/Kalman filter model did integrate generative feedback in the perceptual inference process, but it was primarily a linear model and thus severely limited in practical utility. The data-driven Markov Chain Monte Carlo approach
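For reference, the linear predict-then-correct loop behind the Kalman-filter model mentioned in the snippet can be sketched in scalar form (the noise variances q and r here are illustrative, not values from any of the cited papers):

```python
def kalman_1d(measurements, q=1e-3, r=0.25, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a random-walk state: each step predicts,
    then corrects the prediction by a gain-weighted prediction error --
    the linear special case of the predictive-coding story above.
    q: process noise variance, r: measurement noise variance."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                  # predict: uncertainty grows
        k = p / (p + r)            # Kalman gain
        x = x + k * (z - x)        # correct with the prediction error
        p = (1 - k) * p            # uncertainty shrinks after the update
        estimates.append(x)
    return estimates
```

Fed a constant measurement, the estimate climbs toward it while the gain decays toward its steady-state value.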
Restoration of a Single Superresolution Image from Several Blurred, Noisy, and Undersampled Measured Images
, 1997
"... The three main tools in the single image restoration theory are the maximum likelihood (ML) estimator, the maximum a posteriori probability (MAP) estimator, and the set theoretic approach using projection onto convex sets (POCS). This paper utilizes the above known tools to propose a unified methodo ..."
Cited by 267 (22 self)
and analyzed from the ML, the MAP, and POCS points of view, yielding a generalization of the known superresolution restoration methods. The proposed restoration approach is general but assumes explicit knowledge of the linear space- and time-variant blur, the (additive Gaussian) noise, the different measured
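Of the three tools named, POCS is the easiest to illustrate: cyclically project onto each convex constraint set, and the iterates settle into the intersection when it is nonempty. A toy sketch in the plane (the two sets here are illustrative stand-ins, not the paper's super-resolution constraints):

```python
import math

def pocs(x, projections, iters=50):
    """Projection onto convex sets: cycle through the projection operators;
    the iterate converges to a point in the intersection (if nonempty)."""
    for _ in range(iters):
        for proj in projections:
            x = proj(x)
    return x

def proj_disk(p, r=2.0):
    """Project a 2D point onto the disk of radius r about the origin."""
    n = math.hypot(p[0], p[1])
    return p if n <= r else (p[0] * r / n, p[1] * r / n)

def proj_line(p):
    """Project a 2D point onto the line y = 1."""
    return (p[0], 1.0)
```

Starting from (5, -3), the iterates land at a point that satisfies both constraints at once.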
A Pyramid Approach to Subpixel Registration Based on Intensity
, 1998
"... We present an automatic subpixel registration algorithm that minimizes the mean square intensity difference between a reference and a test data set, which can be either images (2D) or volumes (3D). It uses an explicit spline representation of the images in conjunction with spline processing, and ..."
Cited by 237 (18 self)
, and is based on a coarse-to-fine iterative strategy (pyramid approach). The minimization is performed according to a new variation (ML*) of the Marquardt-Levenberg algorithm for nonlinear least-squares optimization. The geometric deformation model is a global 3D affine transformation that can be optionally
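The coarse-to-fine strategy can be shown with a 1D toy: estimate an integer shift on a heavily downsampled signal, then double and refine the estimate level by level. This sketch replaces the paper's spline model and Marquardt-Levenberg step with brute-force SSD search over integer shifts (all names illustrative):

```python
import math

def shift_ssd(ref, tst, s):
    """SSD between tst and ref shifted right by s (tst[i] vs ref[i-s])."""
    n = len(ref)
    return sum((tst[i] - ref[i - s]) ** 2 for i in range(n) if 0 <= i - s < n)

def downsample(x):
    """Halve resolution by averaging sample pairs."""
    return [(x[i] + x[i + 1]) / 2.0 for i in range(0, len(x) - 1, 2)]

def coarse_to_fine_shift(ref, tst, levels=3):
    """Pyramid estimate of the integer shift s with tst[i] ~ ref[i-s]:
    exhaustive SSD search on the coarsest level, then at each finer level
    double the estimate and re-search only a +/-1 neighborhood."""
    pr, pt = [ref], [tst]
    for _ in range(levels - 1):
        pr.append(downsample(pr[-1]))
        pt.append(downsample(pt[-1]))
    n = len(pr[-1])
    s = min(range(-n // 2, n // 2 + 1),
            key=lambda k: shift_ssd(pr[-1], pt[-1], k))
    for lvl in range(levels - 2, -1, -1):
        s = min((2 * s - 1, 2 * s, 2 * s + 1),
                key=lambda k: shift_ssd(pr[lvl], pt[lvl], k))
    return s
```

The expensive full search happens only on the tiny coarsest level; every finer level touches just three candidates, which is the economic point of pyramid registration.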
New Edge-Directed Interpolation
 IEEE Transactions on Image Processing
, 2001
"... This paper proposes an edge-directed interpolation algorithm for natural images. The basic idea is to first estimate local covariance coefficients from a low-resolution image and then use these covariance estimates to adapt the interpolation at a higher resolution based on the geometric duality betw ..."
Cited by 244 (2 self)
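A drastically simplified version of the edge-directed idea: when filling a new diagonal sample in a 2x upscale, average the diagonal pair that varies less, i.e. interpolate along the likely edge direction rather than across it. This stand-in uses a hard two-way choice where the paper derives weights from local covariance estimates:

```python
def upscale2x_edge_directed(img):
    """Minimal edge-adaptive 2x upscaling of a grayscale image (list of
    rows). New diagonal samples average the smoother diagonal pair; the
    remaining new samples average their already-filled 4-neighbors."""
    h, w = len(img), len(img[0])
    out = [[0.0] * (2 * w - 1) for _ in range(2 * h - 1)]
    for y in range(h):                        # keep original samples
        for x in range(w):
            out[2 * y][2 * x] = img[y][x]
    for y in range(h - 1):                    # new (odd, odd) samples
        for x in range(w - 1):
            nw, ne = img[y][x], img[y][x + 1]
            sw, se = img[y + 1][x], img[y + 1][x + 1]
            if abs(nw - se) < abs(ne - sw):   # NW-SE diagonal is smoother
                out[2 * y + 1][2 * x + 1] = (nw + se) / 2.0
            else:
                out[2 * y + 1][2 * x + 1] = (ne + sw) / 2.0
    for y in range(2 * h - 1):                # remaining new samples
        for x in range(2 * w - 1):
            if (y + x) % 2 == 1:
                nbrs = []
                if y > 0: nbrs.append(out[y - 1][x])
                if y < 2 * h - 2: nbrs.append(out[y + 1][x])
                if x > 0: nbrs.append(out[y][x - 1])
                if x < 2 * w - 2: nbrs.append(out[y][x + 1])
                out[y][x] = sum(nbrs) / len(nbrs)
    return out
```

On a hard diagonal step edge this keeps the edge sharp where plain bilinear interpolation would blur it to an intermediate value.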
A Space-Sweep Approach to True Multi-Image Matching
, 1995
"... The problem of determining feature correspondences across multiple views is considered. The term "true multi-image" matching is introduced to describe techniques that make full and efficient use of the geometric relationships between multiple images and the scene. A true multi-image techn ..."
Cited by 221 (3 self)
image technique must generalize to any number of images, be of linear algorithmic complexity in the number of images, and use all the images in an equal manner. A new space-sweep approach to true multi-image matching is presented that simultaneously determines 2D feature correspondences and the 3D positions
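The simultaneous correspondence-and-position idea can be sketched in a flat 2D world: sweep a plane (here a line z = const) through space and declare a feature wherever rays from both cameras intersect it at the same spot. The camera model and all names are illustrative, not the paper's formulation:

```python
def space_sweep(rays_cam1, rays_cam2, baseline, zs, tol=0.05):
    """Toy plane sweep in a 2D world (x, z). Each image feature u defines a
    viewing ray: x = u * z for camera 1 at the origin, x = baseline + u * z
    for camera 2. For each swept depth z, rays are intersected with the
    plane; agreeing intersections yield a matched feature with its position,
    so matching and reconstruction happen in one pass."""
    hits = []
    for z in zs:
        pts1 = [u * z for u in rays_cam1]
        pts2 = [baseline + u * z for u in rays_cam2]
        for x1 in pts1:
            for x2 in pts2:
                if abs(x1 - x2) < tol:
                    hits.append((round((x1 + x2) / 2, 3), z))
    return hits
```

With more cameras the same loop just accumulates votes per plane cell, which is what keeps the method linear in the number of images.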
Towards Exact Geometric Computation
, 1994
"... Exact computation is assumed in most algorithms in computational geometry. In practice, implementors perform computation in some fixed-precision model, usually the machine floating-point arithmetic. Such implementations have many well-known problems, here informally called "robustness issues" ..."
Cited by 96 (6 self)
To reconcile theory and practice, authors have suggested that theoretical algorithms ought to be redesigned to become robust under fixed-precision arithmetic. We suggest that in many cases, implementors should make robustness a non-issue by computing exactly. The advantages of exact computation are too many
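The exact-computation route is cheap to demonstrate in Python, whose standard fractions module provides exact rational arithmetic. A geometric predicate such as orientation then becomes unconditionally correct, whereas fixed-precision evaluation can misclassify near-degenerate inputs (coordinates are passed as decimal strings so they stay exact):

```python
from fractions import Fraction

def orient(ax, ay, bx, by, cx, cy):
    """Sign of the cross product (b - a) x (c - a): +1 for a left turn,
    -1 for a right turn, 0 for collinear points."""
    d = (bx - ax) * (cy - ay) - (by - ay) * (cx - ax)
    return (d > 0) - (d < 0)

def orient_exact(a, b, c):
    """Evaluate the orientation predicate in exact rational arithmetic;
    points are pairs of decimal strings, e.g. ("0.1", "0.2")."""
    return orient(*(Fraction(v) for v in (*a, *b, *c)))
```

The three points below lie exactly on the line y = 2x, and the exact predicate reports that; the same decimals rounded to binary floats need not even be collinear.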
Support vector machines: Training and applications
 A.I. Memo 1602, MIT A.I. Lab
, 1997
"... The Support Vector Machine (SVM) is a new and very promising classification technique developed by Vapnik and his group at AT&T Bell Laboratories [3, 6, 8, 24]. This new learning algorithm can be seen as an alternative training technique for Polynomial, Radial Basis Function and Multi-Layer Perc ..."
Cited by 223 (3 self)
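As a rough illustration of large-margin training, and not the quadratic-programming formulation the memo develops, a linear SVM can be fitted by stochastic subgradient descent on the regularized hinge loss (Pegasos-style). No bias term, so the separating hyperplane passes through the origin; all names are illustrative:

```python
import random

def train_linear_svm(X, y, lam=0.01, epochs=300):
    """Minimal linear SVM: stochastic subgradient descent on the
    lam/2*||w||^2 + hinge-loss objective. X is a list of feature tuples,
    y holds labels in {-1, +1}."""
    random.seed(0)
    w = [0.0] * len(X[0])
    t = 0
    for _ in range(epochs):
        for i in random.sample(range(len(X)), len(X)):
            t += 1
            eta = 1.0 / (lam * t)                      # decaying step size
            margin = y[i] * sum(wj * xj for wj, xj in zip(w, X[i]))
            w = [(1.0 - eta * lam) * wj for wj in w]   # shrink (regularizer)
            if margin < 1.0:                           # hinge subgradient
                w = [wj + eta * y[i] * xj for wj, xj in zip(w, X[i])]
    return w

def predict(w, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) >= 0 else -1
```

On separable data the weight vector settles near the max-margin direction; the kernelized, constrained-QP training of the memo handles the polynomial and RBF cases this sketch does not.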
Combining geometry and combinatorics: a unified approach to sparse signal recovery
, 2008
"... There are two main algorithmic approaches to sparse signal recovery: geometric and combinatorial. The geometric approach starts with a geometric constraint on the measurement matrix Φ and then uses linear programming to decode information about x from Φx. The combinatorial approach constructs Φ an ..."
Cited by 157 (14 self)
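The combinatorial/greedy side can be illustrated with orthogonal matching pursuit, one representative greedy decoder (not necessarily the construction in this paper): pick the column of Φ most correlated with the residual, refit on the chosen support by least squares, repeat. A self-contained sketch with a tiny normal-equations solver:

```python
def solve(M, v):
    """Gaussian elimination with partial pivoting (tiny systems only)."""
    n = len(M)
    A = [row[:] + [v[i]] for i, row in enumerate(M)]
    for p in range(n):
        piv = max(range(p, n), key=lambda r: abs(A[r][p]))
        A[p], A[piv] = A[piv], A[p]
        for r in range(p + 1, n):
            f = A[r][p] / A[p][p]
            for c in range(p, n + 1):
                A[r][c] -= f * A[p][c]
    x = [0.0] * n
    for p in range(n - 1, -1, -1):
        x[p] = (A[p][n] - sum(A[p][q] * x[q] for q in range(p + 1, n))) / A[p][p]
    return x

def omp(Phi, y, k):
    """Orthogonal matching pursuit: greedily grow a support of size k,
    re-fitting coefficients on the support via the normal equations."""
    m, n = len(Phi), len(Phi[0])
    col = lambda j: [Phi[i][j] for i in range(m)]
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    support, x, r = [], [0.0] * n, y[:]
    for _ in range(k):
        j = max((jj for jj in range(n) if jj not in support),
                key=lambda jj: abs(dot(col(jj), r)))  # best-correlated column
        support.append(j)
        A = [col(jj) for jj in support]               # chosen columns
        G = [[dot(a, b) for b in A] for a in A]       # Gram matrix
        c = solve(G, [dot(a, y) for a in A])          # least squares on support
        x = [0.0] * n
        for jj, cj in zip(support, c):
            x[jj] = cj
        r = [y[i] - sum(Phi[i][jj] * x[jj] for jj in support) for i in range(m)]
    return x
```

For a sufficiently incoherent Φ this recovers a sparse x exactly from y = Φx; the geometric approach would instead decode the same y by ℓ1-minimizing linear programming.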