Results 1–10 of 335
A Survey of Shape Analysis Techniques
Pattern Recognition, 1998
Abstract

Cited by 200 (2 self)
This paper provides a review of shape analysis methods. Shape analysis methods play an important role in systems for object recognition, matching, registration, and analysis. Research in shape analysis has been motivated, in part, by studies of human visual form perception systems.
Computational Models of Sensorimotor Integration
Science, 1997
Abstract

Cited by 178 (10 self)
The sensorimotor integration system can be viewed as an observer attempting to estimate its own state and the state of the environment by integrating multiple sources of information. We describe a computational framework capturing this notion, and some specific models of integration and adaptation that result from it. Psychophysical results from two sensorimotor systems, subserving the integration and adaptation of visuo-auditory maps, and estimation of the state of the hand during arm movements, are presented and analyzed within this framework. These results suggest that: (1) Spatial information from visual and auditory systems is integrated so as to reduce the variance in localization. (2) The effects of a remapping in the relation between visual and auditory space can be predicted from a simple learning rule. (3) The temporal propagation of errors in estimating the hand's state is captured by a linear dynamic observer, providing evidence for the existence of an intern...
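Point (1) corresponds to the standard minimum-variance combination rule, in which each cue is weighted in inverse proportion to its variance. A minimal sketch of that rule (an illustration of the principle, not the authors' specific model):

```python
import numpy as np

def integrate_cues(estimates, variances):
    """Fuse independent noisy estimates by inverse-variance weighting.

    The fused variance, 1 / sum(1/var_i), is never larger than the
    smallest single-cue variance."""
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = (1.0 / variances) / np.sum(1.0 / variances)
    fused = float(np.sum(weights * estimates))
    fused_var = float(1.0 / np.sum(1.0 / variances))
    return fused, fused_var

# A reliable visual estimate and a noisier auditory estimate of location
loc, var = integrate_cues([10.0, 14.0], [1.0, 4.0])
```

The fused estimate is pulled toward the more reliable cue (10.8 here), and its variance (0.8) falls below that of either cue alone, which is the variance-reduction signature described above.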
Nonlinear Neurons in the Low-Noise Limit: A Factorial Code Maximizes Information Transfer
, 1994
Abstract

Cited by 141 (18 self)
We investigate the consequences of maximizing information transfer in a simple neural network (one input layer, one output layer), focusing on the case of nonlinear transfer functions. We assume that both receptive fields (synaptic efficacies) and transfer functions can be adapted to the environment. The main result is that, for bounded and invertible transfer functions, in the case of a vanishing additive output noise, and no input noise, maximization of information (Linsker's infomax principle) leads to a factorial code, hence to the same solution as required by the redundancy reduction principle of Barlow. We show also that this result is valid for linear, more generally unbounded, transfer functions, provided optimization is performed under an additive constraint, that is, one which can be written as a sum of terms, each one being specific to one output neuron. Finally we study the effect of a non-zero input noise. We find that, at first order in the input noise, assumed to be small ...
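The bounded-transfer-function result can be illustrated for a single noiseless neuron: output entropy (and hence information transfer) is maximized when the transfer function equals the cumulative distribution of the input, making the output uniform. A sketch with an arbitrary skewed input distribution (the paper's multi-neuron, factorial-code derivation is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.gamma(2.0, 1.0, size=100_000)   # arbitrary skewed input distribution

# For one noiseless neuron with a bounded, invertible transfer function,
# infomax selects the transfer function equal to the input's cumulative
# distribution: the output is then uniform on [0, 1], i.e. maximum entropy.
xs = np.sort(x)

def transfer(v):
    # empirical CDF used as the adapted transfer function
    return np.searchsorted(xs, v) / len(xs)

y = transfer(x)
# Every decile of the output should hold ~10% of the responses
hist, _ = np.histogram(y, bins=10, range=(0.0, 1.0))
```

This is the single-unit analogue of histogram equalization; the factorial-code result extends the idea to populations by additionally requiring statistically independent outputs.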
Natural Signal Statistics and Sensory Gain Control
Nature Neuroscience, 2001
Abstract

Cited by 130 (24 self)
The statistical properties of natural images suggest an optimal form of nonlinear decomposition, in which the image is decomposed using a set of linear filters at a variety of positions, scales and orientations, and these linear responses are then rectified and divided by a weighted sum of rectified responses of nearby filters. Such divisive normalization models have become widely used in modeling steady-state responses of neurons in primary visual cortex. In addition to providing a surprisingly good characterization of "typical" neurons, the statistically optimal version of the model is consistent with unusual changes in tuning properties of these neurons at different contrast levels. These results suggest that the nonlinear response properties of cortical neurons are not an accident of biophysical implementation, but serve an important functional role.
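The rectify-then-divide computation described above can be sketched as follows; the pooling weights, half-squaring exponent, and semi-saturation constant are illustrative choices, not fitted values from the paper:

```python
import numpy as np

def divisive_normalization(linear_responses, weights, sigma=0.1):
    """Rectify linear filter responses (half-squaring), then divide each
    by a weighted sum of the rectified responses of nearby filters.
    `weights`, the exponent, and the semi-saturation constant `sigma`
    are illustrative, not values from the paper."""
    r = np.maximum(linear_responses, 0.0) ** 2   # rectification
    pool = weights @ r                           # weighted pooling
    return r / (sigma ** 2 + pool)               # divisive stage

# Three filters with uniform pooling weights
resp = divisive_normalization(np.array([1.0, 2.0, 0.5]),
                              weights=np.full((3, 3), 1.0 / 3.0))
```

The division preserves the rank ordering of the responses while compressing the largest ones, which is the contrast-dependent saturation behavior the model is used to capture.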
A Probabilistic Approach to Fast Pattern Matching in Time Series Databases
Proceedings of the 3rd International Conference on Knowledge Discovery and Data Mining, 1997
Abstract

Cited by 102 (15 self)
The problem of efficiently and accurately locating patterns of interest in massive time series data sets is an important and nontrivial problem in a wide variety of applications, including diagnosis and monitoring of complex systems, biomedical data analysis, and exploratory data analysis in scientific and business time series. In this paper a probabilistic approach is taken to this problem. Using piecewise linear segmentations as the underlying representation, local features (such as peaks, troughs, and plateaus) are defined using a prior distribution on expected deformations from a basic template. Global shape information is represented using another prior on the relative locations of the individual features. An appropriately defined probabilistic model integrates the local and global information and directly leads to an overall distance measure between sequence patterns based on prior knowledge. A search algorithm using this distance measure is shown to efficiently and accurately f...
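One minimal way to realize such a prior-based distance is to place independent Gaussian deformation priors on the template's local features and score a candidate by the negative log-likelihood; the diagonal-Gaussian choice and all names below are illustrative, not the paper's model:

```python
import numpy as np

def template_distance(observed, template_mean, template_var):
    """Negative log of independent Gaussian deformation priors: a distance
    between an observed feature vector (e.g. peak heights and positions
    from a piecewise-linear segmentation) and a template. The diagonal
    Gaussian and all names here are illustrative assumptions."""
    observed = np.asarray(observed, dtype=float)
    mu = np.asarray(template_mean, dtype=float)
    var = np.asarray(template_var, dtype=float)
    return float(0.5 * np.sum((observed - mu) ** 2 / var
                              + np.log(2.0 * np.pi * var)))

# A candidate near the template mean scores a smaller distance
d_close = template_distance([1.1, 4.9], [1.0, 5.0], [0.5, 0.5])
d_far = template_distance([3.0, 2.0], [1.0, 5.0], [0.5, 0.5])
```

Because the distance is a negative log-probability, prior knowledge about expected deformations enters directly as the variances, exactly the role the abstract assigns to the local and global priors.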
Statistics of Cone Responses to Natural Images: Implications for Visual Coding
Journal of the Optical Society of America A, 1998
Abstract

Cited by 97 (2 self)
...ted in the first stage of retinal processing, the photoreceptor layer. In this work we measure the spectral distributions of light present in natural images by using a hyperspectral camera [12–15], which provides a complete spectrum at each pixel. We derive human cone responses at each spatial location from the spectra, and from these we gather cone response statistics for analysis. This approach is related to that of Webster and Mollon, with the key difference that whereas they contrast the differences between various images, we study the ensemble statistics as averaged over images. Our results are qualitatively similar to those of Buchsbaum and Gottschalk, who sought to understand theoretically, by using model stimuli, how the visual system might decorrelate natural cone signals through an orthogonal linear transformation. They found that under certain conditions this can be achieved through a transformation to a luminance-like channel and a pair of blue–yellow and red–gre...
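The decorrelation result attributed to Buchsbaum and Gottschalk can be sketched with toy cone signals: when the three channels share a strong common intensity component, the dominant eigenvector of their covariance has same-sign weights (a luminance-like channel) while the remaining components mix signs (opponent-like channels). The simulated statistics below are illustrative, not the measured ones:

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy correlated "cone" signals: a shared intensity term plus small
# independent parts (an assumption, not measured cone statistics).
intensity = rng.normal(size=10_000)
cones = np.stack([intensity + 0.3 * rng.normal(size=10_000)
                  for _ in range(3)], axis=1)

cov = np.cov(cones, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)   # ascending eigenvalues
principal = eigvecs[:, -1]               # dominant decorrelated channel

# Same-sign weights on all three cones -> luminance-like channel
same_sign = bool(np.all(principal > 0) or np.all(principal < 0))
```

The dominance of the first eigenvalue reflects the strong cone-signal correlations that make an opponent recoding efficient in the first place.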
Parts of Visual Form: Computational Aspects
IEEE Transactions on Pattern Analysis and Machine Intelligence, 1995
Abstract

Cited by 79 (6 self)
Underlying recognition is an organization of objects and their parts into classes and hierarchies. A representation of parts for recognition requires that they be invariant to rigid transformations, robust in the presence of occlusions, stable with changes in viewing geometry, and be arranged in a hierarchy. These constraints are captured in a general framework using notions of a part-line and a partitioning scheme. A proposed general principle of "form from function" motivates a particular partitioning scheme involving two types of parts, neck-based and limb-based, whose psychophysical relevance was demonstrated in [39]. Neck-based parts arise from narrowings in shape, or the local minima in distance between two points on the boundary, while limb-based parts arise from a pair of negative curvature minima which have "co-circular" tangents. In this paper, we present computational support for the limb-based and neck-based parts by showing that they are invariant, robust, stable and yield...
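The notion of a neck as a local minimum of boundary-to-boundary distance can be illustrated with a brute-force search for the closest pair of boundary points that are well separated along the contour; the sample outline and all parameters below are illustrative, not the paper's algorithm:

```python
import numpy as np

def narrowest_neck(boundary, min_separation=3):
    """Brute-force search for the closest pair of boundary points that are
    far apart along the contour -- a crude stand-in for neck-based parts
    as minima of boundary-to-boundary distance (names and the separation
    threshold are illustrative)."""
    n = len(boundary)
    best_dist, best_pair = np.inf, None
    for i in range(n):
        for j in range(i + 1, n):
            along = min(j - i, n - (j - i))   # separation along the contour
            if along < min_separation:
                continue
            d = float(np.hypot(*(boundary[i] - boundary[j])))
            if d < best_dist:
                best_dist, best_pair = d, (i, j)
    return best_dist, best_pair

# Dumbbell-like outline: two wide lobes joined by a narrow waist
pts = np.array([[0, 0], [2, 0], [3, 0.9], [4, 0], [6, 0],
                [6, 2], [4, 2], [3, 1.1], [2, 2], [0, 2]], dtype=float)
dist, pair = narrowest_neck(pts)   # finds the waist (indices 2 and 7)
```

The contour-separation check is what distinguishes a genuine narrowing from a pair of points that are merely adjacent along the boundary.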
Probabilistic Independent Component Analysis
2003
Abstract

Cited by 74 (12 self)
Independent Component Analysis is becoming a popular exploratory method for analysing complex data such as that from FMRI experiments. The application of such 'model-free' methods, however, has been somewhat restricted both by the view that results can be uninterpretable and by the lack of ability to quantify statistical significance. We present an integrated approach to Probabilistic ICA for FMRI data that allows for non-square mixing in the presence of Gaussian noise. We employ an objective estimation of the amount of Gaussian noise through Bayesian analysis of the true dimensionality of the data, i.e. the number of activation and non-Gaussian noise sources. Reduction of the data to this 'true' subspace before the ICA decomposition automatically results in an estimate of the noise, leading to the ability to assign significance to voxels in ICA spatial maps. Estimation of the number of intrinsic sources not only enables us to carry out probabilistic modelling, but also achieves an asymptotically unique decomposition of the data. This reduces problems of interpretation, as each final independent component is now much more likely to be due to only one physical or physiological process. We also describe other improvements to standard ICA, such as temporal prewhitening and variance normalisation of time-series, the latter being particularly useful in the context of dimensionality reduction when weak activation is present. We discuss the use of prior information about the spatiotemporal nature of the source processes, and an alternative-hypothesis testing approach for inference, using Gaussian mixture models. The performance of our approach is illustrated and evaluated on real and complex artificial FMRI data, and compared to the spatiotemporal accuracy of results obtaine...
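The noise-estimation step can be sketched with a toy eigenspectrum argument: the trailing eigenvalues of the data covariance sit at the isotropic noise level, so their mean estimates the Gaussian noise variance. This sketch assumes the source dimension is known, whereas the paper infers it with a Bayesian estimator; all sizes and names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
n_t, n_v = 2000, 6                     # toy timepoints x voxels
sources = rng.laplace(size=(n_t, 2))   # two non-Gaussian sources
mixing = rng.normal(size=(2, n_v))
noise_sd = 0.5
data = sources @ mixing + noise_sd * rng.normal(size=(n_t, n_v))

# Eigenvalues above the flat tail mark the signal subspace; the tail's
# mean estimates the Gaussian noise variance (source dimension of 2 is
# assumed known here rather than inferred).
eigvals = np.linalg.eigvalsh(np.cov(data, rowvar=False))[::-1]  # descending
noise_est = eigvals[2:].mean()
```

Projecting the data onto the leading eigenvectors before ICA then removes the estimated noise subspace, which is what makes the subsequent voxel-level significance assignment possible.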
Nonlocal image and movie denoising
International Journal of Computer Vision, 2008
Abstract

Cited by 71 (1 self)
Neighborhood filters are nonlocal image and movie filters which reduce the noise by averaging similar pixels. The first object of the paper is to present a unified theory of these filters and reliable criteria to compare them to other filter classes. A CCD noise model will be presented justifying the involvement of neighborhood filters. A classification of neighborhood filters will be proposed, including classical image and movie denoising methods and discussing further a recently introduced neighborhood filter, NL-means. In order to compare denoising methods, three principles will be discussed. The first principle, “method noise”, specifies that only noise must be removed from an image. A second principle will be introduced, “noise to noise”, according to which a denoising method must transform a white noise into a white noise. Contrary to “method noise”, this principle, which characterizes artifact-free methods, eliminates any subjectivity and can be checked by mathematical arguments and Fourier analysis. “Noise to noise” will be proven to rule out most denoising methods, with the exception of neighborhood filters. This is why a third and new comparison principle, the “statistical optimality”, is needed and will be ...
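The core averaging idea behind NL-means, reduced to one dimension for brevity, can be sketched as follows; the patch size and filtering parameter are illustrative choices, not the paper's settings:

```python
import numpy as np

def nl_means_1d(signal, patch=3, h=0.5):
    """1-D sketch of the NL-means idea: each sample is replaced by an
    average over all samples, weighted by the similarity of their
    surrounding patches (parameter names and values are illustrative)."""
    n = len(signal)
    pad = np.pad(signal, patch, mode="edge")
    patches = np.stack([pad[i:i + 2 * patch + 1] for i in range(n)])
    out = np.empty(n)
    for i in range(n):
        d2 = np.mean((patches - patches[i]) ** 2, axis=1)  # patch distances
        w = np.exp(-d2 / h ** 2)                           # similarity weights
        out[i] = np.sum(w * signal) / np.sum(w)
    return out

rng = np.random.default_rng(3)
clean = np.where(np.sin(np.linspace(0.0, 4 * np.pi, 200)) >= 0, 1.0, -1.0)
noisy = clean + 0.3 * rng.normal(size=200)
denoised = nl_means_1d(noisy)
```

Because the weights depend on whole-patch similarity rather than spatial proximity, samples on the same side of a jump are averaged together while samples across the jump are effectively excluded, which is what lets the filter reduce noise without blurring edges.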
Shapes, Shocks, and Deformations I: The Components of Two-Dimensional Shape and the Reaction-Diffusion Space
International Journal of Computer Vision, 1994
Abstract

Cited by 64 (5 self)
We undertake to develop a general theory of two-dimensional shape by elucidating several principles which any such theory should meet. The principles are organized around two basic intuitions: first, if a boundary were changed only slightly, then, in general, its shape would change only slightly. This leads us to propose an operational theory of shape based on incremental contour deformations. The second intuition is that not all contours are shapes, but rather only those that can enclose "physical" material. A theory of contour deformation is derived from these principles, based on abstract conservation principles and Hamilton–Jacobi theory. These principles are based on the work of Sethian [82, 86], the Osher–Sethian level set formulation [65], the classical shock theory of Lax [53, 54], as well as curve evolution theory for a curve evolving as a function of the curvature and the relation to geometric smoothing of Gage–Hamilton–Grayson [32, 37]. The result is a characterization of th...
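The curve-evolution ingredient can be illustrated discretely: a Laplacian smoothing step on a closed polygon approximates motion along the normal at a rate set by curvature, and it monotonically shortens a convex curve. This is a numerical sketch of curvature-driven evolution only, not the paper's shock-based or level-set machinery:

```python
import numpy as np

def curvature_flow_step(curve, dt=0.1):
    """One explicit step of a discrete curve-shortening flow: each vertex
    moves toward the midpoint of its two neighbours, a finite-difference
    stand-in for motion proportional to curvature along the normal
    (the step size is an illustrative choice)."""
    left = np.roll(curve, 1, axis=0)
    right = np.roll(curve, -1, axis=0)
    return curve + dt * (0.5 * (left + right) - curve)

def perimeter(curve):
    edges = np.diff(np.vstack([curve, curve[:1]]), axis=0)
    return float(np.sum(np.hypot(edges[:, 0], edges[:, 1])))

theta = np.linspace(0.0, 2.0 * np.pi, 60, endpoint=False)
ellipse = np.stack([2.0 * np.cos(theta), np.sin(theta)], axis=1)

p0 = perimeter(ellipse)
curve = ellipse
for _ in range(50):
    curve = curvature_flow_step(curve)
p1 = perimeter(curve)   # shorter than p0: the curve shrinks and rounds out
```

The shocks studied in the paper arise precisely where such an evolving boundary develops singularities, which this smooth convex example never reaches.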