Results 1–10 of 13
Shape modeling with front propagation: A level set approach
 IEEE Transactions on Pattern Analysis and Machine Intelligence
, 1995
Abstract

Cited by 631 (17 self)
Shape modeling is an important constituent of computer vision as well as computer graphics research. Shape models aid the tasks of object representation and recognition. This paper presents a new approach to shape modeling which retains some of the attractive features of existing methods and overcomes some of their limitations. Our techniques can be applied to model arbitrarily complex shapes, which include shapes with significant protrusions, and to situations where no a priori assumption about the object's topology is made. A single instance of our model, when presented with an image having more than one object of interest, has the ability to split freely to represent each object. This method is based on the ideas developed by Osher and Sethian to model propagating solid/liquid interfaces with curvature-dependent speeds. The interface (front) is a closed, non-intersecting hypersurface flowing along its gradient field with constant speed or a speed that depends on the curvature. It is moved by solving a "Hamilton-Jacobi" type equation written for a function in which the interface is a particular level set. A speed term synthesized from the image is used to stop the interface in the vicinity of object boundaries. The resulting equation of motion is solved by employing entropy-satisfying upwind finite difference schemes. We present a variety of ways of computing the evolving front, including narrow bands, reinitializations, and different stopping criteria. The efficacy of the scheme is demonstrated with numerical experiments on some synthesized images and some low contrast medical images. Index Terms: Shape modeling, shape recovery, interface motion, level sets, hyperbolic conservation laws, Hamilton-Jacobi.
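The front evolution described in the abstract can be sketched with a minimal upwind level set update for a constant outward speed (a generic illustration of the Osher and Sethian scheme, not the paper's implementation; grid size, speed, time step, and step count below are arbitrary choices):

```python
import numpy as np

def evolve_level_set(phi, speed, dt, steps):
    """Advance phi by phi_t + F*|grad phi| = 0 with an upwind scheme (F >= 0).
    The zero level set of phi is the propagating front."""
    for _ in range(steps):
        # One-sided differences on a periodic unit grid.
        dx_m = phi - np.roll(phi, 1, axis=1)   # backward difference in x
        dx_p = np.roll(phi, -1, axis=1) - phi  # forward difference in x
        dy_m = phi - np.roll(phi, 1, axis=0)
        dy_p = np.roll(phi, -1, axis=0) - phi
        # Entropy-satisfying (Godunov) gradient magnitude for F >= 0.
        grad = np.sqrt(np.maximum(dx_m, 0) ** 2 + np.minimum(dx_p, 0) ** 2 +
                       np.maximum(dy_m, 0) ** 2 + np.minimum(dy_p, 0) ** 2)
        phi = phi - dt * speed * grad
    return phi

# Initial front: a circle of radius 10 embedded as a signed distance function.
n = 64
Y, X = np.mgrid[0:n, 0:n]
phi0 = np.sqrt((X - n / 2) ** 2 + (Y - n / 2) ** 2) - 10.0
phi1 = evolve_level_set(phi0, speed=1.0, dt=0.5, steps=10)
```

With a positive constant speed the zero level set moves outward, so the region enclosed by the front grows; a curvature-dependent or image-synthesized speed term would replace the constant `speed` here.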
Constructing Simple Stable Descriptions for Image Partitioning
, 1994
Abstract

Cited by 223 (5 self)
A new formulation of the image partitioning problem is presented: construct a complete and stable description of an image, in terms of a specified descriptive language, that is simplest in the sense of being shortest. We show that a descriptive language limited to a low-order polynomial description of the intensity variation within each region and a chain-code-like description of the region boundaries yields intuitively satisfying partitions for a wide class of images. The advantage of this formulation is that it can be extended to deal with subsequent steps of the image-understanding problem (or to deal with other image attributes, such as texture) in a natural way by augmenting the descriptive language. Experiments performed on a variety of both real and synthetic images demonstrate the superior performance of this approach over partitioning techniques based on clustering vectors of local image attributes and standard edge-detection techniques.
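The "shortest description" criterion can be illustrated in one dimension: score a candidate partition by a fixed cost per boundary plus the cost of encoding each region's residual under a constant (order-0 polynomial) fit. The code-length formula, per-boundary cost, and variance floor below are illustrative stand-ins, not the paper's coding scheme:

```python
import numpy as np

def description_length(signal, breaks, bits_per_break=8.0):
    """Two-part code length of a partition: a fixed cost per boundary plus
    a Gaussian code length for each region's residual around its mean.
    The +1.0 variance floor is an illustrative quantization term."""
    bits = bits_per_break * len(breaks)
    for seg in np.split(np.asarray(signal, dtype=float), breaks):
        var = seg.var() + 1.0
        bits += 0.5 * len(seg) * np.log2(2 * np.pi * np.e * var)
    return bits

# A step signal with one true region boundary at index 50.
rng = np.random.default_rng(0)
sig = np.concatenate([np.zeros(50), 10.0 * np.ones(50)]) + 0.1 * rng.normal(size=100)
dl_split = description_length(sig, [50])  # partition at the true edge
dl_plain = description_length(sig, [])    # no partition
```

The partition at the true boundary yields a shorter total description, which is the sense in which the simplest complete description selects the intuitively correct segmentation.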
A Fast Level Set based Algorithm for Topology-Independent Shape Modeling
 Journal of Mathematical Imaging and Vision, special issue on Topology and
Abstract

Cited by 33 (1 self)
Shape modeling is an important constituent of computer vision as well as computer graphics research. Shape models aid the tasks of object representation and recognition. This paper presents a new approach to shape modeling which retains the most attractive features of existing methods, and overcomes their prominent limitations. Our technique can be applied to model arbitrarily complex shapes, which include shapes with significant protrusions, and to situations where no a priori assumption about the object's topology is made. A single instance of our model, when presented with an image having more than one object of interest, has the ability to split freely to represent each object. This method is based on the ideas developed by Osher and Sethian to model propagating solid/liquid interfaces with curvature-dependent speeds. The interface (front) is a closed, non-intersecting hypersurface flowing along its gradient field with constant speed or a speed that depends on the curvature...
Optimal Control of Flow With Discontinuities
 Journal of Computational Physics
, 2003
Abstract

Cited by 15 (1 self)
Optimal control of the 1D Riemann problem of Euler equations, whose solution is characterized by discontinuities, is carried out by both nonsmooth and smooth optimization methods. By matching a desired flow to the numerical model for a given time window we effectively change the location of discontinuities. The control parameters are chosen to be the initial values for the pressure and density fields. Existence of solutions for the optimal control problem is proven. A high resolution model and a model with artificial viscosity, implementing two different numerical methods, are used to solve the forward model. The cost functional is the weighted difference between the numerical values and the observations for pressure, density and velocity. The observations are constructed from the analytical solution. We consider either distributed observations in time or observations calculated at the end of the assimilation window. We consider two different time horizons and two sets of observations. The gradient (respectively, a subgradient) of the cost functional, obtained from the adjoint of the discrete forward model, is employed for the smooth (respectively, nonsmooth) minimization algorithm. Discontinuity detection improves the performance of the minimizer for the model with artificial viscosity by selecting the points where the shock occurs (these points are then removed from the cost functional and its gradient). The numerical flow obtained with the optimal initial conditions from the nonsmooth minimization matches the observations very well. The algorithm for smooth minimization converges for the shorter time horizon but fails to perform satisfactorily for the longer time horizon.
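The adjoint-gradient machinery can be sketched on a linear toy model: a few upwind advection steps as the forward model, with the gradient of a least-squares cost obtained by propagating the residual backwards through the transposed (adjoint) model. This is an illustrative stand-in for the Euler equations; the grid size, CFL number, step counts, and learning rate are arbitrary choices:

```python
import numpy as np

def forward(u0, A, k):
    """k steps of the discrete forward model u <- A u."""
    u = u0
    for _ in range(k):
        u = A @ u
    return u

def cost_and_grad(u0, A, k, obs):
    """J(u0) = 0.5*||forward(u0) - obs||^2; the gradient is the residual
    run backwards through the adjoint (transposed) model."""
    u = forward(u0, A, k)
    lam = u - obs               # adjoint "initial" condition at final time
    for _ in range(k):
        lam = A.T @ lam         # adjoint model: transpose of the forward step
    return 0.5 * float(np.sum((u - obs) ** 2)), lam

# Periodic upwind advection matrix: u_i <- (1-c)*u_i + c*u_{i-1}, CFL c = 0.5.
n, c, k = 32, 0.5, 5
A = (1 - c) * np.eye(n) + c * np.roll(np.eye(n), -1, axis=1)

x = np.arange(n)
true0 = np.sin(2 * np.pi * x / n)   # "true" initial condition
obs = forward(true0, A, k)          # synthetic observations at the final time
u0 = np.zeros(n)                    # first guess for the control variable
for _ in range(200):
    J, g = cost_and_grad(u0, A, k, obs)
    u0 -= 0.5 * g                   # steepest descent on the initial state
```

Here the control is the initial state and observations are taken at the end of the window, mirroring the abstract's setup; the nonsmooth subgradient machinery and shock detection have no analogue in this smooth linear toy.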
Shape from Darkness Under Error
, 1996
Abstract

Cited by 13 (0 self)
This work presents a new and robust linear programming-based method for recovering surface information from the shadows that an object casts (on itself or elsewhere), even when the input imagery is errorful and noisy. We review the four basic constraints that arise in binary shadow imagery, and we demonstrate that noise leads to inconsistent constraints which cause infinite cycles of constraint propagation. We classify these noise errors into mutual shadowers, mutual failing shadowers, and single-point inconsistencies, and we show that they tend to grow in number as imagery increases; further, we show that finding a consistent subset of "good" pixels is an NP-complete problem. We therefore reformulate the problem (in one dimension) as a modified version of a linear programming task. We recast the constraints as linear inequalities, and a new class of variables, shift variables, is used to warp the imagery. By careful constructio...
Optimal Local Weighted Averaging Methods in Contour Smoothing
, 1997
Abstract

Cited by 6 (0 self)
In several applications where binary contours are used to represent and classify patterns, smoothing must be performed to attenuate noise and quantization error. This is often implemented with local weighted averaging of contour point coordinates, because of the simplicity, low cost and effectiveness of such methods. Invoking the "optimality" of the Gaussian filter, many authors will use Gaussian-derived weights. But generally these filters are not optimal, and there has been little theoretical investigation of local weighted averaging methods per se. This paper focuses on the direct derivation of optimal local weighted averaging methods tailored towards specific computational goals such as the accurate estimation of contour point positions, tangent slopes, or deviation angles. A new and simple digitization noise model is proposed to derive the best set of weights for different window sizes, for each computational task. Estimates of the fraction of the noise actually removed by these ...
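Local weighted averaging of contour coordinates amounts to a circular convolution of the point list with a weight window. The triangular window below is an arbitrary illustration, not one of the paper's derived task-specific optimal weight sets:

```python
import numpy as np

def smooth_contour(points, weights):
    """Local weighted averaging of closed-contour coordinates.
    points: (N, 2) array of (x, y); weights: symmetric window,
    normalized to sum to 1. Circular shifts treat the contour as closed."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    k = len(w) // 2
    out = np.zeros_like(points, dtype=float)
    for j, wj in enumerate(w):
        out += wj * np.roll(points, j - k, axis=0)
    return out

# A noisy circular contour: unit circle plus coordinate noise.
rng = np.random.default_rng(1)
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
clean = np.c_[np.cos(t), np.sin(t)]
noisy = clean + 0.05 * rng.normal(size=clean.shape)
smoothed = smooth_contour(noisy, [1, 2, 3, 2, 1])
```

Averaging reduces the coordinate noise variance by the sum of squared weights, at the cost of a small curvature-dependent shrinkage bias, which is exactly the trade-off the paper's derived weights are optimized over.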
The Viterbi Optimal Runlength-Constrained Approximation Nonlinear Filter
, 1995
Abstract

Cited by 6 (3 self)
Simple nonlinear filters are often used to enforce "hard" syntactic constraints while remaining close to the observation data; e.g., in the binary case it is common practice to employ iterations of a suitable median, or a one-pass recursive median, open-close, or clos-open filter to impose a minimum symbol run-length constraint while remaining "faithful" to the observation. Unfortunately, these filters are, in general, suboptimal. Motivated by this observation, we pose the following optimization: Given a finite-alphabet sequence of finite extent, y = {y(n)}_{n=0}^{N-1}, find a sequence x̂ = {x̂(n)}_{n=0}^{N-1} which minimizes d(x, y) = Σ_{n=0}^{N-1} d_n(y(n), x(n)) subject to: x is piecewise constant of plateau run-length at least M. We show how a suitable reformulation of the problem naturally leads to a simple and efficient Viterbi-type optimal algorithmic solution. We call the resulting nonlinear input-output operator the Viterbi Optimal Runlength-Constrained Approximation...
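For the binary case with absolute-error cost, the constrained minimization above reduces to a shortest path over states (symbol, capped run length), solvable by a standard Viterbi recursion. A minimal sketch, where the per-sample cost d_n(y, x) = |y - x| and the binary alphabet are illustrative choices:

```python
def viterbi_runlength(y, M, alphabet=(0, 1)):
    """Return x minimizing sum_n |x(n) - y(n)| subject to every constant
    run of x (including the last) having length >= M.  State = (symbol
    index, run length capped at M); a symbol switch is allowed only once
    the current run has reached length M."""
    INF = float('inf')
    N = len(y)
    states = [(a, r) for a in range(len(alphabet)) for r in range(1, M + 1)]
    cost = {s: INF for s in states}
    back = [dict() for _ in range(N)]
    for a in range(len(alphabet)):
        cost[(a, 1)] = abs(alphabet[a] - y[0])
    for n in range(1, N):
        new = {s: INF for s in states}
        for (a, r), c in cost.items():
            if c == INF:
                continue
            # Continue the current run.
            nr = min(r + 1, M)
            cc = c + abs(alphabet[a] - y[n])
            if cc < new[(a, nr)]:
                new[(a, nr)] = cc
                back[n][(a, nr)] = (a, r)
            # Switch symbol only once the run is long enough.
            if r == M:
                for b in range(len(alphabet)):
                    if b == a:
                        continue
                    cb = c + abs(alphabet[b] - y[n])
                    if cb < new[(b, 1)]:
                        new[(b, 1)] = cb
                        back[n][(b, 1)] = (a, r)
        cost = new
    # The final run must also satisfy the constraint.
    s = min(((a, M) for a in range(len(alphabet))), key=lambda q: cost[q])
    x = []
    for n in range(N - 1, -1, -1):
        x.append(alphabet[s[0]])
        if n > 0:
            s = back[n][s]
    return x[::-1]

y = [0, 0, 1, 0, 0, 1, 1, 1, 0, 1, 1, 1]
x = viterbi_runlength(y, M=3)
```

For this `y` the isolated symbols at indices 2 and 8 must be absorbed into longer runs, and the recursion finds the cheapest way to do so, unlike iterated median or open-close filters, which carry no optimality guarantee.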
A Trajectory Observer for Camera-Based Underwater Motion Measurements
, 2004
Abstract

Cited by 1 (1 self)
This paper deals with the problem of estimating the trajectory of a vehicle or object moving underwater based on camera measurements. The proposed approach consists of a diffusion-based trajectory observer [1] processing whole segments of a trajectory at a time. Additionally, the observer contains a Tikhonov regularizer for smoothing the estimates. A method for including the camera measurements in an appropriate manner is then proposed.
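Tikhonov regularization of a sampled trajectory can be sketched as a penalized least-squares problem: minimize ||x - y||² + λ||Dx||² with D the first-difference operator, which reduces to one linear solve. This generic smoother illustrates only the regularization term, not the diffusion-based observer of [1]; the test signal, noise level, and λ are arbitrary:

```python
import numpy as np

def tikhonov_smooth(y, lam):
    """Minimize ||x - y||^2 + lam*||D x||^2 over x, where D is the
    first-difference operator; the minimizer solves
    (I + lam * D^T D) x = y."""
    n = len(y)
    D = np.diff(np.eye(n), axis=0)        # (n-1, n) difference matrix
    A = np.eye(n) + lam * (D.T @ D)
    return np.linalg.solve(A, np.asarray(y, dtype=float))

# A smooth 1-D trajectory component observed through noisy measurements.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 100)
true = np.sin(2 * np.pi * t)
meas = true + 0.2 * rng.normal(size=t.size)
est = tikhonov_smooth(meas, lam=10.0)
```

Each coordinate of a measured trajectory segment can be smoothed this way; processing the whole segment at once is what distinguishes this batch formulation from a sample-by-sample filter.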
An Experimental Investigation in the Use of Robust Statistics in Edge Detection
, 1993
Abstract
Edge detection is one of the fundamental algorithms in computer vision and image processing. In edge detection, abrupt intensity changes are detected at boundaries of regions and these changes are characterized into an edge map. Many edge detection schemes have been studied during the last 30 years. A fundamental conflict exists between the two performance requirements in edge detection: noise immunity and accurate localization. Most edge detection schemes assume that noise has a Gaussian distribution, and the least-sum-of-squares (LSS) method is used widely. In the presence of outliers, data which do not conform to any known models of noise, the performance of these detectors degrades significantly. Another issue is the lack of methods for quantitative performance evaluation and comparison of algorithms. Qualitative analysis based on human evaluation with no clearly stated standard is often used as a means to judge the goodness of an edge detector. This thesis addresses the above two is...
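The sensitivity of least-sum-of-squares estimates to outliers can be seen in a one-line experiment: estimate the step height across a window using means (the LSS location estimate) versus medians (one simple robust alternative). This is illustrative only, not the thesis's detector:

```python
import numpy as np

def edge_strength(window, robust=False):
    """Step height at the window centre: difference between location
    estimates of the right and left halves. The mean is the
    least-sum-of-squares estimate; the median is a robust alternative."""
    w = np.asarray(window, dtype=float)
    k = len(w) // 2
    est = np.median if robust else np.mean
    return abs(est(w[k:]) - est(w[:k]))

# A height-10 step with one gross outlier (a saturated pixel) on the left.
win = [0, 0, 255, 0, 0, 10, 10, 10, 10, 10]
lss = edge_strength(win)               # mean is pulled far from the true height
rob = edge_strength(win, robust=True)  # median recovers the true height
```

A single saturated pixel drags the LSS estimate far from the true step height, while the median-based estimate is unaffected, which is the motivation for bringing robust statistics into edge detection.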
Optimal control of flow with discontinuities
, 2003
Abstract
Optimal control of the 1D Riemann problem of Euler equations is studied, with the initial values for pressure and density as control parameters. The least-squares type cost functional employs either distributed observations in time or observations calculated at the end of the assimilation window. Existence of solutions for the optimal control problem is proven. Smooth and nonsmooth optimization methods employ the numerical gradient (respectively, a subgradient) of the cost functional, obtained from the adjoint of the discrete forward model. The numerical flow obtained with the optimal initial conditions from the nonsmooth minimization matches the observations very well. The algorithm for smooth minimization converges for the shorter time horizon but fails to perform satisfactorily for the longer time horizon, except when the observations corresponding to shocks are detected and removed.