Results 1–10 of 73
Performance of optical flow techniques
International Journal of Computer Vision, 1994
Abstract

Cited by 1045 (32 self)
While different optical flow techniques continue to appear, there has been a lack of quantitative evaluation of existing methods. For a common set of real and synthetic image sequences, we report the results of a number of regularly cited optical flow techniques, including instances of differential, matching, energy-based and phase-based methods. Our comparisons are primarily empirical, and concentrate on the accuracy, reliability and density of the velocity measurements; they show that performance can differ significantly among the techniques we implemented.
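The accuracy measure popularized by this comparison is the angular error between estimated and true flow, in which each 2-D velocity (u, v) is treated as the 3-D direction (u, v, 1). A minimal numpy sketch of that metric (the function name and array layout are our own):

```python
import numpy as np

def angular_error_deg(u_est, v_est, u_true, v_true):
    """Angular error between flow fields, in degrees.

    Each 2-D velocity (u, v) is embedded as the 3-D direction (u, v, 1);
    the error is the angle between the normalized directions.
    """
    est = np.stack([u_est, v_est, np.ones_like(u_est)], axis=-1)
    true = np.stack([u_true, v_true, np.ones_like(u_true)], axis=-1)
    est = est / np.linalg.norm(est, axis=-1, keepdims=True)
    true = true / np.linalg.norm(true, axis=-1, keepdims=True)
    cos = np.clip(np.sum(est * true, axis=-1), -1.0, 1.0)
    return np.degrees(np.arccos(cos))

# identical flows give zero error; unit flows along x and y give 60 degrees
# under this embedding, since (1,0,1) and (0,1,1) meet at arccos(1/2)
err_same = angular_error_deg(np.array([1.0]), np.array([0.5]),
                             np.array([1.0]), np.array([0.5]))
err_orth = angular_error_deg(np.array([1.0]), np.array([0.0]),
                             np.array([0.0]), np.array([1.0]))
```

The (u, v, 1) embedding keeps the measure finite even where the true speed is zero, which is why the study uses it instead of a raw vector-angle error.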
The Computation of Optical Flow
1995
Abstract

Cited by 216 (10 self)
Two-dimensional image motion is the projection of the three-dimensional motion of objects, relative to a visual sensor, onto its image plane. Sequences of time-ordered images allow the estimation of projected two-dimensional image motion as either instantaneous image velocities or discrete image displacements. These are usually called the optical flow field or the image velocity field. Provided that optical flow is a reliable approximation to two-dimensional image motion, it may then be used to recover the three-dimensional motion of the visual sensor (to within a scale factor) and the three-dimensional surface structure (shape or relative depth) through assumptions concerning the structure of the optical flow field, the three-dimensional environment and the motion of the sensor. Optical flow may also be used to perform motion detection, object segmentation, time-to-collision and focus-of-expansion calculations, motion-compensated encoding and stereo disparity measurement. We investiga...
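As one concrete instance of the time-to-collision and focus-of-expansion calculations mentioned above: under the idealized model of pure translation toward the scene, the flow field is radial, v(x) = (x − x_FOE)/τ, and both the focus of expansion x_FOE and the time to collision τ can be recovered by linear least squares. A hedged numpy sketch under that assumption (not any specific paper's method):

```python
import numpy as np

def foe_and_ttc(points, flows):
    """Estimate focus of expansion and time-to-collision from a radial
    flow field, assuming the idealized model v = (x - x_foe) / tau.

    Rewriting as tau * v + x_foe = x makes it linear in (tau, x_foe).
    points, flows: (N, 2) arrays of image positions and flow vectors.
    """
    n = len(points)
    A = np.zeros((2 * n, 3))        # unknowns: [tau, foe_x, foe_y]
    b = points.reshape(-1)          # interleaved (x, y) coordinates
    A[0::2, 0] = flows[:, 0]
    A[0::2, 1] = 1.0
    A[1::2, 0] = flows[:, 1]
    A[1::2, 2] = 1.0
    tau, fx, fy = np.linalg.lstsq(A, b, rcond=None)[0]
    return (fx, fy), tau

# synthetic radial field expanding from (3, 4) with tau = 10
pts = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [7.0, 7.0]])
flo = (pts - np.array([3.0, 4.0])) / 10.0
foe, ttc = foe_and_ttc(pts, flo)
```

With noise-free radial flow the least-squares system is exact; real flow fields would require robust fitting and a richer motion model.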
Mixture Models for Optical Flow Computation
1993
Abstract

Cited by 146 (16 self)
The computation of optical flow relies on merging information available over an image patch to form an estimate of image velocity at a point. This merging process raises a host of issues, which include the treatment of outliers in component velocity measurements and the modeling of multiple motions within a patch which arise from occlusion boundaries or transparency. We present a new approach for dealing with these issues, which is based ... (Proc. CVPR '93, New York, June 1993. Figure 2: Multiple motion constraint lines for the region in Figure 1.)
Phase-Based Disparity Measurement
CVGIP: Image Understanding, 1991
Abstract

Cited by 93 (6 self)
The measurement of image disparity is a fundamental precursor to binocular depth estimation. Recently, Jenkin and Jepson (1988) and Sanger (1988) described promising methods based on the output phase behaviour of bandpass Gabor filters. Here we discuss further justification for such techniques based on the stability of bandpass phase behaviour as a function of typical distortions that exist between left and right views. In addition, despite this general stability, we show that phase signals are occasionally very sensitive to spatial position and variations in scale, in which cases incorrect measurements occur. We find that the primary cause for this instability is the existence of singularities in phase signals. With the aid of the local frequency of the filter output (provided by the phase derivative) and the local amplitude information, the regions of phase instability near the singularities are detected so that potentially incorrect measurements can be identified. In addition, we ...
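The core mechanism the abstract describes can be sketched in one dimension: disparity is the left/right phase difference of a bandpass Gabor response divided by the filter's frequency, with low-amplitude points (near phase singularities) flagged as unreliable. The filter parameters and threshold below are illustrative, not taken from the paper:

```python
import numpy as np

def gabor_response(signal, omega, sigma=8.0):
    """Complex response of a 1-D Gabor filter with center frequency omega."""
    t = np.arange(-4 * sigma, 4 * sigma + 1)
    kernel = np.exp(-t**2 / (2 * sigma**2)) * np.exp(1j * omega * t)
    return np.convolve(signal, kernel, mode='same')

def phase_disparity(left, right, omega):
    """Disparity from the left/right phase difference, d = dphi / omega.

    Low-amplitude points (near phase singularities) are marked invalid,
    in the spirit of the stability analysis described in the abstract.
    """
    rl = gabor_response(left, omega)
    rr = gabor_response(right, omega)
    dphi = np.angle(rl * np.conj(rr))
    disparity = dphi / omega
    amp = np.minimum(np.abs(rl), np.abs(rr))
    valid = amp > 1e-3 * max(np.abs(rl).max(), 1e-12)
    return disparity, valid

# right view is the left view shifted by 3 samples
x = np.arange(256)
left = np.sin(2 * np.pi * x / 16.0)
right = np.sin(2 * np.pi * (x - 3) / 16.0)
d, ok = phase_disparity(left, right, omega=2 * np.pi / 16.0)
```

Phase wrapping limits the measurable shift to |d| < π/ω, which is why such methods are typically run coarse-to-fine over filter scales.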
View-based interpretation of real-time optical flow for gesture recognition
1998
Abstract

Cited by 88 (11 self)
We have developed a real-time, view-based gesture recognition system. Optical flow is estimated and segmented into motion blobs. Gestures are recognized using a rule-based technique based on characteristics of the motion blobs, such as relative motion and size. Parameters of the gesture (e.g., frequency) are then estimated using context-specific techniques. The system has been applied to create an interactive environment for children.
A Tensor Framework for Multidimensional Signal Processing
Linköping University, Sweden, 1994
Abstract

Cited by 53 (8 self)
About the cover: The figure on the cover shows a visualization of a symmetric tensor in three dimensions, G = λ1 ê1ê1^T + λ2 ê2ê2^T + λ3 ê3ê3^T. The object in the figure is the sum of a spear, a plate and a sphere. The spear describes the principal direction of the tensor, λ1 ê1ê1^T, where the length is proportional to the largest eigenvalue, λ1. The plate describes the plane spanned by the eigenvectors corresponding to the two largest eigenvalues, λ2(ê1ê1^T + ê2ê2^T). The sphere, with a radius proportional to the smallest eigenvalue, shows how isotropic the tensor is, λ3(ê1ê1^T + ê2ê2^T + ê3ê3^T). The visualization is done using AVS [WWW94]. I am very grateful to Johan Wiklund for implementing the tensor viewer module used. This thesis deals with filtering of multidimensional signals. A large part of the thesis is devoted to a novel filtering method termed "Normalized convolution". The method performs local expansion of a signal in a chosen filter basis which
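The spear/plate/sphere picture corresponds to a common additive splitting of a symmetric tensor by its sorted eigenvalues, G = (λ1 − λ2) ê1ê1^T + (λ2 − λ3)(ê1ê1^T + ê2ê2^T) + λ3 I, which regroups the same eigenterms. A small numpy sketch of that splitting (the function name is ours):

```python
import numpy as np

def spear_plate_sphere(G):
    """Split a symmetric 3x3 tensor into spear, plate and sphere parts.

    Uses the additive form
        G = (l1-l2) e1 e1^T + (l2-l3)(e1 e1^T + e2 e2^T) + l3 I,
    with eigenvalues sorted l1 >= l2 >= l3; the three parts sum to G.
    """
    vals, vecs = np.linalg.eigh(G)      # eigh returns ascending eigenvalues
    l3, l2, l1 = vals
    e3, e2, e1 = vecs.T
    spear = (l1 - l2) * np.outer(e1, e1)
    plate = (l2 - l3) * (np.outer(e1, e1) + np.outer(e2, e2))
    sphere = l3 * np.eye(3)
    return spear, plate, sphere

# a diagonal tensor with eigenvalues 3 >= 2 >= 1 along the axes
G = np.diag([3.0, 2.0, 1.0])
spear, plate, sphere = spear_plate_sphere(G)
```

The relative magnitudes (λ1 − λ2), (λ2 − λ3) and λ3 indicate how line-like, plane-like or isotropic the local signal structure is, which is what the cover visualization conveys geometrically.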
Probabilistic Detection and Tracking of Motion Boundaries
2000
Abstract

Cited by 52 (2 self)
We propose a Bayesian framework for representing and recognizing local image motion in terms of two basic models: translational motion and motion boundaries. Motion boundaries are represented using a nonlinear generative model that explicitly encodes the orientation of the boundary, the velocities on either side, the motion of the occluding edge over time, and the appearance/disappearance of pixels at the boundary. We represent the posterior probability distribution over the model parameters given the image data using discrete samples. This distribution is propagated over time using a particle filtering algorithm. To efficiently represent such a high-dimensional space we initialize samples using the responses of a low-level motion discontinuity detector. The formulation and computational model provide a general probabilistic framework for motion estimation with multiple, nonlinear models.
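The particle-filter propagation the abstract mentions can be illustrated with a generic one-dimensional SIR (sampling-importance-resampling) loop; this is a minimal sketch of the general algorithm, not the paper's motion-boundary model:

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, weights, observe, motion_std=0.1):
    """One predict/update/resample cycle of a basic SIR particle filter.

    `observe(x)` returns the likelihood of the measurement given state x.
    This is a generic 1-D illustration of the algorithm only.
    """
    # predict: diffuse particles under a simple random-walk motion model
    particles = particles + rng.normal(0.0, motion_std, size=particles.shape)
    # update: reweight each particle by the observation likelihood
    weights = weights * observe(particles)
    weights = weights / weights.sum()
    # resample: draw a new, equally weighted particle set
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    particles = particles[idx]
    weights = np.full(len(particles), 1.0 / len(particles))
    return particles, weights

# track a constant state x* = 2 with Gaussian observations around it
likelihood = lambda x: np.exp(-0.5 * ((x - 2.0) / 0.2) ** 2)
p = rng.uniform(-5, 5, size=2000)
w = np.full(2000, 1.0 / 2000)
for _ in range(30):
    p, w = particle_filter_step(p, w, likelihood)
```

The paper's contribution lies in the state itself (boundary orientation, side velocities, edge motion) and in seeding particles from a motion-discontinuity detector; the propagate/weight/resample skeleton above is the standard machinery underneath.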
Dense Image Registration through MRFs and Efficient Linear Programming
2008
Abstract

Cited by 42 (28 self)
In this paper we introduce a novel and efficient approach to dense image registration that does not require a derivative of the employed cost function. In this context the registration problem is formulated using a discrete Markov Random Field objective function. First, toward dimensionality reduction of the variables, we assume that the dense deformation field can be expressed using a small number of control points (a registration grid) and an interpolation strategy. The registration cost is then expressed as a discrete sum over image costs (using an arbitrary similarity measure) projected onto the control points, plus a smoothness term that penalizes local deviations of the deformation field according to a neighborhood system on the grid. Toward a discrete approach, the search space is quantized, resulting in a fully discrete model. In order to account for large deformations and produce results at a high resolution level, a multi-scale incremental approach is considered in which the optimal solution is iteratively updated; this is done through successive morphings of the source toward the target image. Efficient linear programming based on primal-dual principles is used to recover the lowest potential of the cost function. Very promising results on synthetic data with known deformations and on real data demonstrate the potential of our approach.
Linear subspace methods for recovering translation direction
 Spatial Vision in Humans and Robots
1993
Abstract

Cited by 39 (1 self)
The image motion field for an observer moving through a static environment depends on the observer's translational and rotational velocities along with the distances to surface points. Given such a motion field as input, we have recently introduced subspace methods for the recovery of the observer's motion and the depth structure of the scene. This class of methods involves splitting the equations describing the motion field into separate equations for the observer's translational direction, the rotational velocity, and the relative depths. The resulting equations can then be solved successively, beginning with the equations for the translational direction. Here we concentrate on this first step. In earlier work, a linear method was shown to provide a biased estimate of the translational direction. We discuss the source of this bias and show how it can be effectively removed. The consequence is that the observer's velocity and the relative depths to points in the scene can all be recovered by successively solving three linear problems.
Recursive Filters for Optical Flow
IEEE Transactions on Pattern Analysis and Machine Intelligence, 1995
Abstract

Cited by 38 (1 self)
Working toward efficient (real-time) implementations of optical flow methods, we have applied simple recursive filters to achieve temporal smoothing and differentiation of image intensity, and to compute 2-D flow from component velocity constraints using spatiotemporal least-squares minimization. Accuracy in simulation is similar to that obtained in the study by Barron et al. [3], while requiring much less storage, less computation, and shorter delays. 1 Introduction: Many methods exist for computing optic flow, but few currently run at frame rates on reasonably priced, conventional hardware. The goal of this paper is to outline simplifications to a successful gradient-based approach that reduce computational expense with little degradation in accuracy. Our specific concerns include temporal smoothing and differentiation of image intensity, and temporal integration of component velocity constraints to solve for 2-D velocity. More generally, we are working toward efficient implementations of different...
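The recursive temporal smoothing referred to here can be illustrated by a first-order exponential (IIR) filter, which stores a single smoothed frame per pixel instead of buffering a temporal window; a generic sketch, not the paper's exact filter:

```python
import numpy as np

def recursive_smooth(frames, alpha=0.5):
    """Causal first-order recursive temporal smoothing of an image sequence.

    s[t] = alpha * frame[t] + (1 - alpha) * s[t-1]
    Only the previous smoothed frame is stored, unlike FIR temporal
    smoothing, which buffers a whole window of frames.
    """
    state = frames[0].astype(float)
    out = [state.copy()]
    for f in frames[1:]:
        state = alpha * f + (1.0 - alpha) * state
        out.append(state.copy())
    return np.stack(out)

# a constant sequence is left unchanged; a step input converges toward 1
const = recursive_smooth(np.ones((5, 4, 4)))
step = recursive_smooth(np.concatenate([np.zeros((1, 2, 2)),
                                        np.ones((6, 2, 2))]))
```

The constant-storage, constant-work-per-frame property of such recursive filters is what yields the reduced memory and delay the abstract claims relative to FIR temporal smoothing.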