Results 1–10 of 31
High Accuracy Optical Flow Estimation Based on a Theory for Warping, 2004
Cited by 331 (39 self)
Abstract:
We study an energy functional for computing optical flow that combines three assumptions: a brightness constancy assumption, a gradient constancy assumption, and a discontinuity-preserving spatiotemporal smoothness constraint.
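Written out, an energy of this kind takes roughly the following form (a hedged reconstruction in standard warping notation, not quoted from the paper, with displacement field w = (u, v, 1)^T):

```latex
E(u,v) = \int_\Omega \Psi\!\left( |I(\mathbf{x}+\mathbf{w}) - I(\mathbf{x})|^2
       + \gamma \, |\nabla I(\mathbf{x}+\mathbf{w}) - \nabla I(\mathbf{x})|^2 \right) d\mathbf{x}
       \; + \; \alpha \int_\Omega \Psi\!\left( |\nabla_3 u|^2 + |\nabla_3 v|^2 \right) d\mathbf{x}
```

The first integral combines the brightness and gradient constancy assumptions (weighted by gamma); the second is the smoothness term, with alpha the regularization weight and nabla_3 the spatio-temporal gradient. A robust penalizer such as Psi(s^2) = sqrt(s^2 + eps^2) is what makes the smoothness constraint discontinuity-preserving.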
The Computation of Optical Flow, 1995
Cited by 235 (10 self)
Abstract:
Two-dimensional image motion is the projection of the three-dimensional motion of objects, relative to a visual sensor, onto its image plane. Sequences of time-ordered images allow the estimation of projected two-dimensional image motion as either instantaneous image velocities or discrete image displacements. These are usually called the optical flow field or the image velocity field. Provided that optical flow is a reliable approximation to two-dimensional image motion, it may then be used to recover the three-dimensional motion of the visual sensor (to within a scale factor) and the three-dimensional surface structure (shape or relative depth) through assumptions concerning the structure of the optical flow field, the three-dimensional environment and the motion of the sensor. Optical flow may also be used to perform motion detection, object segmentation, time-to-collision and focus of expansion calculations, motion-compensated encoding and stereo disparity measurement. We investiga...
Lucas/Kanade meets Horn/Schunck: Combining local and global optic flow methods
International Journal of Computer Vision, 2005
Cited by 157 (13 self)
Abstract:
Differential methods belong to the most widely used techniques for optic flow computation in image sequences. They can be classified into local methods such as the Lucas–Kanade technique or Bigün’s structure tensor method, and into global methods such as the Horn/Schunck approach and its extensions. Often local methods are more robust under noise, while global techniques yield dense flow fields. The goal of this paper is to contribute to a better understanding and the design of novel differential methods in four ways: (i) We juxtapose the role of smoothing/regularisation processes that are required in local and global differential methods for optic flow computation. (ii) This discussion motivates us to describe and evaluate a novel method that combines important advantages of local and global approaches: It yields dense flow fields that are robust against noise. (iii) Spatiotemporal and nonlinear extensions as well as multiresolution frameworks are presented for this hybrid method. (iv) We propose a simple confidence measure for optic flow methods that minimise energy functionals. It allows one to sparsify a dense flow field gradually, depending on the reliability required for the resulting flow. Comparisons with experiments from the literature demonstrate the favourable performance of the proposed methods and the confidence measure.
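The local (Lucas–Kanade) half of this comparison can be sketched in a few lines of NumPy: at each pixel, solve the 2×2 normal equations built from a small window of optic flow constraints. This is a minimal illustration under assumed finite-difference gradients, not the paper's combined local–global method (which adds the Horn/Schunck regularizer on top):

```python
import numpy as np

def lucas_kanade(I1, I2, window=7):
    """Sketch of local (Lucas-Kanade) flow: per-pixel least squares over
    a small window; no pyramids, no iteration, no regularization."""
    Ix = np.gradient(I1, axis=1)          # spatial derivatives of frame 1
    Iy = np.gradient(I1, axis=0)
    It = I2 - I1                          # temporal derivative
    half = window // 2
    u = np.zeros_like(I1)
    v = np.zeros_like(I1)
    for y in range(half, I1.shape[0] - half):
        for x in range(half, I1.shape[1] - half):
            sl = (slice(y - half, y + half + 1), slice(x - half, x + half + 1))
            ix, iy, it = Ix[sl].ravel(), Iy[sl].ravel(), It[sl].ravel()
            A = np.stack([ix, iy], axis=1)
            ATA = A.T @ A                 # 2x2 structure tensor of the window
            if np.linalg.det(ATA) > 1e-6: # skip pixels with an aperture problem
                u[y, x], v[y, x] = np.linalg.solve(ATA, -A.T @ it)
    return u, v
```

On a smooth synthetic pattern translated by half a pixel horizontally, this recovers a flow close to (0.5, 0) in the image interior; the global term the paper adds is what fills in the pixels this local sketch has to skip.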
Reliable Estimation of Dense Optical Flow Fields with Large Displacements, 2001
Cited by 105 (13 self)
Abstract:
In this paper we show that a classic optical flow technique by Nagel and Enkelmann (1986) can be regarded as an early anisotropic diffusion method with a diffusion tensor. We introduce three improvements into the model formulation that (i) avoid inconsistencies caused by centering the brightness term and the smoothness term in different images, (ii) use a linear scale-space focusing strategy from coarse to fine scales for avoiding convergence to physically irrelevant local minima, and (iii) create an energy functional that is invariant under linear brightness changes. Applying a gradient descent method to the resulting energy functional leads to a system of diffusion-reaction equations. We prove that this system has a unique solution under realistic assumptions on the initial data, and we present an efficient linear implicit numerical scheme in detail. Our method creates flow fields with 100% density over the entire image domain, it is robust under a large range of parameter variations, and it c...
Computing Optical Flow with Physical Models of Brightness Variation
Cited by 89 (1 self)
Abstract:
This paper exploits physical models of time-varying brightness in image sequences to estimate optical flow and physical parameters of the scene. Previous approaches handled violations of brightness constancy with the use of robust statistics or with generalized brightness constancy constraints that allow generic types of contrast and illumination changes. Here, we consider models of brightness variation that have time-dependent physical causes, namely, changing surface orientation with respect to a directional illuminant, motion of the illuminant, and physical models of heat transport in infrared images. We simultaneously estimate the optical flow and the relevant physical parameters. The estimation problem is formulated using total least squares (TLS), with confidence bounds on the parameters.
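The TLS step can be sketched minimally as follows (assuming only the standard brightness-constancy rows; the paper's physical brightness-change models would add further parameters as extra columns). Each pixel contributes a row (Ix, Iy, It) that brightness constancy makes orthogonal to (u, v, 1), so the flow is read off the singular vector of the smallest singular value:

```python
import numpy as np

def tls_flow(ix, iy, it):
    """Total-least-squares flow for one patch.

    Unlike ordinary least squares, which treats only It as noisy, TLS
    treats all three measurements as noisy and finds the direction the
    stacked rows are (nearly) orthogonal to.
    """
    M = np.stack([ix, iy, it], axis=1)          # one row per pixel
    _, _, Vt = np.linalg.svd(M, full_matrices=False)
    n = Vt[-1]                                  # null direction, proportional to (u, v, 1)
    return n[0] / n[2], n[1] / n[2]
```

The division by the third component normalizes the null direction so its last entry is 1, matching the (u, v, 1) convention; it is ill-conditioned only when that component is near zero, i.e. when no finite flow explains the data.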
A Theoretical Framework for Convex Regularizers in PDE-Based Computation of Image Motion, 2000
Cited by 84 (21 self)
Abstract:
Many differential methods for the recovery of the optic flow field from an image sequence can be expressed in terms of a variational problem where the optic flow minimizes some energy. Typically, these energy functionals consist of two terms: a data term, which requires e.g. that a brightness constancy assumption holds, and a regularizer that encourages global or piecewise smoothness of the flow field. In this paper we present a systematic classification of rotation invariant convex regularizers by exploring their connection to diffusion filters for multichannel images. This taxonomy provides a unifying framework for data-driven and flow-driven, isotropic and anisotropic, as well as spatial and spatiotemporal regularizers. While some of these techniques are classic methods from the literature, others are derived here for the first time. We prove that all these methods are well-posed: they possess a unique solution that depends in a continuous way on the initial data. An interesting structural relation between isotropic and anisotropic flow-driven regularizers is identified, and a design criterion is proposed for constructing anisotropic flow-driven regularizers in a simple and direct way from isotropic ones. Its use is illustrated by several examples.
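The family of energies being classified can be summarised (in standard notation, not quoted from the paper) as a quadratic data term plus a convex regularizer V on the flow gradients:

```latex
E(u,v) = \int_\Omega \left( (I_x u + I_y v + I_t)^2
       + \alpha \, V(\nabla u, \nabla v) \right) d\mathbf{x}
```

Different choices of V reproduce the categories named above: V = |nabla u|^2 + |nabla v|^2 gives the homogeneous (Horn/Schunck) regularizer, while a robust V = Psi(|nabla u|^2 + |nabla v|^2) gives isotropic flow-driven regularization; data-driven and anisotropic variants steer the smoothing with image or flow structure instead.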
Highly accurate optic flow computation with theoretically justified warping
International Journal of Computer Vision, 2006
Bayesian estimation of layers from multiple images
In Seventh European Conference on Computer Vision (ECCV 2002), volume III, 2002
Overparameterized variational optical flow
 International Journal of Computer Vision
Cited by 22 (1 self)
Abstract:
We introduce a novel optical flow estimation process based on a spatiotemporal model with varying coefficients multiplying a set of basis functions at each pixel. Previous optical flow estimation methodologies did not use such an overparameterized representation of the flow field, as the problem is ill-posed even without introducing any additional parameters: neighborhood-based methods like Lucas–Kanade determine the flow in each pixel by constraining the flow to be constant in a small area. Modern variational methods represent the optic flow directly via its x and y components at each pixel. The benefit of overparameterization becomes evident in the smoothness term, which instead of directly penalizing changes in the optic flow, integrates a cost on the deviation from the assumed optic flow model. Previous variational optical flow techniques are special cases of the proposed method, used in conjunction with a constant flow basis function. Experimental results with the novel flow estimation process yielded significant improvements with respect to the best results published so far.
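In symbols (a hedged sketch with generic per-component basis functions, not the paper's exact notation), each flow component is expanded over spatially varying coefficients:

```latex
u(\mathbf{x}) = \sum_{i=1}^{n} a_i(\mathbf{x}) \, \phi_i(\mathbf{x}), \qquad
v(\mathbf{x}) = \sum_{i=1}^{n} b_i(\mathbf{x}) \, \phi_i(\mathbf{x})
```

The smoothness term then penalizes variation of the coefficients a_i, b_i rather than of u and v directly, so flow that follows the assumed model exactly incurs no cost. With n = 1 and the constant basis phi_1 = 1, the coefficients are the flow components and the classical variational formulation is recovered.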
Multiscale 3D scene flow from binocular stereo sequences
In WACV/MOTION, 2005
Cited by 12 (0 self)
Abstract:
Scene flow methods estimate the three-dimensional motion field for points in the world, using multi-camera video data. Such methods combine multi-view reconstruction with motion estimation. This paper describes an alternative formulation for dense scene flow estimation that provides reliable results using only two cameras by fusing stereo and optical flow estimation into a single coherent framework. Internally, the proposed algorithm generates probability distributions for optical flow and disparity. Taking into account the uncertainty in the intermediate stages allows for more reliable estimation of the 3D scene flow than previous methods allow. To handle the aperture problems inherent in the estimation of optical flow and disparity, a multi-scale method along with a novel region-based technique is used within a regularized solution. This combined approach both preserves discontinuities and prevents over-regularization – two problems commonly associated with the basic multi-scale approaches. Experiments with synthetic and real test data demonstrate the strength of the proposed approach.