Results 1–10 of 57
On the fitting of surfaces to data with covariances
 IEEE Trans. Pattern Anal. Mach. Intell.
, 2000
Abstract

Cited by 59 (16 self)
We consider the problem of estimating parameters of a model described by an equation of special form. Specific models arise in the analysis of a wide class of computer vision problems, including conic fitting and estimation of the fundamental matrix. We assume that noisy data are accompanied by (known) covariance matrices characterizing the uncertainty of the measurements. A cost function is first obtained by considering a maximum-likelihood formulation and applying certain necessary approximations that render the problem tractable. A novel, Newton-like iterative scheme is then generated for determining a minimizer of the cost function. Unlike alternative approaches such as Sampson's method or the renormalization technique, the new scheme has as its theoretical limit the minimizer of the cost function. Furthermore, the scheme is simply expressed, efficient, and unsurpassed as a general technique in our testing. An important feature of the method is that it can serve as a basis for conducting theoretical comparison of various estimation approaches.
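The Newton-like scheme described in this abstract (the fundamental numerical scheme, FNS) repeatedly solves an eigenvalue problem whose fixed point minimizes the approximated maximum-likelihood cost. A minimal NumPy sketch for conic fitting with per-point covariances is below; the function names, the fixed iteration count, and the isotropic-covariance setup are our assumptions, not the paper's exact algorithm.

```python
import numpy as np

def carrier(x, y):
    # Conic carrier vector u(x, y): theta^T u = 0 on the conic
    return np.array([x * x, x * y, y * y, x, y, 1.0])

def carrier_jac(x, y):
    # 6x2 Jacobian du/d(x, y), used to propagate point covariances
    return np.array([[2 * x, 0.0],
                     [y, x],
                     [0.0, 2 * y],
                     [1.0, 0.0],
                     [0.0, 1.0],
                     [0.0, 0.0]])

def fns_conic(points, cov, iters=20):
    """FNS-style iteration minimizing the approximated ML (AML) cost
    sum_i (theta^T A_i theta) / (theta^T B_i theta)."""
    A = [np.outer(carrier(x, y), carrier(x, y)) for x, y in points]
    B = [carrier_jac(x, y) @ cov @ carrier_jac(x, y).T for x, y in points]
    # Initialize with the plain algebraic least-squares estimate
    _, V = np.linalg.eigh(sum(A))
    theta = V[:, 0]
    for _ in range(iters):
        M = sum(Ai / (theta @ Bi @ theta) for Ai, Bi in zip(A, B))
        N = sum((theta @ Ai @ theta) / (theta @ Bi @ theta) ** 2 * Bi
                for Ai, Bi in zip(A, B))
        w, V = np.linalg.eigh(M - N)
        theta = V[:, np.argmin(np.abs(w))]  # eigenvalue closest to zero
    return theta / np.linalg.norm(theta)
```

At a fixed point, theta is a null vector of M - N, which is the first-order optimality condition of the AML cost; this is the sense in which the scheme's theoretical limit is the cost minimizer.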
An Information Fusion Framework for Robust Shape Tracking
, 2005
Abstract

Cited by 44 (8 self)
Existing methods for incorporating subspace model constraints in shape tracking use only partial information from the measurements and model distribution. We propose a unified framework for robust shape tracking, optimally fusing heteroscedastic uncertainties or noise from measurement, system dynamics, and a subspace model. The resulting nonorthogonal subspace projection and fusion are natural extensions of the traditional model constraint using orthogonal projection. We present two motion measurement algorithms and introduce alternative solutions for measurement uncertainty estimation. We build shape models offline from training data and exploit information from the ground truth initialization online through a strong model adaptation. Our framework is applied for tracking in echocardiograms where the motion estimation errors are heteroscedastic in nature, each heart has a distinct shape, and the relative motions of epicardial and endocardial borders reveal crucial diagnostic features. The proposed method significantly outperforms the existing shape-space-constrained tracking algorithm. Due to the complete treatment of heteroscedastic uncertainties, the strong model adaptation, and the coupled tracking of double contours, robust performance is observed even on the most challenging cases.
Robust Regression with Projection Based M-estimators
 In International Conference on Computer Vision
, 2003
Abstract

Cited by 32 (7 self)
The robust regression techniques in the RANSAC family are popular today in computer vision, but their performance depends on a user-supplied threshold. We eliminate this drawback of RANSAC by reformulating another robust method, the M-estimator, as a projection pursuit optimization problem. The projection-based pbM-estimator automatically derives the threshold from univariate kernel density estimates. Nevertheless, the performance of the pbM-estimator equals or exceeds that of RANSAC techniques tuned to the optimal threshold, a value which is never available in practice. Experiments were performed both with synthetic and real data in the affine motion and fundamental matrix estimation tasks.
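The key step, choosing the inlier threshold from a kernel density estimate of the projected residuals rather than from a user-supplied value, can be sketched as follows. The bandwidth rule and the mode-to-first-minimum search below are our own simplifications of the pbM idea, not the paper's exact procedure.

```python
import numpy as np

def kde(grid, samples, h):
    # Gaussian kernel density estimate evaluated on a grid
    z = (grid[:, None] - samples[None, :]) / h
    return np.exp(-0.5 * z ** 2).sum(axis=1) / (len(samples) * h * np.sqrt(2 * np.pi))

def auto_threshold(residuals):
    """Data-driven inlier threshold in the spirit of the pbM-estimator:
    locate the mode of the absolute-residual density, then take the first
    density minimum beyond it as the inlier/outlier boundary."""
    r = np.abs(np.asarray(residuals, dtype=float))
    mad = np.median(np.abs(r - np.median(r)))
    # Oversmoothed MAD-based bandwidth (heuristic choice, not pbM's rule)
    h = 3.0 * 1.4826 * mad * len(r) ** -0.2 + 1e-12
    grid = np.linspace(0.0, r.max(), 512)
    d = kde(grid, r, h)
    mode = int(np.argmax(d))
    # First local minimum after the mode separates inliers from outliers
    for i in range(mode + 1, len(grid) - 1):
        if d[i] <= d[i - 1] and d[i] <= d[i + 1]:
            return grid[i]
    return grid[-1]
```

With a clear gap between the inlier residual mode and the outlier mass, the returned threshold falls in that gap without any user tuning.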
Estimation of nonlinear errors-in-variables models for computer vision applications
 IEEE Trans. Pattern Anal. Mach. Intell.
, 2006
Abstract

Cited by 24 (4 self)
In an errors-in-variables (EIV) model, all the measurements are corrupted by noise. The class of EIV models with constraints separable into the product of two nonlinear functions, one solely in the variables and one solely in the parameters, is general enough to represent most computer vision problems. We show that the estimation of such nonlinear EIV models can be reduced to iteratively estimating a linear model having a point-dependent, i.e., heteroscedastic, noise process. Particular cases of the proposed heteroscedastic errors-in-variables (HEIV) estimator are related to other techniques described in the vision literature: the Sampson method, renormalization, and the fundamental numerical scheme. In a wide variety of tasks, the HEIV estimator performs as well as or better than these techniques, and it has a weaker dependence on the quality of the initial solution than the Levenberg-Marquardt method, the standard approach toward estimating nonlinear models. Index Terms—Nonlinear least squares, heteroscedastic regression, camera calibration, 3D rigid motion, uncalibrated vision.
Robust Regression for Data with Multiple Structures
 In 2001 IEEE Conference on Computer Vision and Pattern Recognition, volume I
, 2001
Abstract

Cited by 20 (3 self)
In many vision problems (e.g., stereo, motion) multiple structures can occur in the data, in which case several instances of the same model need to be recovered from a single data set. However, once the measurement noise becomes significantly large relative to the separation between the structures, the robust statistical methods commonly used in the vision community tend to fail. In this paper, we show that all these techniques are special cases of the general class of M-estimators with auxiliary scale, and explain their failure in the presence of noisy multiple structures. To cope with data containing multiple structures, the techniques native to vision (Hough and RANSAC) should be combined with the robust methods customary in statistics. The implications of our analysis are illustrated by introducing a simple procedure for 2D multi-structured data that is problematic for all current techniques.
Statistical efficiency of curve fitting algorithms
 Computational Statistics and Data Analysis
, 2004
Abstract

Cited by 20 (2 self)
We study the problem of fitting parametrized curves to noisy data. Under certain assumptions (known as Cartesian and radial functional models), we derive asymptotic expressions for the bias and the covariance matrix of the parameter estimates. We also extend Kanatani’s version of the Cramér-Rao lower bound, which he proved for unbiased estimates only, to more general estimates that include many popular algorithms (most notably, the orthogonal least squares and algebraic fits). We then show that the gradient-weighted algebraic fit is statistically efficient and describe all other statistically efficient algebraic fits.
From FNS to HEIV: A Link between Two Vision Parameter Estimation Methods
 IEEE Trans. Pattern Anal. Mach. Intell.
, 2004
Abstract

Cited by 15 (3 self)
Problems requiring accurate determination of parameters from image-based quantities arise often in computer vision. Two recent, independently developed frameworks for estimating such parameters are the FNS and HEIV schemes. Here, it is shown that FNS and a core version of HEIV are essentially equivalent, solving a common underlying equation via different means. The analysis is driven by the search for a nondegenerate form of a certain generalized eigenvalue problem, and effectively leads to a new derivation of the relevant case of the HEIV algorithm. This work may be seen as an extension of previous efforts to rationalize and interrelate a spectrum of estimators, including the renormalization method of Kanatani and the normalized eight-point method of Hartley. Index Terms—Statistical methods, maximum likelihood, (un)constrained minimization, fundamental matrix, epipolar equation.
A New Constrained Parameter Estimator For Computer Vision Applications
Abstract

Cited by 15 (3 self)
Previous work of the authors developed a theoretically well-founded scheme (FNS) for finding the minimiser of a class of cost functions. Various problems in video analysis, stereo vision, ellipse fitting, etc., may be expressed in terms of finding such a minimiser. However, in common with many other approaches, it is necessary to correct the minimiser as a post-process if an ancillary constraint is also to be satisfied. In this paper we develop the first integrated scheme (CFNS) for simultaneously minimising the cost function and satisfying the constraint. Preliminary experiments in the domain of fundamental-matrix estimation show that CFNS generates rank-2 estimates with smaller cost function values than rank-2-corrected FNS estimates. Furthermore, when compared with the Hartley-Zisserman Gold Standard method, CFNS is seen to generate results of comparable quality in a fraction of the time.
High accuracy fundamental matrix computation and its performance evaluation
 Proc. 17th British Machine Vision Conf. (BMVC 2006), vol. 1
, 2006
Abstract

Cited by 14 (10 self)
We compare the convergence performance of different numerical schemes for computing the fundamental matrix from point correspondences over two images. First, we state the problem and the associated KCR lower bound. Then, we describe the algorithms of three well-known methods: FNS, HEIV, and renormalization, to which we add Gauss-Newton iterations. For initial values, we test random choice, least squares, and Taubin’s method. Experiments using simulated and real images reveal different characteristics of each method. Overall, FNS exhibits the best convergence performance.
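The least-squares initialization compared here is, for the fundamental matrix, the plain (unnormalized) eight-point estimate: stack one linear constraint per correspondence and take the smallest right singular vector. A sketch under our own naming, without the rank-2 correction:

```python
import numpy as np

def fundamental_ls(x1, x2):
    """Least-squares (eight-point style) estimate of F from point
    correspondences, a common initial value for FNS/HEIV iterations.
    x1, x2: (N, 2) arrays of matching image points; returns a 3x3 F
    with unit Frobenius norm (not rank-2 corrected)."""
    h1 = np.c_[x1, np.ones(len(x1))]  # homogeneous coordinates
    h2 = np.c_[x2, np.ones(len(x2))]
    # Each row is the outer product x2 x1^T flattened, so that
    # A @ F.ravel() stacks the epipolar residuals x2^T F x1.
    A = np.stack([np.outer(p2, p1).ravel() for p1, p2 in zip(h1, h2)])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)
    return F / np.linalg.norm(F)
```

Because this minimizes a purely algebraic residual and is sensitive to coordinate scaling, practical pipelines precede it with Hartley normalization or replace it with Taubin's method, as the paper's initialization comparison suggests.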
What value covariance information in estimating vision parameters?
 In Eighth IEEE International Conference on Computer Vision (ICCV 2001)
, 2001
Abstract

Cited by 11 (3 self)
Many parameter estimation methods used in computer vision are able to utilise covariance information describing the uncertainty of data measurements. This paper considers the value of this information to the estimation process when applied to measured image point locations. Covariance matrices are first described and a procedure is then outlined whereby covariances may be associated with image features located via a measurement process. An empirical study is made of the conditions under which covariance information enables generation of improved parameter estimates. Also explored is the extent to which the noise should be anisotropic and inhomogeneous if improvements are to be obtained over covariance-free methods. Critical in this is the devising of synthetic experiments under which noise conditions can be precisely controlled. Given that covariance information is, in itself, subject to estimation error, tests are also undertaken to determine the impact of imprecise covariance information upon the quality of parameter estimates. Finally, an experiment is carried out to assess the value of covariances in estimating the fundamental matrix from real images.