Results 1–9 of 9
A New Constrained Parameter Estimator For Computer Vision Applications
"... Previous work of the authors developed a theoretically wellfounded scheme (FNS) for finding the minimiser of a class of cost functions. Various problems in video analysis, stereo vision, ellipsefitting, etc, may be expressed in terms of finding such a minimiser. However, in common with many other ..."
Abstract

Cited by 18 (4 self)
Previous work of the authors developed a theoretically well-founded scheme (FNS) for finding the minimiser of a class of cost functions. Various problems in video analysis, stereo vision, ellipse fitting, etc., may be expressed in terms of finding such a minimiser. However, in common with many other approaches, it is necessary to correct the minimiser as a post-process if an ancillary constraint is also to be satisfied. In this paper we develop the first integrated scheme (CFNS) for simultaneously minimising the cost function and satisfying the constraint. Preliminary experiments in the domain of fundamental-matrix estimation show that CFNS generates rank-2 estimates with smaller cost function values than rank-2-corrected FNS estimates. Furthermore, when compared with the Hartley–Zisserman Gold Standard method, CFNS is seen to generate results of comparable quality in a fraction of the time.
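The rank-2 correction post-process that this abstract contrasts with CFNS is conventionally done by zeroing the smallest singular value of the estimated matrix. A minimal sketch of that standard correction (the function name is illustrative, not from the paper):

```python
import numpy as np

def enforce_rank2(F):
    """Project an estimated fundamental matrix onto the rank-2 manifold
    by zeroing its smallest singular value (the standard post-process
    correction applied to unconstrained estimates such as FNS output)."""
    U, s, Vt = np.linalg.svd(F)
    s[2] = 0.0  # drop the smallest singular value
    return U @ np.diag(s) @ Vt
```

This is the closest rank-2 matrix in Frobenius norm, which is exactly why it ignores the cost function being minimised and can degrade the estimate, motivating an integrated scheme like CFNS.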
Statistical optimization for geometric fitting: Theoretical accuracy analysis and high order error analysis
 Int. J. Comput. Vis
, 2008
"... A rigorous accuracy analysis is given to various techniques for estimating parameters of geometric models from noisy data for computer vision applications. First, it is pointed out that parameter estimation for vision applications is very different in nature from traditional statistical analysis and ..."
Abstract

Cited by 14 (8 self)
A rigorous accuracy analysis is given to various techniques for estimating parameters of geometric models from noisy data for computer vision applications. First, it is pointed out that parameter estimation for vision applications is very different in nature from traditional statistical analysis, and hence a different mathematical framework is necessary in such a domain. After general theories on estimation and accuracy are given, typical existing techniques are selected, and their accuracy is evaluated up to higher-order terms. This leads to a “hyperaccurate” method that outperforms existing methods.
Reasoning with uncertain points, straight lines, and straight line segments
 in 2D. ISPRS Journal of Photogrammetry and Remote Sensing 64
, 2009
"... Decisions based on basic geometric entities can only be optimal, if their uncertainty is propagated trough the entire reasoning chain. This concerns the construction of new entities from given ones, the testing of geometric relations between geometric entities, and the parameter estimation of geomet ..."
Abstract

Cited by 8 (1 self)
Decisions based on basic geometric entities can only be optimal if their uncertainty is propagated through the entire reasoning chain. This concerns the construction of new entities from given ones, the testing of geometric relations between geometric entities, and the parameter estimation of geometric entities based on spatial relations which have been found to hold. Basic feature extraction procedures often provide measures of uncertainty. Incorporating these uncertainties into the representation of geometric entities permits statistical testing, eliminates the necessity of specifying non-interpretable thresholds and enables statistically optimal parameter estimation. Using the calculus of homogeneous coordinates, the power of algebraic projective geometry can be exploited in these steps of image analysis. This review collects, discusses and evaluates the various representations of uncertain geometric entities in 2D together with their conversions. The representations are extended to achieve a consistent set of representations allowing geometric reasoning. The statistical testing of geometric relations is presented. Furthermore, a generic estimation procedure is provided for multiple uncertain geometric entities based on possibly correlated observed geometric entities and geometric constraints. Key words: spatial reasoning, uncertainty, homogeneous coordinates, geometric entities
A New Approach to Constrained Parameter Estimation Applicable to Some Computer Vision Problems
"... Previous work of the authors developed a theoretically wellfounded scheme (FNS) for finding the minimiser of a class of cost functions. Various problems in video analysis, stereo vision, ellipsefitting, etc, may be expressed in terms of finding such a minimiser. However, in common with many other ..."
Abstract

Cited by 8 (6 self)
Previous work of the authors developed a theoretically well-founded scheme (FNS) for finding the minimiser of a class of cost functions. Various problems in video analysis, stereo vision, ellipse fitting, etc., may be expressed in terms of finding such a minimiser. However, in common with many other approaches, it is necessary to correct the minimiser as a post-process if an ancillary constraint is also to be satisfied. In this paper we develop the first integrated scheme (CFNS) for simultaneously minimising the cost function and satisfying the constraint. Preliminary experiments in the domain of fundamental-matrix estimation show that CFNS generates rank-2 estimates with smaller cost function values than rank-2-corrected FNS estimates. Furthermore, when compared with the Hartley–Zisserman Gold Standard method, CFNS is seen to generate results of comparable quality in a fraction of the time.
A New Constrained Parameter Estimator: Experiments In Fundamental Matrix Computation
, 2002
"... In recent work the authors proposed a wideranging method for estimating parameters that constrain image feature locations and satisfy a constraint not involving image data. The present work illustrates the use of the method with experiments concerning estimation of the fundamental matrix. Result ..."
Abstract

Cited by 7 (5 self)
In recent work the authors proposed a wide-ranging method for estimating parameters that constrain image feature locations and satisfy a constraint not involving image data. The present work illustrates the use of the method with experiments concerning estimation of the fundamental matrix. Results are given for both synthetic and real images. It is demonstrated that the method gives results commensurate with, or superior to, previous approaches, with the advantage of being faster than comparable methods.
FNS and HEIV: relating two vision parameter estimation frameworks
 In Proc. 12th Int. Conf. Image Analysis and Processing
, 2003
"... Problems requiring accurate determination of parameters from imagebased quantities arise often in computer vision. Two recent, independently developed frameworks for estimating such parameters are the FNS and HEIV schemes. Here it is shown that FNS and a core version of HEIV are essentially equival ..."
Abstract

Cited by 1 (1 self)
Problems requiring accurate determination of parameters from image-based quantities arise often in computer vision. Two recent, independently developed frameworks for estimating such parameters are the FNS and HEIV schemes. Here it is shown that FNS and a core version of HEIV are essentially equivalent, solving a common underlying equation via different means. The analysis is driven by the search for a non-degenerate form of a certain generalised eigenvalue problem, and effectively leads to a new derivation of the relevant case of the HEIV algorithm. This work may be seen as an extension of previous efforts to rationalise and interrelate a spectrum of estimators, including the renormalisation method of Kanatani and the normalised eight-point method of Hartley.
A statistical rationalisation of Hartley’s normalised eight-point algorithm
 In Proc. 12th Int. Conf. Image Analysis and Processing
, 2003
"... The eightpoint algorithm of Hartley occupies an important place in computer vision, notably as a means of providing an initial value of the fundamental matrix for use in iterative estimation methods. In this paper, a novel explanation is given for the improvement in performance of the eightpoint al ..."
Abstract

Cited by 1 (1 self)
The eight-point algorithm of Hartley occupies an important place in computer vision, notably as a means of providing an initial value of the fundamental matrix for use in iterative estimation methods. In this paper, a novel explanation is given for the improvement in performance of the eight-point algorithm that results from using normalised data. A first step is singling out a cost function that the normalised algorithm acts to minimise. The cost function is then shown to be statistically better founded than the cost function associated with the non-normalised algorithm. This augments the original argument that improved performance is due to the better conditioning of a pivotal matrix. Experimental results are given that support the adopted approach. This work continues a wider effort to place a variety of estimation techniques within a coherent framework.
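The data normalisation analysed in this paper is, in Hartley's standard formulation, a similarity transform that moves the centroid of the points to the origin and scales them so that their mean distance from the origin is √2. A sketch under those conventions (the function name is illustrative):

```python
import numpy as np

def hartley_normalise(pts):
    """Hartley-style normalisation of 2D points: translate the centroid
    to the origin and scale so the mean distance from the origin is
    sqrt(2). Returns the normalised homogeneous points and the 3x3
    similarity transform T that produced them."""
    centroid = pts.mean(axis=0)
    mean_dist = np.linalg.norm(pts - centroid, axis=1).mean()
    s = np.sqrt(2.0) / mean_dist
    T = np.array([[s, 0.0, -s * centroid[0]],
                  [0.0, s, -s * centroid[1]],
                  [0.0, 0.0, 1.0]])
    pts_h = np.column_stack([pts, np.ones(len(pts))])
    return (T @ pts_h.T).T, T
```

The transform T is kept so that a fundamental matrix estimated from normalised points in both images can be denormalised afterwards.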
Simple, Fast and Accurate Estimation of the Fundamental Matrix Using the Extended Eight-Point Schemes
 21ST BRITISH MACHINE VISION CONFERENCE (BMVC)
, 2010
"... The eightpoint scheme is the simplest and fastest scheme for estimating the fundamental matrix (FM) from a number of noisy correspondences. As it ignores the fact that the FM must be singular, the resulting FM estimate is often inaccurate. Existing schemes that take the singularity constraint into ..."
Abstract
The eight-point scheme is the simplest and fastest scheme for estimating the fundamental matrix (FM) from a number of noisy correspondences. As it ignores the fact that the FM must be singular, the resulting FM estimate is often inaccurate. Existing schemes that take the singularity constraint into consideration are several times slower and significantly more difficult to implement and understand. This paper describes extended versions of the eight-point (8P) and the weighted eight-point (W8P) schemes that effectively take the singularity constraint into consideration without sacrificing the efficiency and the simplicity of both schemes. The proposed schemes are respectively called the extended eight-point scheme (E8P) and the extended weighted eight-point scheme (EW8P). The E8P scheme was experimentally found to give exactly the same results as Hartley's algebraic distance minimization scheme while being almost as fast as the simplest scheme (i.e., the 8P scheme). At the expense of extra calculations per iteration, the EW8P scheme permits the use of geometric cost functions and, more importantly, robust weighting functions. It was experimentally found to give near-optimal results while being 8–16 times faster than the more complicated schemes such as Levenberg–Marquardt schemes. The FM estimates obtained by the E8P and the EW8P schemes perfectly satisfy the singularity constraint, eliminating the need to enforce the rank-2 constraint in a post-processing step.
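The baseline 8P scheme this abstract builds on reduces to a single linear solve: each correspondence contributes one row of a design matrix via the epipolar constraint x₂ᵀ F x₁ = 0, and F is recovered as the right singular vector of the smallest singular value. A minimal unnormalised sketch (no singularity enforcement, as the abstract notes the plain 8P scheme omits it):

```python
import numpy as np

def eight_point(x1, x2):
    """Baseline (unnormalised) eight-point estimate of the fundamental
    matrix from >= 8 correspondences. Each pair (x1, x2) contributes one
    row of the design matrix A built from x2^T F x1 = 0; vec(F) is the
    null vector of A, taken as the last right singular vector."""
    A = np.array([[u2 * u1, u2 * v1, u2, v2 * u1, v2 * v1, v2, u1, v1, 1.0]
                  for (u1, v1), (u2, v2) in zip(x1, x2)])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)   # row-major, matching the row layout above
    return F / np.linalg.norm(F)
```

The resulting F is generally full rank, which is precisely the defect the singularity-constrained schemes above are designed to remove.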
A Consistency Result for the Normalized Eight-Point Algorithm
, 2007
"... A recently proposed argument to explain the improved performance of the eightpoint algorithm that results from using normalized data [IEEE Trans. Pattern Anal. Mach. Intell., 25(9):1172–1177, 2003] relies upon adoption of a certain model for statistical data distribution. Under this model, the cost ..."
Abstract
A recently proposed argument to explain the improved performance of the eight-point algorithm that results from using normalized data [IEEE Trans. Pattern Anal. Mach. Intell., 25(9):1172–1177, 2003] relies upon adoption of a certain model for statistical data distribution. Under this model, the cost function that underlies the algorithm operating on the normalized data is statistically more advantageous than the cost function that underpins the algorithm using unnormalized data. Here we extend this explanation by introducing a more refined, structured model for data distribution. Under the extended model, the normalized eight-point algorithm turns out to be approximately consistent in a statistical sense. The proposed extension provides a link between the existing statistical rationalization of the normalized eight-point algorithm and the approach of Mühlich and Mester for enhancing total least squares estimation methods via equilibration. Our contribution forms part of a wider effort to rationalize and interrelate foundational methods in vision parameter estimation.