Results 1–10 of 20

Estimation of relative camera positions for uncalibrated cameras
, 1992
Cited by 309 (24 self)
Abstract. This paper considers the determination of internal camera parameters from two views of a point set in three dimensions. A noniterative algorithm is given for determining the focal lengths of the two cameras, as well as their relative placement, assuming all other internal camera parameters to be known. It is shown that this is all the information that may be deduced from a set of image correspondences.

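The constraint such two-view focal-length recovery rests on can be illustrated numerically: a fundamental matrix upgraded with the correct focal length becomes an essential matrix, whose two nonzero singular values are equal. The sketch below is not the paper's noniterative algorithm, and all numbers are hypothetical; it merely uses that singular-value property to check a candidate focal length:

```python
import numpy as np

def essential_residual(F, f):
    """Upgrade a fundamental matrix F to a candidate essential matrix using
    an assumed focal length f (principal point at the origin), and measure
    its distance from a true essential matrix, whose two nonzero singular
    values must be equal."""
    K = np.diag([f, f, 1.0])
    E = K.T @ F @ K
    s = np.linalg.svd(E, compute_uv=False)
    return (s[0] - s[1]) / s[0]          # 0 for a valid essential matrix

# synthetic two-view setup (hypothetical numbers): E = [t]x R, F = K^-T E K^-1
c, s = np.cos(0.4), np.sin(0.4)
R = np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])                  # rotation about y
tx = np.array([[0, -0.1, 0.2], [0.1, 0, -1.0], [-0.2, 1.0, 0]])   # [t]x for t=(1,0.2,0.1)
K = np.diag([800.0, 800.0, 1.0])
F = np.linalg.inv(K).T @ (tx @ R) @ np.linalg.inv(K)

print(essential_residual(F, 800.0))   # ~0: correct focal length
print(essential_residual(F, 400.0))   # clearly nonzero: wrong focal length
```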
Linear Pushbroom Cameras
 IEEE Transactions on Pattern Analysis and Machine Intelligence
, 1994
Cited by 161 (6 self)
Modelling the push broom sensors commonly used in satellite imagery is quite difficult and computationally intensive due to the complicated motion of the orbiting satellite with respect to the rotating earth. In addition, the mathematical model is quite complex, involving orbital dynamics, and hence difficult to analyze. In this paper, a simplified model of a push broom sensor (the linear push broom model) is introduced. It has the advantage of computational simplicity while at the same time giving very accurate results compared with the full orbiting push broom model. Methods are given for solving the major standard photogrammetric problems for the linear push broom sensor. Simple noniterative solutions are given for the following problems: computation of the model parameters from ground-control points; determination of relative model parameters from image correspondences between two images; scene reconstruction given image correspondences and ground-control points. In addition, the linear push broom model leads to theoretical insights that will be approximately valid for the full model as well. The epipolar geometry of linear push broom cameras is investigated and shown to be totally different from that of a perspective camera. Nevertheless, a matrix analogous to the essential matrix of perspective cameras is shown to exist for linear push broom sensors. From this it is shown that a scene is determined up to an affine transformation from two views with linear push broom cameras. Keywords: push broom sensor, satellite image, essential matrix, photogrammetry, camera model. The research described in this paper has been supported by DARPA Contract #MDA97291 C0053. 1 Real push broom sensors are commonly used in satellite cameras, notably the SPOT satellite for the generatio...

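As a sketch of the model the abstract introduces (following the usual formulation of a linear pushbroom camera; symbols are chosen here for illustration): one image coordinate $u$ comes from the linear motion of the sensor and the other, $v$, from a perspective projection within the view plane, and both can be folded into a single $3 \times 4$ matrix $M$:

$$
\begin{pmatrix} u \\ wv \\ w \end{pmatrix} = M \begin{pmatrix} x \\ y \\ z \\ 1 \end{pmatrix}
$$

so $u$ is read off without a perspective division while $v = (wv)/w$ requires one, which is what makes the model linear along the scan direction yet perspective across it.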
Visual Control of Robot Manipulators: A Review
 Visual Servoing
, 1994
Cited by 63 (1 self)
This paper attempts to present a comprehensive summary of research results in the use of visual information to control robot manipulators and related mechanisms. An extensive bibliography is provided which also includes important papers from the elemental disciplines upon which visual servoing is based. The research results are discussed in terms of historical context, commonality of function, algorithmic approach and method of implementation. 1 Introduction This paper presents the history of, and reviews current research into, the use of visual information for the control of robot manipulators and mechanisms. Visual control of manipulators promises substantial advantages when working with targets whose position is unknown, or with manipulators which may be flexible or inaccurate. The reported use of visual information to guide robots, or more generally mechanisms, is quite extensive and encompasses manufacturing applications, teleoperation, missile tracking cameras, and fruit picking, as well...

Internal Camera Calibration using Rotation and Geometric Shapes
 AITR-1426, Master's Thesis, Massachusetts Institute of Technology, Artificial Intelligence Laboratory
, 1993
Cited by 23 (3 self)
This paper describes a simple method for internal camera calibration for computer vision systems. It is intended for use with medium- to wide-angle camera lenses. With modification, it can be used for longer focal lengths. This method is based on tracking image features through a sequence of images while the camera undergoes pure rotation. This method does not require a special calibration object. The location of the features relative to the camera or to each other need not be known. It is only required that the features can be located accurately in the image. This method can therefore be used both for laboratory calibration and for self-calibration in autonomous robots working in unstructured environments. The method works when features can be located to single-pixel accuracy, but subpixel accuracy should be used if available. In the ...

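The key relation behind rotation-only self-calibration of this kind is that, with no translation, the homography between the two views is depth-independent and conjugate to the rotation. A minimal sketch of that invariance (hypothetical intrinsics, not the thesis's actual estimation procedure):

```python
import numpy as np

def rotation_homography(K, R):
    """Inter-image homography induced by a pure rotation R of a camera
    with intrinsics K; with no translation it is independent of depth."""
    return K @ R @ np.linalg.inv(K)

# hypothetical intrinsics and a small rotation about the y axis
K = np.array([[700.0, 0, 320], [0, 700.0, 240], [0, 0, 1]])
c, s = np.cos(0.1), np.sin(0.1)
R = np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

H = rotation_homography(K, R)
# the invariance self-calibration exploits: H (K K^T) H^T = K K^T,
# so K can be recovered from homographies between rotated views
assert np.allclose(H @ K @ K.T @ H.T, K @ K.T)
```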
Reflectance Analysis for 3D Computer Graphics Model Generation
 GRAPHICAL MODELS AND IMAGE PROCESSING
, 1996
Cited by 17 (2 self)
For synthesizing realistic images of a real three-dimensional object, the reflectance properties of the object surface, as well as the object shape, need to be measured. This paper describes one approach to creating a three-dimensional object model with physically correct reflectance properties by observing a real object. The approach consists of three steps. First, a sequence of range images and color images is measured by rotating a real object on a rotary table with fixed viewing and illumination directions, and the object shape is obtained as a collection of triangular patches by merging the multiple range images. Second, using the recovered object shape, color pixel intensities of the color image sequence are mapped to the object surface and separated into diffuse and specular reflection components. Finally, the separated reflection components are used to estimate the parameters of the Lambertian reflection model and a simplified Torrance-Sparrow reflection model. We have successfully tested our approach using images of a real object. Synthesized images of the object under arbitrary illumination conditions are shown in this paper.

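A minimal sketch of the reflection model named in the abstract, assuming the commonly used simplified Torrance-Sparrow form (a Gaussian lobe in the half-angle, divided by the cosine of the viewing angle); the function name, vector arguments, and parameter values here are illustrative, not the paper's code:

```python
import numpy as np

def shade(n, l, v, k_d, k_s, sigma):
    """Lambertian term plus a simplified Torrance-Sparrow specular lobe,
    the two components the paper's parameter fit targets.
    n, l, v: surface normal, light direction, view direction."""
    n, l, v = (np.asarray(x, float) / np.linalg.norm(x) for x in (n, l, v))
    diffuse = k_d * max(np.dot(n, l), 0.0)          # Lambertian component
    h = (l + v) / np.linalg.norm(l + v)             # half-vector
    alpha = np.arccos(np.clip(np.dot(n, h), -1.0, 1.0))
    specular = k_s * np.exp(-alpha**2 / (2 * sigma**2)) / max(np.dot(n, v), 1e-6)
    return diffuse + specular

# head-on geometry: diffuse contributes k_d, specular contributes k_s
print(shade((0, 0, 1), (0, 0, 1), (0, 0, 1), k_d=0.5, k_s=0.3, sigma=0.2))  # 0.8
```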
Iterative Multi-Step Explicit Camera Calibration
 In Proc. of the 6th International Conference on Computer Vision
, 1998
Cited by 10 (0 self)
Perspective camera calibration has been a research subject for a large group of researchers over the last decades, and as a result several camera calibration methodologies can be found in the literature. However, only a small number of those methods base their approaches on the use of monoplane calibration points. This paper describes one such methodology, which uses monoplane calibration points to realize an explicit 3D camera calibration. To avoid the singularity in the calibration equations when monoplane calibration points are used, this method computes the calibration parameters in a multi-step procedure and requires a first-guess solution for the intrinsic parameters. These parameters are updated and their accuracy increased through an iterative procedure. A stability analysis as a function of the pose of the camera is presented. Camera pose view strategies for accurate camera orientation computation can be extracted from the pose view stability analysis. 1 Introducti...

Effects of Camera Alignment Errors on Stereoscopic Depth Estimates
Cited by 3 (0 self)
We present in this paper a new analysis of the relative sensitivity/importance of camera calibration/alignment parameters in the performance of stereoscopic depth reconstruction. This quantitative analysis provides formulae which relate different parameter errors to the 3D reconstruction measurements. The results of this analysis provide specifications of acceptable tolerances in individual calibration parameters for given 3D measurement error tolerances. This information is useful in designing practical stereoscopic vision systems. Keywords: stereoscopic vision, camera calibration/alignment. 1 Introduction Camera calibration is a very important issue that must be addressed when developing a practical stereoscopic vision system. For a given stereoscopic vision system, once stereo correspondence is successfully established, the accuracy of 3D measurements depends on the overall geometric structure of the model in relation to object distances. The geometric structure (and hence the error) de...

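The best-known of these error relations is easy to state for a rectified stereo pair: with focal length f, baseline B, and disparity d, depth is Z = fB/d, so a disparity error propagates quadratically with depth. A one-line first-order sketch (the rig numbers are hypothetical, not from the paper):

```python
def depth_error(f, B, d, dd):
    """First-order stereoscopic depth error: Z = f*B/d, so a disparity
    error dd maps to |dZ| ~= (Z**2 / (f*B)) * dd."""
    Z = f * B / d
    return (Z**2 / (f * B)) * dd

# hypothetical rig: f = 800 px, baseline 0.1 m, disparity 40 px (Z = 2 m);
# a half-pixel disparity error then costs about 0.025 m of depth
print(depth_error(800.0, 0.1, 40.0, 0.5))
```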
Registration of Multimodal Medical Images: Exploiting Sensor Relationships
, 1994
Cited by 2 (0 self)
This report gives the numerical procedure to mathematically transform between a rotation matrix and the corresponding quaternion so that the result will be insensitive to noise. First we determine the direction of the optical axis:
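A common noise-robust way to do the matrix-to-quaternion direction (not necessarily the report's exact procedure) is to branch on the largest of the four candidate diagonal combinations, so a square root is never taken of a small, cancellation-prone quantity:

```python
import numpy as np

def quat_from_rotation(R):
    """Rotation matrix -> unit quaternion (w, x, y, z).  The branch picks
    the largest diagonal combination first, so the square root is never of
    a small, noise-dominated quantity (Shepperd-style branching)."""
    t = np.trace(R)
    if t > 0:
        s = np.sqrt(t + 1.0) * 2
        q = np.array([0.25 * s,
                      (R[2, 1] - R[1, 2]) / s,
                      (R[0, 2] - R[2, 0]) / s,
                      (R[1, 0] - R[0, 1]) / s])
    else:
        i = int(np.argmax(np.diag(R)))        # largest diagonal element
        j, k = (i + 1) % 3, (i + 2) % 3
        s = np.sqrt(R[i, i] - R[j, j] - R[k, k] + 1.0) * 2
        q = np.empty(4)
        q[0] = (R[k, j] - R[j, k]) / s
        q[i + 1] = 0.25 * s
        q[j + 1] = (R[j, i] + R[i, j]) / s
        q[k + 1] = (R[k, i] + R[i, k]) / s
    return q / np.linalg.norm(q)

print(quat_from_rotation(np.eye(3)))                 # identity -> (1, 0, 0, 0)
print(quat_from_rotation(np.diag([-1.0, -1.0, 1.0])))  # pi about z -> (0, 0, 0, 1)
```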
Including the Effect of Calibration on Voxelization Errors
, 2002
This thesis characterizes the problem of relative camera calibration in the context of three-dimensional volumetric reconstruction. The general effects of camera calibration errors on different parameters of the projection matrix are well understood. In addition, calibration error and Euclidean world error for a single camera can be related via the inverse perspective projection. However, there has been little analysis of camera calibration for a large number of views and of how those errors directly influence the accuracy of recovered three-dimensional models. A specific analysis of how camera calibration error propagates to reconstruction errors in traditional voxel coloring algorithms is discussed. A review of the voxel coloring algorithm is included, and the general methods applied in the coloring algorithm are related to camera error. In addition, a specific, but common, experimental setup used to acquire real-world objects through voxel coloring is introduced. Methods for relative calibration for this specific setup are discussed, as well as a method to measure calibration error. An analysis of the effect of these errors on voxel coloring is presented, along with a discussion concerning the effects of the resulting world-space error.
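The photo-consistency test at the heart of voxel coloring, which calibration error perturbs, can be sketched as follows (the threshold and color values are illustrative; real implementations also handle occlusion and visibility ordering):

```python
import numpy as np

def photo_consistent(colors, threshold):
    """Voxel-coloring style consistency test: a voxel is kept only if the
    pixel colors it projects to in the unoccluded views agree.  Calibration
    error shifts those projections and inflates the color spread, which is
    how it propagates into carved (or wrongly kept) voxels."""
    colors = np.asarray(colors, dtype=float)   # one RGB sample per view
    spread = colors.std(axis=0).max()          # worst per-channel deviation
    return spread <= threshold

# two views agreeing on a red voxel vs. two views that disagree
print(photo_consistent([[200, 10, 10], [198, 12, 9]], threshold=5.0))    # True
print(photo_consistent([[200, 10, 10], [10, 200, 10]], threshold=5.0))   # False
```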