### Table 2 - RMS errors for motion estimation of different

1998

"... In PAGE 19: ... This example illustrates the behavior of integrated region and point tracking under complex imaging conditions. Table2 gives the RMS estimate error produced by our tracking system for several test sequences, including the Park sequence shown in figure 6a. This latter sequence shows high RMS error, which we believe, is due to imaging distortions that occur in the trees as a result of the camera translation.... ..."

### Table 2: Motion estimate errors in degrees (columns: Motions, Rotation Errors, Translation Errors)

2001

"... In PAGE 15: ...4 Experiment: Motion Recovery from Real Images We simply tested our algorithm on a set of real images taken by a commercial pan-tilt camera. Figure 6 shows four images of a cubic corner with feature points, Figure 7 plots the estimated and hand measured actual camera location, and Table2 gives the errors between the estimated and measured motions. The camera is self-calibrated by Hartley apos;s method for a pure rotating camera.... ..."

Cited by 3

### Table 1: Motion estimation results.

"... In PAGE 6: ... Again, the dashed line corresponds to the expected performance of the algorithm established using Monte Carlo simulation. Table1 summarizes the additional motion estimation results obtained from processing the approach and descent sequences obtained using 50 or 500 features and linear or linear+nonlinear motion estimation For the 50 feature descent sequence and the linear motion estimation algorithm, the average translation error is 0.... In PAGE 6: ... The approach sequence takes slightly longer to process because the larger image requires more time to detect features. The results in Table1 , show that in general the addition of the nonlinear motion estimation algorithm does not improve the results of motion estimation all that much. This is because for vertical descent, the motion computed using the linear algorithm is very constrained, so the results are very close to those obtained using the nonlinear algorithm.... In PAGE 7: ...otions (e.g., orbital motion) the nonlinear algorithm will result in improved motion estimation and should be used. Table1 also shows that adding features (50 vs. 500) does not improve motion estimation all that much.... ..."

### Table 2. AC error statistics for proper motions.


"... In PAGE 2: ... for the corresponding declination zone. The values for AC as given in Table2 have been used for all stars of a given eld because the few images per AC star do not allow derivation of a meaningful error for individual stars. To a good approximation, the AC position error does not depend on magnitude over the range of magnitudes (7.... In PAGE 2: ...ange of magnitudes (7.5 to 11.5) considered here. Table2 also gives average values for the AC epoch, the estimated proper motion error, pm, and the er- ror contribution due to proper motion errors pmc to our 1997 epoch data. Individual observational epochs and Tycho Catalogue data have been used for each star for the following reductions and error statistics.... ..."

### Table 1: Motion estimate errors in degrees (columns: Frames, Rotation Errors, Translation Errors)

"... In PAGE 7: ... Even though the motion is rectilinear, relative scales still can be initialized by triangulation, because image measurements are noisy. Table1 shows the error between the estimated motion and the actual motion of the camera. It can be observed that the algorithm is able to recover the correct motion and that rotation estimates tend to be more accurate than translation estimates.... ..."

### Table 2: Results of camera motion estimations on 3 synthetic sequences of 200 images. The errors are averaged errors computed over each sequence.

2008

### Table 1. Stereoscopy and motion on real data

1996

"... In PAGE 4: ...tage (i.e., correspondence between image points in the first pair) is assumed given and at any time instant the estimated disparity field t was used for the joint motion and disparity estimation between frames at the next time instant. Results from this phase are summarized in Table1 , where LL refers to the mean square DFD between the two left images, RR to the mean square DFD between the two right images and RL to the mean square DFD between the images of the stereo- scopic pair. For the evaluation of the last quantity only points where Eq.... ..."

Cited by 1


### Table 1: MSE of p̂ at t = 2 for interlaced test image with different sets of synthetic motion parameters; marked lines show the error for estimation without acceleration (linear motion model).
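The caption's contrast between a linear (constant-velocity) motion model and one that includes acceleration can be illustrated with a toy prediction. This is a hedged sketch with made-up values, not the paper's estimator; the function names are ours:

```python
# Hedged sketch: predicted position p̂ at time t under a linear
# (constant-velocity) model vs. a model that also carries acceleration.
def predict_linear(p0, v, t):
    # constant-velocity model: p̂ = p0 + v*t
    return p0 + v * t

def predict_accel(p0, v, a, t):
    # constant-acceleration model: p̂ = p0 + v*t + (1/2)*a*t^2
    return p0 + v * t + 0.5 * a * t * t

# Toy motion parameters (illustrative only).
p0, v, a, t = 10.0, 2.0, 1.0, 2.0
true_p = predict_accel(p0, v, a, t)   # true position if motion accelerates
lin_p = predict_linear(p0, v, t)      # prediction that ignores acceleration
print((lin_p - true_p) ** 2)          # squared error of the linear model
```

When the synthetic motion truly accelerates, the linear model's prediction error grows with t^2, which is why the caption singles out the acceleration-free rows.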

### Table 7 Joint estimation of sectoral equations: Sectoral government expenditure and concessionary loan {Government expenditure data from GFS}

"... In PAGE 27: ...ime t (t=l,...,T). To estimate the above system of equations, we use the Generalized Method of Moments (GMM) technique as discussed in Hansen and Singleton [1982].21 Coefficient estimates and other statistics are reported in Table7 (using public expenditure data from GFS) and Table 8 (using public expenditure data from Easterly and Rebelo). To eliminate fixed or random effects, we differenced the foreign aid and government spending variables on the right-hand side in equation (10).... ..."