### Table 1: Categorization of Manifold Learning Methods

2007

"... In PAGE 2: ... Table 1: Categorization of Manifold Learning Methods. 2 Manifold Learning Methods and their connections to Distance Metric Learning. Manifold Learning approaches can be categorized along the following two dimensions: first, whether the learnt embedding is linear or nonlinear; and second, whether the structure to be preserved is global or local (see Table 1). Based on the analysis in section 1, all the linear methods in Table 1 except Multidimensional Scaling (MDS) learn an explicit linear projective mapping and can be interpreted as the problem of distance metric learning. MDS finds the low-rank projection that best preserves the inter-point distance matrix E.... ..."
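
The excerpt notes that MDS finds a low-rank embedding that best preserves the inter-point distance matrix. A minimal sketch of classical (metric) MDS, assuming NumPy; the double-centering and eigendecomposition form is the textbook construction, not code from the cited paper:

```python
import numpy as np

def classical_mds(D, k=2):
    """Classical (metric) MDS: embed n points in k dimensions so that
    pairwise Euclidean distances approximate the given matrix D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    B = -0.5 * J @ (D ** 2) @ J           # double-centered Gram matrix
    w, V = np.linalg.eigh(B)              # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]         # keep the top-k eigenpairs
    L = np.sqrt(np.maximum(w[idx], 0.0))  # clip small negative eigenvalues
    return V[:, idx] * L                  # n-by-k embedding
```

When D comes from points that genuinely lie in a k-dimensional Euclidean space, this recovers them exactly up to rotation and translation; otherwise it gives the best low-rank approximation of the Gram matrix.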

### Table 1: Embedding data sets into manifolds

"... In PAGE 3: ... To evaluate the accuracy of the manifolds obtained we used several measures. Table 1 compares the manifold with the best plane embedding in terms of (1) average error: the ratio of the objective function value (I) to the sum of squares of all the effective distances; (2) average expansion: the average expansion factor for pairs whose distance went up compared to the original; (3) average contraction: the average shrinking factor for pairs whose distance went down; and (4) maximum distortion: the product of maximum contraction (the maximum factor by which some edge length was reduced) and maximum expansion (the factor by which some edge length was... In PAGE 4: ... To test how well the learned manifold generalizes, we dropped at random 8% of the measured effective distances (edges) from the data sets, computed the manifold on the rest of the observations, and made predictions on the 8% not used in computing the manifold. The last column in Table 1 shows that the manifold prediction error is low on all the measures and is comparable to that on the full set of values. From this we conclude that the manifold captures and generalizes wireless connectivity accurately.... ..."
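
Measures (2)-(4) from the excerpt can be sketched as follows, assuming NumPy; the exact definitions in the cited paper may differ in detail (e.g. which pairs are counted), so treat this as an illustration of expansion/contraction ratios, not the paper's code:

```python
import numpy as np

def distortion_measures(D_orig, D_emb):
    """Expansion/contraction statistics over all off-diagonal pairs,
    using ratios r = embedded distance / original distance."""
    mask = ~np.eye(D_orig.shape[0], dtype=bool)   # skip zero diagonal
    r = D_emb[mask] / D_orig[mask]
    expanded = r[r > 1.0]
    contracted = r[r < 1.0]
    avg_expansion = expanded.mean() if expanded.size else 1.0
    avg_contraction = (1.0 / contracted).mean() if contracted.size else 1.0
    # maximum distortion: worst stretch times worst shrink
    max_distortion = r.max() * (1.0 / r).max()
    return avg_expansion, avg_contraction, max_distortion
```

A uniform scaling of all distances gives maximum distortion 1, since the worst stretch and worst shrink cancel; only non-uniform distortion pushes the product above 1.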

### Table 2. A summary of retrieval performance showing the effect of learning.

"... In PAGE 6: ....3 Number of samples vs. retrieval performance. Figure 6 shows the change in retrieval performance due to learning for the SR-AAD, MR-AAD, SR-SPRH, and MR-SPRH shape features. Table 2 shows a summary of performance evaluation results. Overall, the manifold learning increased the performance by about 5%, and the multiresolution method added another 5%, for an overall increase of about 10% for both AAD and SPRH.... In PAGE 6: ... A set of 256 uniformly distributed samples means that, on average, only one sample is taken per dimension. In both Table 2 and Figure 6, the numbers of learned models are limited to either 4000 or 5000 in spite of the corpus size of 11,818. This limitation is due to the large memory consumption of the RBF network regression algorithm we used, and also to the memory limitations of the MATLAB environment we used.... ..."

### Table 1: Manifold Variables

1995

"... In PAGE 6: ... For such an observer, the velocity is just the normal vector u^μ, and the acceleration of the normal vector is given by a^μ = u^ν ∇_ν u^μ = N^{-1} h^{μν} ∇_ν N. A summary of the notation described above is given in Table 1 of the appendix. 2.... In PAGE 38: ... Foliation of M along I allows us to define the lapse, N, and the shift, N^i. The various manifolds we will consider, and some of the tensors defined on them, are summarized in Table 1. We can construct the following densities on ... ..."

Cited by 3

### Table 1. Classification accuracies on the USPS database with different subsets, using our algorithm, GPC and SVMs both in the input space and in the spaces learned by LDA and GDA. GDA overfits and performs at chance levels. LDA discovers some of the structure of the manifold, but has more classification errors than D-GPLVM. All classifiers in the input space perform no better than chance.

2007

"... In PAGE 6: ... The dimensionality of the latent space is 1. In Table 1, we compare our algorithm to GPC and SVMs both in the input space and in the spaces learned by LDA and GDA. As expected, GDA overfits and performs at chance levels.... ..."

Cited by 1

### Table 1. Singular configuration manifolds

2002

"... In PAGE 14: ... The other approach is more elaborate and takes a subspace of the entire configuration space into consideration. Observing Table 1, it can be seen that every singularity manifold shrinks to a point or a finite set of points if only the appropriate components of q are taken.... ..."

### Table 2.1 Representations of subspace manifolds.

1998

Cited by 186

### Table 1. Translation of MANIFOLD constructs into Promela instructions.

"... In PAGE 8: ...2 The model of a MANIFOLD stream. The behaviour of a MANIFOLD application, with the exception of the events control mechanism, is a game of dynamically creating new instances of processes and connecting or reconnecting their ports through streams. We consider now an application in which streams are set up; we will then analyze the Promela model. However, we introduce another style in writing a Promela model: namely, we use macro definitions (described in Table 1) which we developed with the aim of helping the user and to render the syntactic flavour of a MANIFOLD application. The MANIFOLD application computes Fibonacci numbers.... In PAGE 12: ... These channels are passed as arguments of the control message cconnect. The code corresponding to what we have described is reported in Table 1 as the following macro definitions: PUT(INPORT), GET(OUTPORT). 4 Adequacy of the Promela models. In this section we try to address the problem of the adequacy of the Promela models we designed.... ..."
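
The Fibonacci application mentioned above is the classic example of stream-based coordination: processes know nothing of each other and communicate only through connected ports. A toy Python analogue of that wiring (not the paper's Promela model; the buffering scheme here is an assumption), using deques as stream buffers feeding an adder whose output loops back into the inputs:

```python
from collections import deque

def fib_stream_network(n):
    """Stream-style Fibonacci: two input 'ports' feed an adder process;
    the adder's output is reconnected back into both input streams."""
    a, b = deque([0]), deque([1])    # initial tokens on the two streams
    out = []
    for _ in range(n):
        x, y = a.popleft(), b.popleft()
        s = x + y                    # the adder process
        out.append(x)                # observable output stream
        a.append(y)                  # feedback connections: y -> a
        b.append(s)                  #                       s -> b
    return out
```

The point of the analogy is that the computation is entirely determined by the topology of the connections, mirroring how MANIFOLD separates coordination from computation.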

### Table 1. Invariant manifolds in the symmetric representation.

in Dynamics of Three Coupled Excitable Cells with D3 Symmetry, © World Scientific Publishing Company

1998

"... In PAGE 5: ... in the following way [Ashwin et al., 1990]. A real variable which corresponds to the average phase, that is: φ̄ = (φ_1 + φ_2 + φ_3)/3, and a complex variable ψ = φ_1 + e^{i2π/3} φ_2 + e^{i4π/3} φ_3. The invariant manifolds in this coordinate system are shown in Table 1. The action of the symmetry group corresponds to (Fig.... ..."
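
The coordinate change above is a three-point discrete Fourier transform of the cell variables. A short sketch (symbol names are illustrative, not taken from the paper) showing why these coordinates expose the invariant manifolds, e.g. full synchrony is exactly the set where the complex mode vanishes:

```python
import cmath

def symmetric_coordinates(p1, p2, p3):
    """D3 symmetric coordinates for three coupled cells:
    a real average and a complex mode with cube-root-of-unity weights."""
    avg = (p1 + p2 + p3) / 3.0
    w = cmath.exp(2j * cmath.pi / 3)      # w = e^{i2pi/3}, so w**2 = e^{i4pi/3}
    mode = p1 + w * p2 + w**2 * p3
    return avg, mode
```

Cyclically permuting the cells multiplies the complex mode by e^{i2π/3} while leaving the average fixed, which is how the Z3 rotation of D3 acts in these coordinates.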