### Table 6. Detection of Invariant Violations

2001

"... In PAGE 17: ... Model Invariant Elevator 2(:opened _ stopped) POTS :3(P 1 @s 1 ^ P 2 @s 2 ^ P 3 @s 3 ^ P 4 @s 4 ) The search for the violation is performed with H :i as heuristic estimate, where i is the system invariant. Table6 depicts the results of experiments with two models: an Elevator model, and the model of a Public Old Telephon System (POTS). The latter is not scalable, and the former has been con gurated with 3 oors.... ..."

Cited by 29

### Table 3: Overview of color spaces and their invariance properties, based on Gevers [13]. + denotes invariance and − denotes sensitivity of the color space to the imaging condition.

2007

"... In PAGE 21: ...2 Color spaces and invariance In this section we will discuss various color spaces which can be used to represent a color and their invariance properties. Table3 gives an overview of the invariance properties of the color spaces based on Gevers [13]. The listed properties will not be repeated in the text.... ..."

### Table 16: Semantics of an invariant rule

"... In PAGE 21: ...Action [[Rci]] is rule Rci semantics; it says that the rule is represented by the set of all actions N ci(vci) whose parameters in v respect local compu- tation function f and predicate expr and whose mf parameter corresponds to the address of the parent space of the space having address ms. Table16 shows the semantics of a generic invariant rule Rinv. Action N inv(ms; mf ; v) is enabled if ms is the address of a space containing rule Rinv, if mf is the address of the parent space of the space having address ms, if values in v respect local computation function f and predicate expr and if the space having address ms contains the tuples to be read.... ..."

### Table 1. Variance of the clusters in the r–b space and the afii9825–afii9826 (invariant) space.

2000

"... In PAGE 6: ... While it is hard to discriminate the clusters in the r}b space due to the illuminant change, the chromatic invariants in the afii9825}afii9826 space provide tight clusters even under a wide varietyof illuminant colors. Table1 shows the variance of the each cluster. The leftmost column represents the label of the Macbeth color checker and the next two columns show the vari- ance of the chromaticityvalues in the r}b space and chromatic invariants in the afii9825}afii9826 space, respectively.... ..."

### Table 1: Percent of trials reconstructed correctly for the Jukes-Cantor model over the entire parameter space and the Felsenstein zone for the respective metrics.

2008

"... In PAGE 7: ... See Figure 3 for an example of the di erence and be warned that comparison between simulations studies done in di erent ways is di cult. Table1 shows the results of 100 simulations at each of the 1444 parameter values for various sequence lengths using the JC69 model. It gives the percent correct over all 144,400 trials for ve di erent methods: invariants with l1, l2, and A-norms and neighbor joining (using Jukes-Cantor distances and allowing in nite distances).... In PAGE 8: ... When trained on the Felsenstein zone, the learned metric can perform even better. Table1 shows the result of training a metric on this zone. Notice that the A-norm is now quite a bit better than neighbor joining, even though the l1 and l2 norms are terrible.... ..."

### Table 1. Requirements of a metric space.

"... In PAGE 4: ... Let us continue by examining the usability of the al- gorithm compared to the intuitive algorithm. First of all the distance calculated using the intuitive algorithm does not fulfil three of the four requirements set for a metric space (see the requirements in Table1 ). The first condi- tion is more dependent on the point-to-point distance mea- sure and holds whenever the Lp norm is applied.... ..."

### Table 2: Lower bounds for metric space problems

### Table 7. Requirements of a metric space.

2006

### Table 1. The dimensions of the spaces of type n invariants modulo type (n − 1) invariants of the pure braid groups, P_k (included for comparison).

### Table 1: Percent of trials reconstructed correctly for the Jukes-Cantor model over the entire parameter space and the restricted zone (with short internal edges) for the respective metrics. The l1 and l2 norms are the standard norms on the invariants; the A-norm is our learned norm. The column NJ refers to neighbor joining.


"... In PAGE 6: ... See Tables 1 and 2 and Figure 2 for details. The metric trained on trees with short internal edges is quite a bit better than neighbor joining on this region, even though the l1 and l2 norms are barely better than random guessing ( Table1 ). However, this learned norm is slightly worse on the whole parameter space than the metric trained on the whole space.... ..."