### Table 2: Shapes of complexity graphs

1993

"... In PAGE 54: ... Therefore, it is interesting to investigate in which of the cases not only the worst-case complexity, but also the average-case complexity scales linearly with the complexity of the domain. Table 2 shows which graphs of the average-case complexity deviate from the corresponding graphs of the worst-case complexity. The first entry is always the shape of the graph for the worst-case complexity (as stated above, i.... ..."

Cited by 39

### Table 3: The effect of sparsification

2000

"... In PAGE 5: ... An efficient direct linear solver based on Cholesky factors was used in all the experiments. The extracted power grids of four high performance general purpose/DSP microprocessor chips were used to benchmark the performance of macromodeling (Tables 1 and 2) and sparsification (Table 3) techniques. Chips 1, 2 and 4 are DSP and communication chips whose power grids are implemented in 3 layers of metal.... ..."

Cited by 45

### Table 4.3: Comparison between graph pruning and propositional control of category I control rules (columns: problem, len., and graph vs. propositional counts, before and after simplification)

2000

Cited by 19

### Table 1 Average shape dissimilarity value for individual simplified lines generated from different line simplification algorithms

2005

"... In PAGE 11: ... higher similarity (Fig. 10b) to lower similarity (Fig. 10d) relative to the original coastline (Fig. 10a). The average shape dissimilarity value for each simplified line is tabulated in Table 1. This table also gives a similar result to Fig.... ..."

### Table 1: Runtime results comparing state-of-the-art CNF-based BMC with a tuned BMC implementation based on AIG reasoning, SAT sweeping, dynamic simplification, and simplification through induction.

"... In PAGE 7: ... We compared such an implementation with a state-of-the-art BMC implementation that is based on a plain CNF translation of the unfolded formulas [1]. Both implementations utilize the same [Figure 4: Simplification of the transition relation for the industrial benchmarks used in Table 1. At each time frame the size of the transition relation in terms of AND vertices is compared with the... (axes: relative reduction of vertices compared to first frame in %, vs. time frame; series d1–d21)] ... In PAGE 7: ... core SAT solver [4] which makes them to some degree comparable. Table 1 provides an overview of the results on the set of industrial property checking benchmarks. The table lists the number of state variables and the lengths of the shortest counter-example in columns 2 and 3.... ..."

### Table 1: Sparsification in the standard basis.

2000

"... In PAGE 5: ... A common approach to reducing the density of coupling in the substrate conductance matrix is simply to drop entries that, in the normal basis, are small. Table 1 shows a sparsity ratio and error obtained by thresholding without a change of basis, demonstrating that this more obvious approach can be quite ineffective. However, when the multiscale basis is employed, much better results can be obtained.... ..."

Cited by 9
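The thresholding approach the snippet describes — dropping matrix entries that are small in the working basis, then measuring the resulting sparsity ratio and approximation error — can be sketched in a few lines. This is a minimal numpy illustration on a toy symmetric matrix, not the paper's multiscale-basis method; the function names, the threshold value, and the error metric (relative Frobenius norm) are illustrative assumptions.

```python
import numpy as np

def sparsify_by_threshold(G, tau):
    """Zero out entries with |entry| < tau; keep the diagonal intact.

    Note: tau and the diagonal-preservation rule are illustrative choices,
    not taken from the cited paper.
    """
    S = np.where(np.abs(G) >= tau, G, 0.0)
    np.fill_diagonal(S, np.diag(G))  # self-conductances are never dropped
    return S

def sparsity_ratio(S):
    """Fraction of entries that are exactly zero after thresholding."""
    return np.count_nonzero(S == 0) / S.size

# Toy symmetric "conductance" matrix: strong diagonal, weak coupling.
rng = np.random.default_rng(0)
A = rng.normal(scale=0.01, size=(6, 6))
G = (A + A.T) / 2 + np.eye(6)

S = sparsify_by_threshold(G, tau=0.02)
err = np.linalg.norm(G - S) / np.linalg.norm(G)  # relative Frobenius error
```

On a diagonally dominant toy matrix like this, thresholding zeroes most of the weak off-diagonal coupling at small relative error; the snippet's point is that on real substrate conductance matrices this naive approach performs much worse than thresholding in a multiscale basis.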

### TABLE I: RESULTS OF NUMERICAL SPARSIFICATION

2003

Cited by 9

### Table 5. Experiments with sparsification on top of Spanning Tree, and Spanning Tree on different graphs and sequences of 500 updates. For each algorithm the left column is the preprocessing and the right column is the processing time in seconds. Each data set is the average of ten different samples.

1996

Cited by 24