### Table 3: A GapL algorithm for the determinant over non-negative integers

"... In PAGE 22: ...) Essentially, HA gives us a uniform polynomial size, polynomial width branching program, corresponding precisely to GapL. Table3 lists the code for an NL machine computing, through its gap 6.2-10 function, the determinant of a matrix A with non-negative integral entries.... ..."

### Table 1: The algorithms tested in this article. These include a number of different algorithms for finding matrix factorisations under non-negativity constraints and a neural network algorithm for performing competitive learning through dendritic inhibition.

"... In PAGE 4: ...3. 3 Results In each of the experiments reported below, all the algorithms listed in Table1 were applied to learning the com- ponents in a set of p training images. The average number of components that were correctly identified over the course of 25 trials was recorded.... ..."

Cited by 3

### Table 1: Data reconstruction errors under the L2 norm. cICA = contextual ICA, NMF = Non-negative matrix factorisation with α = 0, NMFe = NMF with exponential prior having α = 0.1

"... In PAGE 6: ...Table 1: Data reconstruction errors under the L2 norm. cICA = contextual ICA, NMF = Non-negative matrix factorisation with fi=0, NMFe = NMF with exponential prior having fi = 0:1 Table1 shows the data reconstruction results across... ..."

### Table 1: Summary of algorithms for Weighted Non-Negative Matrix Factorisation under Euclidean Distance (ED) and KL Divergence (KLD)
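Several of the entries above concern NMF-style algorithms. As background, here is a hedged sketch of the classic Lee-Seung multiplicative updates for the Euclidean-distance (ED) objective ||V − WH||²; the multiplicative (ratio) form of the update is what keeps every entry of W and H non-negative. Function name, initialisation, and iteration count are illustrative choices, not taken from any of the papers listed:

```python
import numpy as np

def nmf_euclidean(V, r, n_iter=500, eps=1e-9, seed=0):
    """Lee-Seung multiplicative updates minimising ||V - W @ H||_F^2.

    Each update multiplies the current factor elementwise by a non-negative
    ratio, so W and H stay non-negative throughout; `eps` guards against
    division by zero. All defaults here are illustrative assumptions.
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + 0.1   # strictly positive initialisation
    H = rng.random((r, n)) + 0.1
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# A matrix with non-negative rank 2 is reconstructed almost exactly.
V = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [1., 1., 1.]])
W, H = nmf_euclidean(V, r=2)
err = np.linalg.norm(V - W @ H)
```

The KL-divergence (KLD) variant mentioned in the caption uses the same multiplicative scheme with different numerator/denominator terms; weighted variants insert a weight matrix into both.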

### Table 1: The algorithms tested in this article. These include a number of different algorithms for finding matrix factorisations under non-negativity constraints and a neural network algorithm for performing competitive learning through dendritic inhibition.

"... In PAGE 6: ...ogether with a concrete example of its function in section 3.3. 3. Results In each of the experiments reported below, all the algorithms listed in Table1 were applied to learning the components in a set of p training images. The average number of components that were correctly identified over the course of 25 trials was recorded.... ..."

Cited by 3

### Table 2.2. Random waodag summary. ... were randomly instantiated between two nodes. Finally, hypothesis nodes are identified and are arbitrarily assigned some non-negative cost. Table 2.2 summarizes the set of randomly generated waodags (weighted AND/OR DAGs) for this experiment. Performing a least-squares exponential fit gives us e^(−5.49 + 0.0614x). Consider the logarithmic plot of our linear constraint satisfaction approach in Figure 2.3. Again, we can clearly see that our linear constraint satisfaction approach actually exhibits an expected subexponential growth rate. By further attempting to fit our data to ax^b, we get 0.0079188x^2.0308 as our growth curve. Again, the error of the fit actually improved by roughly 2300%.
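The exponential fit e^(a + bx) and the power-law fit ax^b reported above are both ordinary least squares after a log transform. A minimal, self-contained sketch, using synthetic data in place of the experiment's measured values (the coefficients below are only the ones quoted in the snippet, reused as ground truth for the synthetic data):

```python
import math

def linear_fit(xs, ys):
    """Closed-form ordinary least squares for y = a + b*x."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

def exp_fit(xs, ys):
    """Fit y = e^(a + b*x) by linear regression on ln(y)."""
    return linear_fit(xs, [math.log(y) for y in ys])

def power_fit(xs, ys):
    """Fit y = c * x**b by linear regression of ln(y) on ln(x)."""
    a, b = linear_fit([math.log(x) for x in xs],
                      [math.log(y) for y in ys])
    return math.exp(a), b

# Synthetic data standing in for the experiment's measured runtimes.
xs = [10, 20, 40, 80, 160]
c, b = power_fit(xs, [0.0079188 * x ** 2.0308 for x in xs])
a2, b2 = exp_fit(xs, [math.exp(-5.49 + 0.0614 * x) for x in xs])
```

On noiseless synthetic data both fits recover their generating coefficients; on real measurements the comparison of residuals between the exponential and power-law forms is what supports the subexponential-growth claim.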

### Table 6.4 for the synthesis of new traces. A realization of such a trace is demonstrated in Figure 6.11 below. Observe that the trace is non-negative, as it was designed to be. This is a major advantage of the multifractal wavelet model.
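The non-negativity guarantee mentioned here comes from how the multifractal wavelet model ties each Haar wavelet coefficient to its scaling coefficient: W = A·U with |A| ≤ 1, so both Haar children (U ± W)/√2 stay non-negative whenever U is. A minimal sketch under that assumption (uniform multipliers are a simplification; the original model draws A from a symmetric beta distribution):

```python
import math
import random

def mwm_synthesis(levels, u0=1.0, seed=1):
    """Synthesise a non-negative trace with a Haar multifractal wavelet model.

    At each scale the wavelet coefficient is W = A * U for a random
    multiplier A in [-1, 1], so the two Haar children (U + W)/sqrt(2) and
    (U - W)/sqrt(2) are both non-negative whenever U is.
    """
    rng = random.Random(seed)
    u = [u0]                      # coarsest-scale scaling coefficient
    for _ in range(levels):
        nxt = []
        for U in u:
            W = rng.uniform(-1.0, 1.0) * U   # |A| <= 1 enforces W = A*U <= U
            nxt.append((U + W) / math.sqrt(2))
            nxt.append((U - W) / math.sqrt(2))
        u = nxt
    return u

trace = mwm_synthesis(8)          # 2**8 = 256 non-negative samples
```

By construction every sample of `trace` is non-negative, which is exactly the property the caption highlights; the burstiness of the trace is controlled by the distribution of the multipliers A.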

2003

### Table 1. A summary of certain non-negative integer parameters

"... In PAGE 24: ... The next lemma involves four of the seven parameters 1{ 5, s, and t. In Table1 , which appears below, these seven parameters are summarized. 11.... In PAGE 26: ...Table 1. A summary of certain non-negative integer parameters In Table1 , for easy reference, we have summarized information about the seven parameters 1{ 5, s, and t each of which must be a non-negative integer. Four of these parameters, t, 1, 4, and 5, change their values according to the case we are in.... ..."

### Table 1. A summary of certain non-negative integer parameters

"... In PAGE 26: ... The next lemma involves four of the seven parameters 1{ 5, s, and t. In Table1 , which appears below, these seven parameters are summarized. 12.... In PAGE 27: ... = 1. By Lemmas 7.1, 7.2, and 7.3, in every case, J2 2 G;(H0 2). In Table1 , for easy reference, we have summarized information about the the seven parameters 1{ 5, s, and t each of which must be a non-negative integer. Four of these parameters, t, 1, 4, and 5, change their values according to the case we are in.... ..."