### Table 43: Overview of axioms of concurrent process algebras.

1996

"... In PAGE 131: ... We have the following legend: one symbol indicates that the axiom is present in the theory, another that the axiom is not present but can be derived (for closed terms) from the other axioms in the theory, a third that the axiom does not hold in the theory, and a fourth that the axiom is meaningless in the theory, as there is a signature conflict. In Table 43 on page 135 we give an overview of the axioms pertaining to the merge operators of the (concurrent) process algebras treated in Section 3. The legend is the same as for the previous table.... ..."

Cited by 7

### Table 5: The ACP -axioms of combinatory process algebra.

"... In PAGE 7: ... 2 The second group of axioms consists of the ACP -axioms introduced in [BK85], extending the ACP-axioms of [BK84]. The schemata, from which the axioms can be obtained by a type assignment to the variables and operators, are listed in Table 5. They differ from all the other schemata in so far as only restricted type assignment is permitted.... In PAGE 24: ... It should be clear that the axioms of argumentwise evaluation (Table 4) are satisfied under this interpretation. Moreover, as BP is a model of the ACP -axioms (Table 5), the SEI-axioms (Table 6), the -axioms (Table 7) and the |F-axioms (Table 8) restricted to type P, it follows by induction on type formation that the higher-order typed axioms hold in M. So, M is a model for combinatory... ..."

### Table 2: Summary of Model Algebra. A and B are individual terms and M is a set of terms.

2001

Cited by 4

### Table 3: The Algebra of Binary Flownomials

2001

"... In PAGE 28: ... It is a pleasure to thank Ch. Facchi for help in preparing the manuscript and R. Grosu for stimulating discussions on the algebra of stream processing functions. 7 Appendix: The Axioms. Table 3 lists the groups of axioms we started with. The adapted axioms for dataflow networks are given in the previous two tables.... In PAGE 30: ... i.e., of terms written with ++; ; I; X and some constants in >; ?; _; ^), y : c → d is in E and f : a + c → b + c, g : a + d → b + d are arbitrary. Table 3: The Algebra of Binary Flownomials (continued)... ..."

### Table 1: Syntactic translation of the modeling process terms.

2004

"... In PAGE 7: ...actic expansion into its associated scope operator process term (see Section 2.3.2), which itself may consist of other modeling process terms. In Table 1, the modeling process terms are defined in terms of core process terms and SOS process terms by means of the translation function T 2.... ..."

### Table 1: Deterministic sampling using aBDD (static and dynamic)

1999

"... In PAGE 5: ... Experiment 1 (Table 1 and Figure 2): First, we use the order computed by sampling to build the BDD statically. Except for slightly inferior orderings on c499 and c1355 (both circuits are functionally equivalent), we find that our methods always produce better variable orderings than those produced by DFS-search-based static techniques (Table 1). For many industrial examples we find that DFS-MIN cannot even process the circuits.... In PAGE 5: ... It is easy to see that window based sampling gives much better results than cube based methods. Interestingly, for EX3 and EX6, aBDD based methods can create a small BDD for the output function, but cube based sampling fails for some of the runs! Experiment 2 (Table 1 and Figure 3) shows the utility of window based sampling in a dynamic variable ordering scheme. That is, we show how dynamic reordering techniques can be significantly improved if they are supplied with an initial variable ordering generated using a window based sampling technique.... In PAGE 5: ... In Table 1, we find that we can produce far smaller graphs than the traditional dynamic reordering methods (sift, sift-convergence). Also, for most of the large circuits we take less time.... ..."

Cited by 1

### Table 4: Process Algebra Formalisms

1997

Cited by 2

### Table 1 In section 4 we introduce two probabilistic models and describe both of the above learning situations. The σ-algebra of the first model is generated by the events T and P, while that of the second model by T alone, regarding P and N as random variables. We will use both notations of Table 1.

"... In PAGE 4: ... The assumptions of the (binary independence) probabilistic retrieval model are the following: a) the terms are stochastically independent, b) the term indexing in a document is binary (restricted to 0 and 1, irrespective of the frequency with which a term occurs in a document), c) the term distribution in the set of relevant retrieved items is the same as the distribution in the complete set of relevant items, d) all non-retrieved items can be treated as non-relevant, and e) documents are in one-to-one correspondence with their syntactical representation. Under the previous assumptions, from formula (1) we obtain the following measure of the discrimination power of a term [17, 22] (RSJ function):

$$w(t) = \log \frac{r_t\,(N - R - n_t + r_t)}{(R - r_t)(n_t - r_t)} \qquad (2)$$

$N$ is the number of documents in the collection, $n_t$ of which include the term $t$; $R$ is the total number of relevant retrieved documents, and $r_t$ is the number of these in which the term $t$ occurs (see Table 1(b)). There are also the following three modified RSJ functions [17, 16] used as retrieval rules in the probabilistic model.... In PAGE 7: ... i.e., the constituents $T \cap P$, $T \cap N$, $\bar{T} \cap N$, $\bar{T} \cap P$, corresponding to the cells of Table 1, and the value of $\mathrm{Prob}$ on the generic constituent $C$ is given by $\mathrm{Prob}(C) = |C|/|D|$. In order to select the best theory for the observed data we use, as suggested in [10], the Bayesian decision principle, according to which one should strive to maximize expected utility.... In PAGE 8: ... Table 1(a), we get

$$EU_H(T) = \frac{tp}{tp+fp} - \frac{1}{1 + \dfrac{-tp\log tp - fn\log fn + |P|\log|D|}{-fp\log fp - tn\log tn + |N|\log|D|}}$$

By using the content $K$, (6) becomes

$$EU_K(T) = \mathrm{Prob}(P|T) - \frac{1}{1 + \dfrac{\log \mathrm{Prob}(P)}{\log \mathrm{Prob}(N)}}$$

that is

$$EU_K(T) = \frac{tp}{tp+fp} - \frac{1}{1 + \dfrac{\log|P| - \log|D|}{\log|N| - \log|D|}}$$

Instead, if we use the content, we obtain

$$EU_C(T) = \mathrm{Prob}(P|T) - \mathrm{Prob}(P)$$

that is

$$EU_C(T) = \frac{tp}{tp+fp} - \frac{|P|}{|D|}$$

The last formula has its grounds in the standard probabilistic analysis of causation, which has been suggested by several authors [20, 8]. The basic idea is that a cause raises the probability of its effects.... In PAGE 8: ... Indeed, from $EU_{u|T}$ we have $W(T)$ as follows:

$$W(T) = \mathrm{Prob}(P|T)\,\mathrm{Prob}(N|\bar{T}) - \mathrm{Prob}(N|T)\,\mathrm{Prob}(P|\bar{T})$$

The choice of $T$ produces a benefit whenever $W(T) > 0$. That can be equivalently expressed by $W_1(T) > 0$, where

$$W_1(T) = \log \frac{\mathrm{Prob}(P|T)\,\mathrm{Prob}(N|\bar{T})}{\mathrm{Prob}(N|T)\,\mathrm{Prob}(P|\bar{T})}$$

Writing the conditional probabilities with the notation of Table 1(b), we obtain exactly the RSJ function (2). Instead, if we use the conditional entropy $H(a|b) = -\sum_i \mathrm{Prob}(C_i|b)\log \mathrm{Prob}(C_i|b)$ and the conditional measure $K(a|b) = -\log \mathrm{Prob}(a|b)$ in $EU_{u|T}$ and $EU_{u|\bar{T}}$, we obtain the following four ranking functions:

$$EU_{H|T}(T) = \frac{tp}{tp+fp} - \frac{1}{1 + \dfrac{-tp\log tp + tp\log(tp+fp)}{-fp\log fp + fp\log(tp+fp)}}$$

$$EU_{H|\bar{T}}(T) = \frac{tp}{tp+fp} - \frac{1}{1 + \dfrac{-fn\log fn + fn\log(tn+fn)}{-tn\log tn + tn\log(tn+fn)}}$$

and $EU_{K|T}(T) = \frac{tp}{tp+fp} - \frac{1}{1 + (-\log tp + \log(tp+fp))\,/\,\dots}$... ..."
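The RSJ weight in formula (2) is easy to check numerically. A minimal Python sketch, using hypothetical counts for the Table 1(b) quantities ($N$, $R$, $n_t$, $r_t$ are from the snippet; the concrete numbers below are invented for illustration):

```python
import math

def rsj_weight(N: int, R: int, n_t: int, r_t: int) -> float:
    """RSJ term weight, formula (2) in the snippet:
    w(t) = log[ r_t (N - R - n_t + r_t) / ((R - r_t)(n_t - r_t)) ].

    N   -- documents in the collection
    n_t -- documents containing term t
    R   -- relevant retrieved documents
    r_t -- relevant retrieved documents containing t
    (assumes r_t < R and r_t < n_t, so the ratio is well defined)
    """
    return math.log(r_t * (N - R - n_t + r_t) / ((R - r_t) * (n_t - r_t)))

# Hypothetical example: 1000 documents, 100 contain t,
# 50 relevant retrieved, 20 of those contain t.
w = rsj_weight(1000, 50, 100, 20)  # log(20*870 / (30*80)) = log(7.25) ≈ 1.98
```

A positive weight means the term occurs relatively more often among the relevant documents than its overall frequency would predict, matching the odds-ratio form of $W_1(T)$ above.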