### Table 3. Dependency on the model: Markov orders 0, 1 and 2 (columns: word, Markov 0, Markov 1, Markov 2)

### Table 3: Results of Markov Models

1994

"... In PAGE 3: ... So, R_i depends (indirectly) on the joint context of C_i and C_{i+1}, thus allowing syntactic anomalies to be detected. [Figure 2: Markov Model of Repairs — a diagram over the category sequence C_i, C_{i+1} with emission terms P(w_i | C_i) and P(w_{i+1} | C_{i+1}) and transition terms conditioned on C_i.] Table 3 (Section 6.4) gives results for this simple model running on our training corpus. ... In PAGE 5: ... 6.4. Results: Table 3 summarizes the results of incorporating additional clues into the Markov model. The first column gives the results without any clues, the second with fragments, the third with editing terms, the fourth with word matches, and the fifth with all of these clues incorporated. ..."

Cited by 10
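The factorization described in the snippet above (word emissions conditioned on categories, and category transitions conditioned on the previous category) can be sketched as follows. The probability tables and tag names here are hypothetical placeholders for illustration, not values from the paper:

```python
# Sketch of a factorization like P(w_i | C_i) * P(C_{i+1} | C_i).
# All probabilities and tags below are made-up placeholders.

emission = {                      # P(word | category)
    ("the", "DET"): 0.6,
    ("dog", "NOUN"): 0.3,
}
transition = {                    # P(next category | current category)
    ("DET", "NOUN"): 0.7,
}

def sequence_probability(words, cats):
    """Joint probability of a tagged word sequence under the factorization."""
    p = 1.0
    for i, (w, c) in enumerate(zip(words, cats)):
        p *= emission.get((w, c), 0.0)                  # P(w_i | C_i)
        if i + 1 < len(cats):
            p *= transition.get((c, cats[i + 1]), 0.0)  # P(C_{i+1} | C_i)
    return p

print(sequence_probability(["the", "dog"], ["DET", "NOUN"]))  # 0.6 * 0.7 * 0.3 = 0.126
```

A fuller model in the paper's spirit would add repair-state variables conditioning the transitions; this sketch shows only the basic chain structure.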

### Table 6 Markov Switching Model

"... In PAGE 22: ... If we included T > 5000 in the design, we would eventually see the rejection frequency decrease. In Table 6 we tabulate the empirical sizes of nominal 5% tests of d = 1. Although the persistence of the Markov switching model is increasing in p_00 and p_11, it turns out that it is nevertheless very easy to reject d = 1 in this particular experimental design. ..."
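A two-state Markov switching process with persistence parameters p_00 and p_11, as referenced in the snippet, can be simulated with a short sketch. The parameter values below are illustrative, not those of the cited experiment:

```python
import random

def simulate_regimes(T, p00, p11, seed=0):
    """Simulate T steps of a two-state Markov switching chain.

    p00 = P(stay in state 0 | currently in state 0)
    p11 = P(stay in state 1 | currently in state 1)
    """
    rng = random.Random(seed)
    state, path = 0, []
    for _ in range(T):
        path.append(state)
        stay = p00 if state == 0 else p11
        if rng.random() >= stay:    # leave the current regime
            state = 1 - state
    return path

# Higher p00/p11 means more persistent regimes (longer runs in each state).
path = simulate_regimes(500, p00=0.95, p11=0.95)
print(sum(path) / len(path))        # fraction of time spent in state 1
```

Test statistics for d = 1 would then be computed on series generated from such regime paths; that step is omitted here.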

### Table 5 Markov Switching Model

"... In PAGE 21: ... .999} and ... In Table 5 we show the empirical sizes for T ∈ {100, 200, 300, 400, 500, 1000, 1500, ...}. ..."

### Table 4: 3rd order Markov Model

1996

"... In PAGE 17: ... Then q^k_x(a|b) estimates the state transition probability from state b to state (a, b_1, ..., b_{k-1}). The estimates of the state transition probabilities of the k-th order Markov chain are q^k_x(a|b) = l^k_x(a, b) / l^k_x(b) if l^k_x(b) > 0, and 0 otherwise (6). Table 4 shows the estimated state transition probabilities of a 3rd order Markov model for the trace of receiver alps. Since it is a 3rd order model, the next value depends on the 3 previous values and there are 8 different states, one state for each of the possible relevant histories. ... In PAGE 19: ... Had any of the traces exhibited temporal dependencies of orders greater than 6, then, with very high probability, its conditional entropy would have continued to decrease beyond k = 3 and \hat{k}(·) would have estimated its order to be the maximum permitted value of K_0. Looking at the state transition matrix in Table 4, we observed that the probability of a 1 occurring in state 111 is 0.846. ..."

Cited by 204
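The count-ratio estimator in equation (6) — the probability of symbol a following history b, estimated as the ratio of occurrence counts — can be sketched for a binary trace as follows. The trace below is made up for illustration and is not the alps receiver trace from the paper:

```python
from collections import Counter

def kth_order_transitions(trace, k):
    """Estimate q^k_x(a|b) = l^k_x(a, b) / l^k_x(b): the probability of
    symbol a following the length-k history b, from raw counts in the trace."""
    hist_counts = Counter()       # l^k_x(b): occurrences of each history
    pair_counts = Counter()       # l^k_x(a, b): history followed by symbol a
    for i in range(len(trace) - k):
        b = tuple(trace[i:i + k])
        a = trace[i + k]
        hist_counts[b] += 1
        pair_counts[(a, b)] += 1
    # Histories with zero count simply do not appear (the "0 otherwise" case).
    return {(a, b): n / hist_counts[b] for (a, b), n in pair_counts.items()}

# 3rd order model over a binary trace: 2^3 = 8 possible histories (states).
trace = [1, 1, 1, 1, 0, 1, 1, 1, 1, 0, 1, 1, 1, 1, 1]
q = kth_order_transitions(trace, k=3)
print(q.get((1, (1, 1, 1)), 0.0))  # P(next = 1 | history 111)
```

Estimating the order itself (the \hat{k}(·) in the snippet) would additionally require computing the conditional entropy of these estimates as k grows; that step is not shown here.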

### Table 2. Fourth order Markov model statistics.

2001

"... In PAGE 6: ... We could have chosen the model order to be larger than 4, but we did not want to significantly increase the complexity of the Markov model. Table 2 shows the probabilities of the trace being in each state ... ..."

Cited by 47