### Citations

5196 | Probability and measure - Billingsley - 1979 |

2368 | An Introduction to Probability Theory and its Applications - Feller - 1966 |
Citation Context ...w that the sample autocovariance is a consistent estimator of the autocovariance, and asymptotically stable with tail index α/2. Stable laws and processes are comprehensively treated in, e.g., Feller [16], Samorodnitsky and Taqqu [29], and Meerschaert and Scheffler [25]. Let X̂^(i)_{i+k} = P_{H_{k,i}} X_{i+k} denote the one-step predictors, where H_{k,i} = sp{X_i, ..., X_{i+k−1}}, k ≥ 1, and P_{H_{k,i}} is the orthogonal proje... |
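The excerpt concerns the sample autocovariance as an estimator. A minimal sketch of the usual 1/N-normalized estimator (the function name `sample_autocov` is mine, not from the paper), checked on an MA(1) series whose lag-1 autocovariance is known:

```python
import numpy as np

def sample_autocov(x, lag):
    # Biased (1/N-normalized) sample autocovariance at the given lag;
    # this is the statistic whose limit theory the excerpt discusses.
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    return float(np.dot(xc[:n - lag], xc[lag:]) / n)

rng = np.random.default_rng(0)
eps = rng.standard_normal(100_000)
x = eps[1:] + 0.5 * eps[:-1]   # MA(1): true lag-1 autocovariance is 0.5
print(sample_autocov(x, 1))    # close to 0.5 for a sample this large
```

With light-tailed noise this estimator is asymptotically normal; the point of the cited works is that under heavy tails (infinite fourth moment) the limit is instead a stable law with index α/2.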

1233 | Time series: theory and methods - Brockwell, Davis - 2006 |

948 | Martingale limit theory and its application - Hall, Heyde - 1980 |
Citation Context ...n, provided that k = k(N) increases to ∞ with N, lim_{N→∞} Var{t_{N,k(N)}} = Σ, (37) where Σ is given in (16). Next we want to use the martingale central limit theorem (Theorem 3.2, p. 58 in Hall and Heyde [18]) to show that λ′t_{N,k} ⇒ N(0, λ′Σ_k λ) for a fixed k and any λ ∈ R^D. Consider the triangular array of summands X_N(j) = N^{−1/2} λ′w_{jk}, j = 0, ..., N − 1, N = 1, 2, .... For each fixed k, it is sufficien... |
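The excerpt applies the martingale central limit theorem to a triangular array. A simulation sketch of the phenomenon, with an illustrative martingale-difference construction of my own (not the paper's array w_{jk}): each summand has conditional mean 0 and conditional variance 1 given the past, so the normalized sums behave like standard normal draws.

```python
import numpy as np

rng = np.random.default_rng(1)
R, N = 2000, 500
sums = np.empty(R)
for r in range(R):
    eps = rng.standard_normal(N)
    s = np.cumsum(eps)
    # eps[j] is independent of sign(s[j-1]), so the differences d satisfy
    # E[d_j | past] = 0 and E[d_j^2 | past] = 1: a martingale difference array.
    d = eps[1:] * np.sign(s[:-1])
    sums[r] = d.sum() / np.sqrt(N - 1)

# Across replications the normalized sums look like N(0, 1) samples.
print(sums.mean(), sums.std())
```

The summands here are dependent (each depends on the whole past through the sign term), yet the CLT still applies because the martingale-difference structure is preserved.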

490 | Stable non-Gaussian random processes - Samorodnitsky, Taqqu - 1994 |
Citation Context ...ce is a consistent estimator of the autocovariance, and asymptotically stable with tail index α/2. Stable laws and processes are comprehensively treated in, e.g., Feller [16], Samorodnitsky and Taqqu [29], and Meerschaert and Scheffler [25]. Let X̂^(i)_{i+k} = P_{H_{k,i}} X_{i+k} denote the one-step predictors, where H_{k,i} = sp{X_i, ..., X_{i+k−1}}, k ≥ 1, and P_{H_{k,i}} is the orthogonal projection onto this space, which m... |

372 | Multiple Time Series - Hannan - 1970 |

162 | Consistent autoregressive spectral estimates - Berk - 1974 |
Citation Context ...the innovations algorithm) is slower than in the finite fourth moment case. Brockwell and Davis [13] discuss asymptotics of the innovations algorithm for stationary time series, using results of Berk [8] and Bhansali [10]. However, all of these results assume a finite fourth moment for the noise sequence. Hence our results seem to be new even in the stationary case when the period ν = 1. Since our te... |

143 | Limit Distributions for Sums of Independent Random Vectors: Heavy Tails in Theory and Practice - Meerschaert, Scheffler - 2001 |
Citation Context ...autocovariance, and asymptotically stable with tail index α/2. Stable laws and processes are comprehensively treated in, e.g., Feller [16], Samorodnitsky and Taqqu [29], and Meerschaert and Scheffler [25]. Let X̂^(i)_{i+k} = P_{H_{k,i}} X_{i+k} denote the one-step predictors, where H_{k,i} = sp{X_i, ..., X_{i+k−1}}, k ≥ 1, and P_{H_{k,i}} is the orthogonal projection onto this space, which minimizes the mean squared error v_{k,i}... |

80 | A Probability Path - Resnick - 1999 |

44 | Prediction of Multivariate Time Series by Autoregressive Model Fitting - Lewis, Reinsel - 1985 |
Citation Context ...result in the case where the noise sequence has finite fourth moments was obtained by Anderson and Meerschaert [4]. A similar result was obtained in the finite fourth moment case by Lewis and Reinsel [21] for vector autoregressive models; however, the prediction problem here is different. For example, suppose that (2) represents monthly data with ν = 12. For a periodically stationary model, the predic... |

42 | Sur l'extension du théorème limite du calcul des probabilités aux sommes de quantités dépendantes - Bernstein - 1927 |
Citation Context ...ces Σ_k. Then an application of the Cramér–Wold device [12, p. 48] yields t_{N,k} ⇒ N(0, Σ_k). To extend the central limit theorem to the case where k = k(N) → ∞ as N → ∞ we use a result due to Bernstein [9] that we refer to as Bernstein's Lemma, which is proved in Hannan [19, p. 242]. Let x_N be a sequence of vector-valued random variables with zero mean such that for every ε > 0, η > 0, δ > 0 there exist se... |

21 | Approaches to multivariate modeling of water resources time series - Salas, Tabios, et al. - 1985 |

18 | Recursive prediction and likelihood evaluation for periodic ARMA models - Lund, Basawa - 2000 |
Citation Context ...-stationary process. Writing X̂^(i)_{i+k} = Σ_{j=1}^{k} θ^(i)_{k,j} (X_{i+k−j} − X̂^(i)_{i+k−j}) (8) yields the one-step predictors in terms of the innovations X_{i+k−j} − X̂^(i)_{i+k−j}. Proposition 4.1 of Lund and Basawa [22] shows that if σ²_i > 0 for i = 0, ..., ν − 1, then for a causal PARMA(p, q) process the covariance matrix Γ_{k,i} is non-singular for every k ≥ 1 and each i. Anderson et al. [5] show that if EX_t = 0 and ... |
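The excerpt's equation (8) is the innovations representation of the one-step predictors. A sketch of the innovations algorithm in its stationary special case (as in Brockwell and Davis; the paper's periodic version additionally indexes everything by the season i, which this sketch omits), verified on an MA(1) model where the limits of the coefficients and mean squared errors are known:

```python
def innovations(gamma, n):
    # Innovations algorithm, stationary case. gamma[h] is the autocovariance
    # at lag h (taken as 0 beyond the list). Returns the coefficients
    # theta[m][j], j = 1..m, of the one-step predictor
    #   Xhat_{m+1} = sum_j theta[m][j] * (X_{m+1-j} - Xhat_{m+1-j})
    # and the one-step mean squared errors v[0..n].
    g = lambda h: gamma[h] if h < len(gamma) else 0.0
    v = [g(0)]
    theta = {}
    for m in range(1, n + 1):
        theta[m] = {}
        for k in range(m):                      # fills theta[m][m - k]
            s = g(m - k)
            for j in range(k):
                s -= theta[k][k - j] * theta[m][m - j] * v[j]
            theta[m][m - k] = s / v[k]
        v.append(g(0) - sum(theta[m][m - j] ** 2 * v[j] for j in range(m)))
    return theta, v

# MA(1) with coefficient 0.5 and unit noise variance:
# gamma(0) = 1.25, gamma(1) = 0.5, so theta[m][1] -> 0.5 and v[m] -> 1.
theta, v = innovations([1.25, 0.5], 50)
print(theta[50][1], v[50])
```

The recursion needs only the autocovariances, which is why the limit theory for the sample autocovariance drives the asymptotics of these estimates.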

17 | Periodic moving averages of random variables with regularly varying tails, The Annals of Statistics 24 - Anderson, Meerschaert - 1997 |
Citation Context ...quence has finite second moment but infinite fourth moment. This case is important in applications to river flows; see for example Anderson and Meerschaert [3]. In that case, Anderson and Meerschaert [2] proved that the sample autocovariances, the basis for the innovations algorithm estimates of the model parameters, are asymptotically stable. Surprisingly, the innovations estimates themselves turn o... |

17 | Periodic autoregressive-moving average (PARMA) modeling with applications to water resources - Vecchia - 1985 |

16 | Asymptotic results for periodic autoregressive moving-average processes - Anderson, Vecchia - 1993 |

16 | Climatological time series with periodic correlation - Lund, Hurd, et al. - 1995 |

15 | Parameter estimation for periodic ARMA models - Adams, Goodwin - 1995 |
Citation Context ...ψ_t(0) = 1 and Σ_{j=0}^{∞} |ψ_t(j)| < ∞ for all t. We will say that the i.i.d. noise sequence δ_t = σ_t^{−1} ε_t is RV(α) if P[|δ_t| > x] varies regularly with index −α and P[δ_t > x]/P[|δ_t| > x] → p for some p ∈ [0, 1]. The case where E|δ_t|⁴ < ∞ was treated in Anderson and Meerschaert [4]. In this paper, we assume that the noise sequence {δ_t} is RV(α) for some 2 < α < 4. This ... |
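The RV(α) condition in the excerpt can be illustrated numerically. A sketch of my own (names and thresholds are illustrative): symmetric Pareto noise with α = 3, which sits in the paper's regime 2 < α < 4 (finite variance, infinite fourth moment), with a Hill estimator recovering the tail index and an empirical check of the tail-balance limit p.

```python
import numpy as np

rng = np.random.default_rng(42)
alpha, n = 3.0, 100_000                   # tail index in (2, 4)
u = rng.uniform(size=n)
sign = rng.choice([-1.0, 1.0], size=n)    # symmetric tails, so p = 1/2
delta = sign * u ** (-1.0 / alpha)        # P[|delta| > x] = x**(-alpha), x >= 1

# Hill estimator of the tail index from the k largest |delta|
k = 2000
a = np.sort(np.abs(delta))
hill = 1.0 / np.mean(np.log(a[-k:] / a[-k - 1]))

# Empirical tail balance P[delta > x] / P[|delta| > x] at a high threshold
exceed = delta[np.abs(delta) > 10.0]
p_hat = np.mean(exceed > 0)
print(hill, p_hat)
```

A series driven by such noise has a finite autocovariance (second moments exist), but the fourth-moment conditions behind the classical normal limit theory fail, which is exactly the gap the cited papers address.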

14 | Linear prediction by autoregressive model fitting in the time domain - Bhansali - 1978 |
Citation Context ...lgorithm) is slower than in the finite fourth moment case. Brockwell and Davis [13] discuss asymptotics of the innovations algorithm for stationary time series, using results of Berk [8] and Bhansali [10]. However, all of these results assume a finite fourth moment for the noise sequence. Hence our results seem to be new even in the stationary case when the period ν = 1. Since our technical approach e... |

14 | Testing for periodic autocorrelations in seasonal time series data - Vecchia, Ballerini - 1991 |

11 | Simple Consistent Estimation of the Coefficients of a Linear Filter - Brockwell, Davis - 1988 |
Citation Context ...to be asymptotically normal, although the rate of convergence (in terms of the number of iterations of the innovations algorithm) is slower than in the finite fourth moment case. Brockwell and Davis [13] discuss asymptotics of the innovations algorithm for stationary time series, using results of Berk [8] and Bhansali [10]. However, all of these results assume a finite fourth moment for the noise seq... |

11 | On periodic and multiple autoregressions - Pagano - 1978 |

11 | Forecasting of multivariate periodic autoregressive moving-average processes - Ula - 1993 |

10 | Parameter estimation for periodically stationary time series - Anderson, Meerschaert - 2005 |
Citation Context ...models are developed by many authors including [1,2,4–7,20,22–24,26,28,30,31,33–41]. Anderson et al. [5] develop the innovations algorithm for periodic ARMA model parameters. Anderson and Meerschaert [4] develop the asymptotics necessary to determine which of these estimates are statistically different from zero, under the classical assumption that the noise sequence has finite fourth moment. In this... |

10 | Innovations algorithm for periodically stationary time series, Stochastic Process. - Anderson, Meerschaert, et al. - 1999 |
Citation Context ...ent to showing that N^{1/2}(θ^{〈i−ℓ−k〉}_{k,u} − θ^{〈i−k〉}_{k−ℓ,u}) → 0 (50) and N^{1/2}(θ̂^{〈i−ℓ−k〉}_{k,u} − θ̂^{〈i−k〉}_{k−ℓ,u}) →_P 0 (51) for ℓ = 1, ..., D − 1 and u = 1, ..., D. Using estimates from the proof of [5] Corollary 2.2.4 and condition (14) of Theorem 3.1, it is not hard to show that N^{1/2}(θ^{〈i−k〉}_{k,u} + ψ_i(u)) → 0 as N → ∞ for any u = 1, ..., k. This leads to N^{1/2}(θ^{〈i−ℓ−k〉}_{k,u} + ψ_{i−ℓ}(u)) → 0 (52) by ... |

8 | Computation and characterization of autocorrelations and partial autocorrelations in periodic ARMA models - Shao, Lund - 2004 |

8 | Empirical identification of multiple time series - Tjostheim, Paulsen - 1982 |

7 | Periodic stationarity conditions for periodic autoregressive moving average processes as eigenvalue problems - Ula, Smadi - 1997 |

5 | Identification of PARMA models and their application to the modeling of river flows - Tesfaye, Meerschaert, et al. - 2006 |

4 | Maximum likelihood estimation for periodic moving average models - Vecchia - 1985 |

3 | Identification of periodic moving average models - Ula, Smadi - 2003 |

3 | Aggregation and estimation of low-order periodic ARMA models - Vecchia, Obeysekera, et al. - 1983 |

2 | Modeling river flows with heavy tails, Water Resources Res - Anderson, Meerschaert - 1998 |
Citation Context ...those results to the case where the noise sequence has finite second moment but infinite fourth moment. This case is important in applications to river flows; see for example Anderson and Meerschaert [3]. In that case, Anderson and Meerschaert [2] proved that the sample autocovariances, the basis for the innovations algorithm estimates of the model parameters, are asymptotically stable. Surprisingly,... |

2 | Time series with periodic structure, Biometrika 54 - Jones, Brelsford - 1967 |

2 | Some results in periodic autoregression, Biometrika 66 - Troutman - 1979 |

1 | Large sample properties of parameter estimates from periodic ARMA models - Basawa, Lund |

1 | Modeling for periodically correlated time series - Lund, Basawa - 1999 |