
Article Information

  • Title: Rejoinder of “High-dimensional autocovariance matrices and optimal linear prediction”
  • Authors: Timothy L. McMurry; Dimitris N. Politis
  • Journal: Electronic Journal of Statistics
  • Print ISSN: 1935-7524
  • Year: 2015
  • Volume: 9
  • Issue: 1
  • Pages: 811-822
  • DOI: 10.1214/15-EJS1000REJ
  • Language: English
  • Publisher: Institute of Mathematical Statistics
  • Abstract: We would like to sincerely thank all discussants for their kind remarks and insightful comments. To start with, we wholeheartedly welcome the proposal of Rob Hyndman for a “better acf” plot based on our vector estimator $\hat{\gamma}^*(n)$ from Section 3.2. As mentioned, the sample autocovariance is not a good estimate of the vector $\gamma(n)$, and this is especially apparent in the wild excursions it takes at higher lags; see the left panel of Figure 1 of Hyndman’s piece. Note that these wild (and potentially confusing) excursions are the norm rather than the exception; they are partly explainable by two facts: (a) the identity $\sum_{|k|<n} \breve{\gamma}_k = 0$ implies that $\breve{\gamma}_k$ must misbehave at higher lags to counteract its good behavior at small lags; and (b) the $\breve{\gamma}_k$ are correlated, and therefore their excursions appear smooth (and may be mistaken for structure). The only saving grace of the current acf plot in R is that it has a lag.max default of $10\log_{10} n$, so the ugliness occurring at higher lags is masked. Interestingly, showing just the lags up to $10\log_{10} n$ is tantamount to employing a rectangular lag-window (which is one of the flat-top kernels, albeit not the best) with a logarithmic choice for $l$ that is indeed optimal under the exponential decay of $\gamma_k$ typical of ARMA models.
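For illustration, here is a minimal base-R sketch, not the authors' proposed estimator $\hat{\gamma}^*(n)$, that only uses acf() and arima.sim() on a simulated AR(1) series. It shows how R's default lag.max of $10\log_{10} n$ hides the erratic high-lag behavior of the sample autocovariances, and numerically checks the sum-to-zero identity mentioned above.

    ## Minimal R sketch (assumes a simulated AR(1) series; uses only base R,
    ## not the authors' proposed "better acf" vector estimator).
    set.seed(1)
    n <- 500
    x <- arima.sim(model = list(ar = 0.7), n = n)  # series with geometrically decaying gamma_k

    op <- par(mfrow = c(1, 2))
    acf(x, main = "Default lag.max = 10*log10(n)")          # the truncated view R shows by default
    acf(x, lag.max = n - 1, main = "All lags up to n - 1")  # wild excursions at higher lags become visible
    par(op)

    ## Sum-to-zero identity of the mean-centered sample autocovariances:
    ## gamma_0 + 2 * sum_{k=1}^{n-1} gamma_k = 0 (up to floating-point error).
    g <- drop(acf(x, type = "covariance", lag.max = n - 1, plot = FALSE)$acf)
    g[1] + 2 * sum(g[-1])  # approximately 0

The numerical check makes the argument in (a) concrete: since the estimated autocovariances must sum to zero, accurate (large, positive) values at small lags have to be offset by spurious negative values at higher lags.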