
I am currently working through the paper Improved Inference and Estimation in Regression With Overlapping Observations [1], which presents an elegant way to do inference on $\beta$ in a linear regression in the presence of overlapping observations: $$Ar = X\beta + u,$$ where $r \in \mathbb{R}^{T}$ is a vector of one-period log returns, $A \in \mathbb{R}^{(T-k+1) \times T}$ is a transformation matrix with ones on the main diagonal and on the first $k-1$ right off-diagonals (so that $Ar$ is the vector of overlapping $k$-period returns), and $X \in \mathbb{R}^{(T-k+1)\times l}$ is the matrix of explanatory variables whose first column consists of ones.
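
To make the setup concrete, here is a minimal numerical sketch of the objects defined above; the sizes, the random data and the single predictor `x1` are made up for illustration and are not taken from [1]:

```python
import numpy as np

# Sketch of the setup: T one-period log returns r, and the (T-k+1) x T matrix A
# with ones on the main diagonal and the first k-1 right off-diagonals, so that
# (Ar)_t = r_t + ... + r_{t+k-1} is the overlapping k-period return starting at t.
T, k = 200, 12                       # illustrative sizes
rng = np.random.default_rng(0)
r = rng.normal(scale=0.01, size=T)   # one-period log returns (simulated)

A = np.zeros((T - k + 1, T))
for i in range(T - k + 1):
    A[i, i:i + k] = 1.0              # ones on the diagonal and the k-1 right off-diagonals

overlapping_returns = A @ r          # the (T-k+1)-vector Ar of overlapping returns

# X: (T-k+1) x l matrix of explanatory variables, first column of ones
x1 = rng.normal(size=T - k + 1)      # a hypothetical predictor
X = np.column_stack([np.ones(T - k + 1), x1])

beta_hat = np.linalg.lstsq(X, overlapping_returns, rcond=None)[0]  # OLS on Ar = X beta + u
```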

The paper then presents a transformed regression that allows for correct inference on $\beta$: the coefficient estimates of the transformed regression are identical to those of the original regression above, but the standard errors of the coefficients are corrected for the autocorrelation induced by the overlapping returns.
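
As far as I can tell, one transformation with this property is to regress the one-period returns $r$ on transformed regressors $Z = A^\top X (X^\top A A^\top X)^{-1} (X^\top X)$; a little algebra shows that OLS of $r$ on this $Z$ reproduces $(X^\top X)^{-1}X^\top A r$ exactly. I am not certain this is literally the construction used in [1], so treat the following continuation of the sketch above as an assumption on my part:

```python
# Continuing the sketch above: transformed regressors Z such that OLS of the
# ONE-PERIOD returns r on Z reproduces beta_hat exactly. The formula
# Z = A' X (X'AA'X)^{-1} (X'X) is my reading of the idea in [1], not a quote from it.
AAt = A @ A.T
Z = A.T @ X @ np.linalg.solve(X.T @ AAt @ X, X.T @ X)

beta_transformed = np.linalg.lstsq(Z, r, rcond=None)[0]
print(np.allclose(beta_transformed, beta_hat))   # True: identical coefficient estimates

# Standard errors would then be computed from the transformed regression's
# one-period residuals, which are not mechanically autocorrelated by the overlap.
```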

Taking a broader look at the literature ([2], [3], [4]), my understanding is that the main concern is generally the standard errors of the coefficients and thus the significance of the regression parameters. However, [2] and [4] in particular also report that the $R^2$ is inflated by the overlapping returns.

There are two questions that come to my mind:

1.) Is there a way to modify or extend the approach in [1] described above so that the $R^2$ can be adjusted? Is that question even reasonable? Does the $R^2$ of the transformed regression tell one anything?

2.) In [2] the high $R^2$ of regressions with overlapping observations is pointed out in the context of simple linear regressions. For a simple linear regression, the $R^2$ is the correlation coefficient squared. In the case of a simple linear regression, can the $R^2$ be adjusted based on the standard error of the coefficient?
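
For what it is worth, in a simple linear regression with $n$ observations the $R^2$, the $t$-statistic and the coefficient standard error are tied together by the standard identity $$ t = \frac{\hat\beta_1}{\operatorname{se}(\hat\beta_1)}, \qquad R^2 = \frac{t^2}{t^2 + n - 2}, $$ so "adjusting" $R^2$ via the standard error would seem to amount to plugging a corrected $t$-statistic (e.g. one obtained as in [1]) into this identity. Whether that yields a meaningful quantity is part of what I am asking.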


[1] Britten-Jones, M., Neuberger, A. and Nolte, I. (2011). Improved inference and estimation in regression with overlapping observations. Journal of Business Finance & Accounting, 38(5-6), 657-683.

[2] Ang, A. (2014). Asset Management: A Systematic Approach to Factor Investing. Oxford University Press.

[3] Boudoukh, J., Israel, R. and Richardson, M. P. (2018). Long Horizon Predictability: A Cautionary Tale (March 17, 2018).

[4] Boudoukh, J., Richardson, M. and Whitelaw, R. F. (2008). The Myth of Long-Horizon Predictability. The Review of Financial Studies, 21(4), 1577-1605. https://doi.org/10.1093/rfs/hhl042

  • Your question might be a duplicate of mine: ["$R^2$ and adjusted $R^2$ in presence of overlapping observations"](https://stats.stackexchange.com/questions/432173). (But I cannot mark it as such since mine does not have an answer.) – Richard Hardy Mar 06 '20 at 18:05
  • I do not believe it is, as the adjusted $R^2$ in my question solely refers to an adjustment for overlapping observations, not necessarily to an adjustment for the number of predictors as in your question as would be conventional. – castle Mar 10 '20 at 21:06
  • I am afraid you misread my question. It is exactly about adjusting for overlapping observations, not the number of predictors. – Richard Hardy Mar 10 '20 at 21:16
