This question comes after reading: https://spuriousregression.com/the-dangers-of-time-series-unit-roots/
Consider the humble one-dimensional random walk process: $$ X_t = X_{t-1} + \epsilon_t $$ where $\epsilon_t$ is drawn i.i.d. from some distribution with zero mean and finite, constant variance $\sigma^2$. This process is well known to be non-stationary: its variance grows linearly with time, $\operatorname{Var}(X_t) = t\sigma^2$ (taking $X_0 = 0$).
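To make the non-stationarity concrete, here is a quick check (a minimal sketch, assuming numpy and standard normal increments so that $\sigma^2 = 1$): the sample variance across many simulated paths grows linearly with $t$.

```python
import numpy as np

rng = np.random.default_rng(0)
n_paths, T = 10_000, 500

# Each row is one random walk: the cumulative sum of i.i.d. N(0, 1) increments.
eps = rng.standard_normal((n_paths, T))
X = np.cumsum(eps, axis=1)

# The sample variance across paths at time t should be close to t * sigma^2 = t.
for t in (10, 100, 250, 500):
    print(f"t = {t:3d}   sample Var(X_t) = {X[:, t - 1].var():7.2f}")
```

With that in mind, I have the following questions: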
- If we consider the regression $X_t = \beta_1 X_{t-1} + \epsilon_t$ and the OLS estimator $\hat{\beta}_1$, which assumption of the Gauss-Markov theorem is violated? I assume it is strict exogeneity ($\mathbb{E}[\epsilon_s \mid X_1, \cdots, X_T] = 0$ for all $s$) that fails, but I am not sure how to prove it. Is the only consequence a biased estimate of $\beta_1$, or do we also lose properties like consistency? (See the first sketch after this list for a Monte Carlo of $\hat{\beta}_1$.)
- Is there a reason that such a regression gives us spurious results (inflated t-statistics and $R^2$)? I have of course seen some works mention Granger's work on this, and how the statistics converge to functionals of Brownian motion in the limit; the second sketch after this list reproduces the classic spurious-regression experiment. As a follow-up, if we consider the cousin of the random walk, the AR(1) process: $$ X_t = \rho X_{t-1} + \epsilon_t $$ with $|\rho| < 1$, then this process is well behaved (stationary) and, in particular, the OLS estimator is consistent. Does the AR(1) process violate "fewer" OLS assumptions than the random walk as a result of its stationarity? Which assumptions does it keep intact that the random walk does not?
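For the first bullet, here is a small Monte Carlo (a minimal sketch, assuming numpy and standard normal errors) of $\hat{\beta}_1$ when the data really are a random walk, i.e. the true $\beta_1 = 1$:

```python
import numpy as np

rng = np.random.default_rng(1)
n_sims = 5_000

for T in (50, 100, 400):
    beta_hats = np.empty(n_sims)
    for i in range(n_sims):
        X = np.cumsum(rng.standard_normal(T))  # random walk, true beta_1 = 1
        y, x = X[1:], X[:-1]
        beta_hats[i] = (x @ y) / (x @ x)       # OLS slope, no intercept
    print(f"T = {T:3d}   mean(beta_hat) = {beta_hats.mean():.4f}"
          f"   P(beta_hat < 1) = {(beta_hats < 1).mean():.3f}")
```

The average estimate sits below 1 at every sample size but moves toward 1 as $T$ grows, which is exactly why I am unsure whether the problem is only bias.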
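And for the second bullet, here is what I mean by spurious results. This is the classic Granger-Newbold experiment (two *independent* random walks regressed on each other, slightly different from the autoregression above, but it is where the inflated statistics are most visible); again a minimal sketch assuming numpy:

```python
import numpy as np

rng = np.random.default_rng(2)
n_sims, T = 2_000, 100
reject, r2 = 0, np.empty(n_sims)

for i in range(n_sims):
    # Two independent random walks: any "relationship" between them is spurious.
    y = np.cumsum(rng.standard_normal(T))
    x = np.cumsum(rng.standard_normal(T))

    # OLS of y on x with an intercept, and the usual t-statistic for the slope.
    xc, yc = x - x.mean(), y - y.mean()
    b = (xc @ yc) / (xc @ xc)
    resid = yc - b * xc
    t = b / np.sqrt((resid @ resid) / (T - 2) / (xc @ xc))

    reject += abs(t) > 1.96
    r2[i] = 1 - (resid @ resid) / (yc @ yc)

print(f"rejection rate of H0: slope = 0 (nominal 5%): {reject / n_sims:.2f}")
print(f"mean R^2: {r2.mean():.2f}")
```

The nominal 5% test rejects far more often than 5%, and $R^2$ is sizeable even though the two series are independent by construction.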
Thank you for all of your help!