5

I read that one condition for weak stationarity is that a series needs to have a constant mean.

My (rather short) question: Does weak stationarity require the variance to be constant as well, or is having a finite variance sufficient (or does one imply the other)?

This distinction does not seem to be treated in a uniform manner in the literature I have encountered. (However, maybe it is me who is overlooking something obvious.)

Kuma

2 Answers

3

Yes, weak stationarity requires both a constant variance and a constant mean (over time). To quote from Wikipedia: a wide-sense stationary random process only requires that the 1st moment (i.e. the mean) and the autocovariance do not vary with respect to time.
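
For a quick illustration (a minimal sketch of my own, assuming Python with NumPy; the heteroskedastic series is just an arbitrary made-up example), compare a white-noise series with a zero-mean series whose variance grows over time: both have a constant mean, but only the first is weakly stationary.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
t = np.arange(1, n + 1)

# White noise: constant mean and constant variance -> weakly stationary.
white = rng.normal(0.0, 1.0, n)

# Zero-mean but heteroskedastic noise: the variance grows with t, so the
# process is NOT weakly stationary even though its mean is constant.
hetero = rng.normal(0.0, 1.0, n) * np.sqrt(t / n)

def windowed_variance(x, k=4):
    """Sample variance computed in k consecutive time windows."""
    return [round(float(np.var(chunk)), 3) for chunk in np.array_split(x, k)]

print("white noise    :", windowed_variance(white))    # roughly [1, 1, 1, 1]
print("heteroskedastic:", windowed_variance(hetero))   # clearly increasing
```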

Digio
  • Quick follow-up question: Does an autocovariance that does not vary with respect to time already imply a constant variance? Or is the "constant variance" only fulfilled by the two conditions (autocov. and mean)? – Kuma Jul 21 '17 at 08:22
  • The condition formally states that the autocovariance function must depend only on the lag and not on time; this is equivalent to saying that [variance is constant over time](https://stats.stackexchange.com/questions/119845/if-a-time-series-is-second-order-stationary-does-this-imply-it-is-strictly-stat). – Digio Jul 21 '17 at 08:38
  • Thanks for the explanation and the link, I was somewhat confused there. – Kuma Jul 21 '17 at 09:05
3

To give a different view from Digio's: I have actually only encountered the requirement of a finite second moment¹, not a constant one, at least in books and academic papers, as opposed to online resources (presentations, blog posts, etc.).

I thus believe the formal definition of a weakly (or wide-sense) stationary process is:

  1. The first moment of $x_t$ is constant; i.e. $\forall t,\ E[x_t]=\mu$
  2. The second moment of $x_t$ is finite for all $t$; i.e. $\forall t,\ E[x_t^2]<\infty$ (which of course also implies $E[(x_t-\mu)^2]<\infty$; i.e. that the variance is finite for all $t$)
  3. The cross moment (i.e. the auto-covariance) depends only on the difference $u-v$; i.e. $\forall u,v,\tau,\ cov(x_u, x_v)=cov(x_{u+\tau}, x_{v+\tau})$
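
As a rough empirical check of these three conditions, here is a sketch of my own (assuming Python with NumPy; the AR(1) model with $\phi=0.6$ and the helper names are arbitrary illustration choices): simulate a weakly stationary process and verify that the sample mean, second moment, and autocovariance at a fixed lag look about the same over different stretches of time.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_ar1(n, phi=0.6, sigma=1.0):
    """x_t = phi * x_{t-1} + eps_t; weakly stationary for |phi| < 1."""
    x = np.zeros(n)
    eps = rng.normal(0.0, sigma, n)
    for i in range(1, n):
        x[i] = phi * x[i - 1] + eps[i]
    return x

def sample_autocov(x, lag):
    """Sample autocovariance of x at the given lag."""
    xc = x - x.mean()
    return float(np.mean(xc[: len(xc) - lag] * xc[lag:]))

x = simulate_ar1(200_000)
first, second = x[:100_000], x[100_000:]

# 1. Constant first moment: the sample mean is about the same in both halves.
print("means         :", first.mean(), second.mean())
# 2. Finite second moment: E[x_t^2] is finite (and here also about the same).
print("second moments:", np.mean(first**2), np.mean(second**2))
# 3. The autocovariance depends only on the lag, not on when it is measured.
for lag in (1, 5):
    print(f"lag {lag} autocov :", sample_autocov(first, lag), sample_autocov(second, lag))
```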

However, I believe that the apparent confusion between the two conditions, and the fact that some sources state a requirement of constant variance instead of a finite one, is due to the fact that constant variance indeed follows directly from the three conditions above.

The third condition implies that every lag $\tau \in \mathbb{N}$ has a constant covariance value associated with it:

$$cov(X_{t_1}, X_{t_2}) = K_{XX}(t_1,t_2) = K_{XX}(t_2-t_1,0) = K_{XX}(\tau)$$

Note that this directly implies that the variance of the process is also constant, since we get that for all $t \in \mathbb{N}$

$$ Var(X_t) = cov(X_t, X_t) = K_{XX}(t,t) = K_{XX}(0) = d$$

for some constant $d$.
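
As a small numerical check of that last step (again only a sketch of my own, assuming Python with NumPy and an arbitrary AR(1) example with $\phi=0.6$, $\sigma=1$), the sample variance computed in any time window is roughly the same constant, namely the lag-0 autocovariance $K_{XX}(0)$, which for this AR(1) equals $\sigma^2/(1-\phi^2) = 1.5625$.

```python
import numpy as np

rng = np.random.default_rng(2)
phi, sigma, n = 0.6, 1.0, 200_000

# Simulate an AR(1), which is weakly stationary for |phi| < 1 (arbitrary example).
x = np.zeros(n)
eps = rng.normal(0.0, sigma, n)
for i in range(1, n):
    x[i] = phi * x[i - 1] + eps[i]

# Theoretical lag-0 autocovariance: K_XX(0) = Var(X_t) = sigma^2 / (1 - phi^2).
print("theoretical K_XX(0):", sigma**2 / (1 - phi**2))   # 1.5625

# The sample variance is roughly this same constant in every time window,
# exactly as Var(X_t) = K_XX(0) = d (independent of t) says it should be.
for chunk in np.array_split(x, 4):
    print("window variance    :", round(float(np.var(chunk)), 3))
```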

¹ When writing second moment I mean $E[x_t^2]$, and not the variance, which is the second central moment.

ShayPal5