
Wikipedia says "An unbiased random walk is non-ergodic."

Let's look at a simple random walk, defined as follows: take independent random variables $Z_{1},Z_{2},\ldots$, where each takes the value $1$ or $-1$ with probability $1/2$, and set $S_{0}=0$ and $S_{n}=\sum _{j=1}^{n}Z_{j}$.
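For concreteness, such a walk can be simulated in a few lines (a sketch; the function name `random_walk` is just illustrative):

```python
import random

def random_walk(n, rng=random):
    """Return the path S_0, S_1, ..., S_n of a simple random walk,
    where each step Z_j is +1 or -1 with probability 1/2."""
    s = 0
    path = [s]
    for _ in range(n):
        s += rng.choice((-1, 1))
        path.append(s)
    return path

print(random_walk(10, random.Random(42)))
```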

If we calculate (let's say) the mean for an ensemble of size $N$, it will be $(\sum _{j=1}^{N}S_{j})/N$, and the mean for a single realisation of length $N$ will be exactly the same, $(\sum _{j=1}^{N}S_{j})/N.$

So, why is it non-ergodic?

whuber
Alex Craft

1 Answer


That Wikipedia article writes,

The process $X(t)$ is said to be mean-ergodic or mean-square ergodic in the first moment if the time average estimate $${\hat {\mu }}_{X}={\frac {1}{T}}\int _{0}^{T}X(t)\,\mathrm{d}t$$ converges in squared mean to the ensemble average $\mu _{X}$ as $T\rightarrow \infty.$

The problem is that $\hat\mu$ becomes more and more variable as $T$ increases. This becomes apparent when $X(t)$ is the discrete Binomial random walk described in the question, because the time average is

$$\hat\mu(X) = \frac{1}{T} \sum_{i=1}^T X(i) = \frac{1}{T} \sum_{i=1}^T \sum_{j=1}^i Z(j) = Z(1) + \frac{T-1}{T}Z(2) + \cdots + \frac{1}{T}Z(T).$$

Notice how the early terms persist: $Z(1)$ appears with coefficient $1$, and for each fixed $i$ the coefficient $(T-i+1)/T$ of $Z(i)$ converges to $1$ as $T$ grows. Their contributions to the time average therefore never get averaged out, and consequently the time average cannot converge to a constant.
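One can see this numerically by comparing the spread of the time average across many independent realisations as $T$ grows (a sketch; `time_average` is an illustrative helper, and the standard deviations should come out roughly $\sqrt{T/3}$, consistent with the variance computed below):

```python
import random
import statistics

def time_average(T, rng):
    """Time average (1/T) * sum_{i=1..T} S_i of one realization
    of the +1/-1 random walk."""
    s, total = 0, 0
    for _ in range(T):
        s += rng.choice((-1, 1))
        total += s
    return total / T

rng = random.Random(0)
for T in (100, 400):
    avgs = [time_average(T, rng) for _ in range(2000)]
    # The spread grows with T instead of shrinking: non-ergodic behaviour.
    print(T, statistics.stdev(avgs))
```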


In the context and notation of the Wikipedia article, let's prove this result by finding the mean and variance of the time average.

The expectation of $\hat{\mu}_X$ is

$$\mathbb{E}(\hat{\mu}_X) = {\frac {1}{T}}\int _{0}^{T}\mathbb{E}(X(t))\,\mathrm{d}t = \frac{1}{T}\int_0^T 0\, \mathrm{d}t = 0.$$

Therefore its variance is the expectation of its square,

$$\eqalign{ \operatorname{Var}(\hat{\mu}_X) &= \mathbb{E}\left(\hat{\mu}_X^2\right)\\ &= \mathbb{E}\left({\frac {1}{T}}\int _{0}^{T}X(t)\,\mathrm{d}t \ {\frac {1}{T}}\int _{0}^{T}X(s)\,\mathrm{d}s \right) \\ &= \left(\frac {1}{T}\right)^2 \int_0^T \int_0^T \mathbb{E}(X(t)X(s))\,\mathrm{d}t\, \mathrm{d}s \\ &= \left(\frac {1}{T}\right)^2 \int_0^T \int_0^T \min(s,t)\,\mathrm{d}t\, \mathrm{d}s \\ &= \left(\frac {1}{T}\right)^2 \int_0^T \left(\int_0^s t\,\mathrm{d}t + \int_s^T s\,\mathrm{d}t\right)\mathrm{d}s \\ &= \left(\frac {1}{T}\right)^2 \int_0^T \left(\frac{s^2}{2} + (T-s)s\right)\mathrm{d}s \\ &= \left(\frac {1}{T}\right)^2 \frac{T^3}{3} \\ &= \frac{T}{3}. }$$

(The fourth line uses $\mathbb{E}(X(t)X(s)) = \min(s,t)$: for standard Brownian motion the increment $X(t)-X(s)$ (for $t \gt s$) is independent of $X(s)$, so the covariance of $X(s)$ and $X(t)$ equals $\operatorname{Var}(X(s)) = s.$)

Because this grows ever larger as $T$ grows, $\hat\mu_X$ cannot possibly converge to a constant as required by the definition of ergodicity, even though it has a constant expectation of zero. Whence Wikipedia writes (to quote the passage fully),

An unbiased random walk is non-ergodic. Its expectation value is zero at all times, whereas its time average is a random variable with divergent variance.
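For the discrete walk in the question, the same computation goes through with sums in place of integrals: $\operatorname{Var}(\hat\mu_X) = T^{-2}\sum_{s,t=1}^T \min(s,t)$, since $\mathbb{E}(S_s S_t) = \min(s,t)$ as well. A small deterministic check confirms this also grows like $T/3$ (a sketch; the helper name `exact_var` is just for illustration):

```python
def exact_var(T):
    """Exact variance of the time average of the discrete random walk:
    (1/T^2) * sum over s, t of min(s, t)."""
    return sum(min(s, t)
               for s in range(1, T + 1)
               for t in range(1, T + 1)) / T**2

for T in (10, 100, 400):
    # Exact discrete variance next to the continuous-time answer T/3.
    print(T, exact_var(T), T / 3)
```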

  • The analysis for the discrete (Binomial) random walk described in the question is identical: merely replace the integrals by sums. – whuber Dec 19 '19 at 17:04
  • Can you please explain how we get $\int \int \Bbb{E}(X(t)X(s)) dt ds = \int \int \min(t,s) dt ds$ – honeybadger Dec 19 '19 at 17:54
  • Thanks. One thing is still unclear to me. If we consider each step as $+\$1$ or $-\$1$ (dollars, money), the "average" or "expected value" will be defined slightly differently, no? For the ensemble it will be the average over the ensemble, but for a single realisation it will be just the final value `X(N)` (we don't care about the process, only how much money we have in the end) - and the calculations for both will be the same, `sum(X_i, 1..N)/N`? And the process will be ergodic, no? – Alex Craft Dec 19 '19 at 17:57
  • It doesn't follow, Alexey (aka Alex Craft), because the definition of the expectation isn't at issue and applies to *any* random variable whatsoever. In a single realization there is no such thing as an expectation, but there is a *time average.* – whuber Dec 19 '19 at 18:24
  • @kasa (aka honeybadger) This is a basic property of Brownian motion. Part of the definition of Brownian motion is that the increment $X(t)-X(s)$ (for $t\gt s$) is independent of $X(s).$ It follows immediately that the covariance of $X(s)$ and $X(t)$ equals the variance of $X(s).$ Another defining property of Brownian motion is that the variance of $X(s)$ equals $s.$ The result follows. – whuber Dec 19 '19 at 18:26