
What is meant by existence of a (discrete-time) stochastic process?
How do I know whether a process exists or not?
Could anyone offer a simple example of an existent and another of a nonexistent process?

These questions arose during the discussion in the thread "Need for existence of stochastic processes behind models of conditional variance".

Richard Hardy
  • Intuitively, I am guessing that a stochastic process can be shown to exist if it can be expressed as a function of one or more stochastic processes that were already established to exist in the literature (e.g., simple random walk). If you are unable to find such an expression, you can cast doubt on the existence of such a process. The process may exist, but the underlying expression may be (1) tractable but currently unknown or (2) intractable. Or the process may not exist. – Isabella Ghement May 11 '19 at 22:45
  • Of course, it seems a bit of a stretch to conclude with certainty that just because you cannot find such an expression, that implies the non-existence of the process. – Isabella Ghement May 11 '19 at 22:46
  • @IsabellaGhement, but how do I even approach the notion of existence? What does it mean? – Richard Hardy May 12 '19 at 10:29
  • This comment makes for some light reading: https://stats.stackexchange.com/questions/13320/meaning-of-the-existence-proof/13330#13330 – Isabella Ghement May 12 '19 at 14:59
  • The Springer book "Basics of Applied Stochastic Processes" by Richard Serfozo states the following in its Section 6.8 Existence of Stochastic Processes: "A stochastic process is commonly defined by designating a set of properties that its distribution and sample paths must satisfy. The existence of the process amounts to showing that there exist a probability space and functions on it (the sample paths) that satisfy the designated properties of the process. This section describes Kolmogorov’s theorem that is used for such a task." – Isabella Ghement May 12 '19 at 15:12
  • @IsabellaGhement, thank you, these points are helpful. It is a pity working with existence seems rather complicated as it requires advanced maths, at least in continuous time. I wonder if perhaps the discrete-time case is easier? – Richard Hardy May 12 '19 at 15:33
  • These course notes seem to have a nice, comprehensive discussion of existence results: https://www.stat.cmu.edu/~cshalizi/almost-none/v0.1.1/almost-none.pdf. You are right that the mathematical machinery seems really complicated - certainly not something I have to deal with as an applied statistician. But it is an intriguing question, so it was interesting to find some resources that address it. – Isabella Ghement May 12 '19 at 15:54
  • @IsabellaGhement, If you collected information from your comments into an answer, it would address the first and partly the second question out of the three. I would happily upvote such an answer. – Richard Hardy May 12 '19 at 16:26
  • Thanks, Richard! I'll leave these as comments, as others on this forum may be more qualified to answer. – Isabella Ghement May 12 '19 at 20:07
  • [An answer I wrote to a different question](https://stats.stackexchange.com/a/449003/919) might give you a sense of what's involved in establishing existence. To be rigorous, we have to exhibit the mathematical objects demanded of the axioms of probability: a set of outcomes (sample paths), a collection of measurable events (which generate a sigma field), and a valid probability function on those events. That thread concerns a discrete stochastic process in discrete time; namely, a Markov chain. It explicitly connects the description in terms of transition probabilities to this abstract one. – whuber Feb 11 '20 at 16:12
  • An example of a non-existent stochastic process: a non-constant martingale taking its values in $\{0,1\}$. – Stéphane Laurent Feb 18 '20 at 10:42

2 Answers


In this answer I collect and summarize some bits of insight I have received through comments and gathered myself. Considerable credit goes to @IsabellaGhement and @whuber.

What is meant by existence of a (discrete-time) stochastic process?

A stochastic process exists if the relevant mathematical objects demanded of the axioms of probability exist:

  1. a set of outcomes (sample paths),
  2. a collection of measurable events (which generate a sigma field), and
  3. a valid probability function on those events.
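To make the three objects above concrete (an illustration of my own, not from the comments), consider the simple random walk: a sample path is a sequence of $\pm 1$ steps, measurable events are generated by cylinder sets that fix finitely many steps, and the probability function assigns each cylinder fixing $n$ fair steps the measure $2^{-n}$. A minimal simulation sketch in Python, assuming fair (equiprobable) steps:

```python
import numpy as np

rng = np.random.default_rng(0)

# Outcomes: sample paths of a simple random walk, each a sequence of +/-1 steps.
n_steps, n_paths = 3, 200_000
steps = rng.choice([-1, 1], size=(n_paths, n_steps))
walk = steps.cumsum(axis=1)

# A measurable (cylinder) event: the first three steps are +1, +1, -1.
event = (steps[:, 0] == 1) & (steps[:, 1] == 1) & (steps[:, 2] == -1)

# The probability function assigns this cylinder measure 2^{-3} = 0.125;
# the empirical frequency should be close to that value.
print(event.mean())
```

Simulation of course presupposes existence rather than proving it, but it shows what the triplet (outcomes, sigma field, probability function) looks like in the simplest discrete-time case.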

How do I know whether a process exists or not?

To show existence of a stochastic process, the objects listed above have to be exhibited.

How come some processes do not exist?

A stochastic process is commonly defined by designating a set of properties that its distribution and sample paths must satisfy. If these properties are contradictory, no triplet of the form (outcomes, sigma field, probability function) can satisfy them, hence the process does not exist.

Could anyone offer a simple example of an existent and another of a nonexistent process?

I am still looking for such simple examples of processes that do exist and those that do not. (I suppose it is not too difficult to construct a nonexistent process and show why it is such. The interesting part is to find an example where the contradiction leading to nonexistence is not immediately obvious, so that the example is pedagogically useful.)

Richard Hardy
  • In a comment, I gave a rather simple example of a non-existent stochastic process: a non-constant martingale taking the values $0$ or $1$. – Stéphane Laurent Feb 18 '20 at 10:45

Could anyone offer a simple example of an existent process?

A well-known theorem which guarantees the existence of a stochastic process ${(X_n)}_{n \geqslant 1}$, say with $\mathbb{R}$-valued random variables $X_n$, is the Daniell-Kolmogorov extension theorem. A typical application of this theorem gives the existence of a sequence of independent random variables ${(X_n)}_{n \geqslant 1}$, with $X_n$ following any probability law (possibly depending on $n$).

Here is the statement of this theorem. Denote by $\mathcal{B}_n$ the Borel $\sigma$-field on $\mathbb{R}^n$. Suppose that for every $n \geqslant 1$ we have a probability measure $\mu_n$ on $(\mathbb{R}^n, \mathcal{B}_n)$. Suppose that the sequence of probability measures ${(\mu_n)}_{n \geqslant 1}$ is consistent, in the sense that $\mu_{n+1}(A \times \mathbb{R}) = \mu_n(A)$ for every $n \geqslant 1$ and every $A \in \mathcal{B}_n$. Then the theorem asserts that there exists a probability measure $\mu$ on $(\mathbb{R}^\mathbb{N}, \mathcal{B}_\infty)$ which extends all the $\mu_n$, in the sense that $\mu(A \times \mathbb{R}^\mathbb{N}) = \mu_n(A)$ for every $n \geqslant 1$ and every $A \in \mathcal{B}_n$.

Let's see how to apply this theorem to show the existence of a sequence of independent random variables ${(X_n)}_{n \geqslant 1}$ with $X_n \sim \nu_n$, where ${(\nu_n)}_{n \geqslant 1}$ is a given sequence of probability measures on $\mathbb{R}$. One takes the product measure $\mu_n = \nu_1 \otimes \cdots \otimes \nu_n$ for every $n \geqslant 1$. The consistency condition on ${(\mu_n)}_{n \geqslant 1}$ is easy to check, and the Daniell-Kolmogorov extension theorem then provides a probability measure $\mu$ on $\mathbb{R}^\mathbb{N}$ which extends the $\mu_n$. Take the probability space $$ (\Omega, \mathcal{A}, \mathbb{P}) = (\mathbb{R}^\mathbb{N}, \mathcal{B}_\infty, \mu). $$ An element $\omega$ of $\Omega$ is a sequence of real numbers $(\omega_1, \omega_2, \ldots)$. It then suffices to define, for each $n \geqslant 1$, the random variable $X_n$ on $(\Omega, \mathcal{A}, \mathbb{P})$ by $X_n(\omega) = \omega_n$. In other words, the random sequence ${(X_n)}_{n \geqslant 1}$ is an $\mathbb{R}^\mathbb{N}$-valued random variable whose probability distribution is $\mu = \nu_1 \otimes \nu_2 \otimes \cdots$. The theorem guarantees the existence of this infinite product measure.
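The final step of this construction (coordinate projections $X_n(\omega) = \omega_n$ under a product measure) can be mimicked numerically. Below is a sketch with the illustrative choice $\nu_k = \mathrm{N}(0, 1/k^2)$, i.e. laws that depend on $k$; any other choice of laws would do just as well:

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw many "omegas": each row holds the first n coordinates of a sequence in
# R^N, sampled from the product measure nu_1 (x) ... (x) nu_n, where
# nu_k = Normal(0, 1/k^2) is an illustrative choice of k-dependent laws.
n, m = 4, 300_000
omega = np.column_stack(
    [rng.normal(scale=1.0 / k, size=m) for k in range(1, n + 1)]
)

# X_k is simply the k-th coordinate projection, X_k(omega) = omega_k.
X = {k: omega[:, k - 1] for k in range(1, n + 1)}

# Independence and the marginal laws are inherited from the product structure:
# empirically, Var(X_k) is close to 1/k^2 and Corr(X_1, X_2) is close to 0.
print(X[1].var(), X[2].var(), np.corrcoef(X[1], X[2])[0, 1])
```

Again, sampling only realizes finitely many coordinates; it is the extension theorem that guarantees the whole infinite sequence lives on one probability space.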

Could anyone offer a simple example of a nonexistent process?

I like this example: there does not exist a "non-trivial" martingale ${(M_n)}_{n \geqslant 1}$ such that $M_n$ takes its values in $\{0,1\}$ for every $n \geqslant 1$. Indeed, that would mean that $M_n = \mathbf{1}_{A_n}$ for some event $A_n$, for every $n \geqslant 1$. The martingale condition $\mathbb{E}[M_{n+1} \mid M_1, \ldots, M_n] = M_n$ implies, by the tower property, $$ \mathbb{E}[M_{n+1} \mid M_n] = M_n. $$

We have $$ \begin{align} \mathbb{E}\bigl[(\mathbf{1}_{A_{n+1}}-\mathbf{1}_{A_{n}})^2\bigr] & =\mathbb{E}[\mathbf{1}_{A_{n+1}}^2]+\mathbb{E}[\mathbf{1}_{A_{n}}^2]- 2\mathbb{E}[\mathbf{1}_{A_{n+1}}\mathbf{1}_{A_{n}}] \\ & =\mathbb{E}[\mathbf{1}_{A_{n+1}}]+\mathbb{E}[\mathbf{1}_{A_{n}}]- 2\mathbb{E}[\mathbf{1}_{A_{n+1}}\mathbf{1}_{A_{n}}]. \end{align} $$ But $$ \mathbb{E}[\mathbf{1}_{A_{n+1}}] = \mathbb{E}\bigl[\mathbb{E}[\mathbf{1}_{A_{n+1}} \mid \mathbf{1}_{A_{n}}]\bigr] = \mathbb{E}[\mathbf{1}_{A_{n}}] $$ and $$ \mathbb{E}[\mathbf{1}_{A_{n+1}}\mathbf{1}_{A_{n}}] = \mathbb{E}\bigl[\mathbb{E}[\mathbf{1}_{A_{n+1}}\mathbf{1}_{A_{n}} \mid \mathbf{1}_{A_{n}}]\bigr] = \mathbb{E}\bigl[\mathbf{1}_{A_{n}}\mathbb{E}[\mathbf{1}_{A_{n+1}} \mid \mathbf{1}_{A_{n}}]\bigr] = \mathbb{E}[\mathbf{1}_{A_{n}}^2] = \mathbb{E}[\mathbf{1}_{A_{n}}]. $$ Finally, $\mathbb{E}\bigl[(\mathbf{1}_{A_{n+1}}-\mathbf{1}_{A_{n}})^2\bigr] = 0$, which means that $A_{n+1} = A_n$ (almost surely): our martingale is "trivial".
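The computation above can be double-checked by elementary enumeration (my own sketch, not part of the original argument): parametrize the joint law of $(M_n, M_{n+1})$ on $\{0,1\}^2$ by $p = \mathbb{P}(M_n = 1)$ and the transition probabilities $q_0 = \mathbb{P}(M_{n+1}=1 \mid M_n=0)$, $q_1 = \mathbb{P}(M_{n+1}=1 \mid M_n=1)$, and verify that the martingale condition forces the mean-squared increment to vanish:

```python
import itertools

# Parametrize the joint law of (M_n, M_{n+1}) on {0,1}^2 by
#   p  = P(M_n = 1),
#   q0 = P(M_{n+1} = 1 | M_n = 0),   q1 = P(M_{n+1} = 1 | M_n = 1).
grid = [i / 4 for i in range(5)]  # 0, 0.25, 0.5, 0.75, 1

def is_martingale(p, q0, q1):
    # E[M_{n+1} | M_n] = M_n forces q0 = 0 (when P(M_n=0) > 0)
    # and q1 = 1 (when P(M_n=1) > 0).
    return (1 - p) * q0 == 0 and p * (1 - q1) == 0

def mean_sq_increment(p, q0, q1):
    # E[(M_{n+1} - M_n)^2] = P(M_{n+1} != M_n) = (1-p) q0 + p (1-q1).
    return (1 - p) * q0 + p * (1 - q1)

ok = all(
    mean_sq_increment(p, q0, q1) == 0
    for p, q0, q1 in itertools.product(grid, repeat=3)
    if is_martingale(p, q0, q1)
)
print(ok)  # prints True: every martingale on the grid has M_{n+1} = M_n a.s.
```

This mirrors the proof exactly: whenever the one-step martingale condition holds, the increment $\mathbf{1}_{A_{n+1}} - \mathbf{1}_{A_n}$ is zero almost surely, so the process is "trivial".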

Stéphane Laurent