
Fix $t\in(0,1)$. For $\Delta>0$, consider the random variable $X_t^{(\Delta)}$ defined by

$$ \mathbb{P}[X_t^{(\Delta)}=1]=\left(1-\lambda\,\Delta\right)^{\left\lfloor t/\Delta\right\rfloor},\quad \mathbb{P}[X_t^{(\Delta)}=0]=1-\left(1-\lambda\,\Delta\right)^{\left\lfloor t/\Delta\right\rfloor}, $$ where $\lambda\in(0,1)$. Clearly, as $\Delta\rightarrow 0$,

$$ \mathbb{P}[X_t^{(\Delta)}=1]=\left(1-\lambda\,\Delta\right)^{\left\lfloor t/\Delta\right\rfloor}\rightarrow e^{-\lambda\,t}, $$ since $\left\lfloor t/\Delta\right\rfloor\,\ln\left(1-\lambda\,\Delta\right)=-\lambda\,t+O(\Delta)$,

and

$$ \mathbb{P}[X_t^{(\Delta)}=0]=1-\left(1-\lambda\,\Delta\right)^{\left\lfloor t/\Delta\right\rfloor}\rightarrow 1-e^{-\lambda\,t}. $$
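As a quick numerical check of this limit, here is a minimal Python sketch ($\lambda$ and $t$ are arbitrary illustrative values):

```python
import math

lam, t = 0.5, 0.7  # arbitrary illustrative values, lam in (0,1), t in (0,1)

# P[X_t^(Delta) = 1] = (1 - lam*Delta)^floor(t/Delta) for shrinking Delta
for delta in [0.1, 0.01, 0.001, 0.0001]:
    p_one = (1 - lam * delta) ** math.floor(t / delta)
    print(f"Delta={delta:<7} P[X=1]={p_one:.6f}")

print(f"limit: exp(-lam*t) = {math.exp(-lam * t):.6f}")
```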

The problem is: does the sequence $X_t^{(\Delta)}$ converge (in probability? in distribution?) as $\Delta\rightarrow 0$? My initial guess was that the sequence converges to a random variable $X_t$ that equals $1$ with probability $e^{-\lambda\,t}$ and $0$ with probability $1-e^{-\lambda\,t}$, but this seems wrong, since for any $\varepsilon\in(0,1)$

\begin{eqnarray}
\mathbb{P}\left[\left|X_t^{(\Delta)}-X_t\right|>\varepsilon\right]
&=& \mathbb{P}\left[X_t^{(\Delta)}=1\right]\,\mathbb{P}\left[X_t=0\right]
  + \mathbb{P}\left[X_t^{(\Delta)}=0\right]\,\mathbb{P}\left[X_t=1\right] \\
&=& \left(1-\lambda\,\Delta\right)^{\left\lfloor t/\Delta\right\rfloor}\,\left(1-e^{-\lambda\,t}\right)
  + \left(1-\left(1-\lambda\,\Delta\right)^{\left\lfloor t/\Delta\right\rfloor}\right)\,e^{-\lambda\,t} \\
&\rightarrow& 2\,e^{-\lambda\,t}\,\left(1-e^{-\lambda\,t}\right)\neq 0.
\end{eqnarray}
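A Monte Carlo sketch of this computation (illustrative values again; note that it draws $X_t^{(\Delta)}$ and $X_t$ independently, which is exactly what the decomposition above implicitly assumes, as the comments below point out):

```python
import math
import random

# Arbitrary illustrative values; for any eps in (0,1) the event
# |X_t^(Delta) - X_t| > eps is simply {X_t^(Delta) != X_t}.
lam, t = 0.5, 0.7
n = 200_000                   # Monte Carlo replications
p_limit = math.exp(-lam * t)  # P[X_t = 1]

for delta in [0.1, 0.01, 0.001]:
    p_delta = (1 - lam * delta) ** math.floor(t / delta)  # P[X_t^(Delta) = 1]
    # Draw the pair *independently*, as the decomposition above does.
    mismatch = sum(
        (random.random() < p_delta) != (random.random() < p_limit)
        for _ in range(n)
    ) / n
    print(f"Delta={delta:<6} estimated P[X_t^(Delta) != X_t] = {mismatch:.4f}")

print(f"2*exp(-lam*t)*(1 - exp(-lam*t)) = {2 * p_limit * (1 - p_limit):.4f}")
```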

AlmostSureUser
  • A `self-study` tag seems in order, no? – Xi'an Jan 26 '16 at 17:15
  • You seem unsure about the question you want to ask: what does "say in probability" actually mean? – whuber Jan 26 '16 at 17:25
  • Sorry, that was my bad English; I changed the question a little. My problem is to understand whether the sequence converges in probability or in distribution. – AlmostSureUser Jan 26 '16 at 17:27
  • The decomposition in your final equation assumes $X_t$ and $X_t^{(\Delta)}$ are independent. For convergence in probability, this is impossible. – Xi'an Jan 26 '16 at 17:34
  • For convergence in distribution, the (positive) result is already written in your description. – Xi'an Jan 26 '16 at 17:35
  • @Xi'an I believe independence is not a requirement for convergence in probability to hold. But when a sequence of independent variables does converge in probability, we can say a lot about what it must converge to! :-) – whuber Jan 26 '16 at 18:45
  • I am a little bit confused: how can we prove that $X_t^{(\Delta)}$ and $X_t$ are not independent? – AlmostSureUser Jan 27 '16 at 08:45
  • For the convergence in probability as considered in the final equation of the OP, when $\Delta$ goes to $0$, $X_t^{(\Delta)}$ and $X_t$ must look more and more the same. Not only in distribution but for most realisations, that is $X_t^{(\Delta)}(\omega)\approx X_t(\omega)$ on a large set of $\omega$'s. – Xi'an Jan 27 '16 at 17:16
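Xi'an's last comment can be illustrated with a concrete coupling (an assumption on my part, not part of the original question): realise both variables from a single uniform $U$, via $X_t^{(\Delta)}=\mathbf{1}\{U\le(1-\lambda\,\Delta)^{\lfloor t/\Delta\rfloor}\}$ and $X_t=\mathbf{1}\{U\le e^{-\lambda\,t}\}$. The two then disagree only when $U$ lies between the two thresholds, so $\mathbb{P}[X_t^{(\Delta)}\neq X_t]=\left|(1-\lambda\,\Delta)^{\lfloor t/\Delta\rfloor}-e^{-\lambda\,t}\right|\rightarrow 0$, and convergence in probability does hold under this coupling. A minimal simulation:

```python
import math
import random

lam, t = 0.5, 0.7             # same illustrative values as above
n = 200_000
p_limit = math.exp(-lam * t)  # P[X_t = 1]

for delta in [0.1, 0.01, 0.001]:
    p_delta = (1 - lam * delta) ** math.floor(t / delta)
    # One uniform U drives both indicators; they disagree only when U falls
    # between the two thresholds, an interval of length |p_delta - p_limit|.
    mismatch = 0
    for _ in range(n):
        u = random.random()
        mismatch += (u < p_delta) != (u < p_limit)
    print(f"Delta={delta:<6} P[X_t^(Delta) != X_t] = {mismatch / n:.5f} "
          f"(exact: {abs(p_delta - p_limit):.5f})")
```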

0 Answers