
Let $P(K=k)=(1-\beta)^{k}\beta$ for $k=1,2,3,\ldots$
It is then required to show that $\beta K$ converges in distribution to an $\exp(1)$ random variable as $\beta$ tends to zero.

To do this, they start by considering $P(\beta K\ge x)$. I was able to show the result this way, but I don't understand why $P(\beta K\ge x)$ is considered in the first place.

Then, I need to show that $K$ tends to infinity as $\beta$ tends to zero. I don't understand how to prove this.

Also, what are good references that explain convergence in probability and convergence in distribution, with examples?

sam_rox
  • To say that $X$ is exponentially distributed with expected value $1$ is equivalent to saying $\Pr(X>x) = e^{-x}$ for $x\ge0.$ That's a somewhat simpler expression than that for $\Pr(X\le x).$ – Michael Hardy Mar 12 '17 at 02:51
  • You need $\Pr(K=k) = (1-\beta)^{k-1}\beta \text{ for } k = 1,2,3,\ldots;$ with $k$ rather than $k-1$ in that exponent, the probabilities will not add up to $1.$ – Michael Hardy Mar 12 '17 at 02:52
  • If $K$ is the number of independent trials needed to get one success, with probability $\beta$ of success on each trial, then the event $K\ge k$ is the same as the event of failure on the first $k-1$ trials; therefore $\Pr(K\ge k) = (1-\beta)^{k-1}.$ – Michael Hardy Mar 12 '17 at 02:55
  • @MichaelHardy Thank you for the explanation. This was from an article, and they obtain $E(K)={1\over\beta}$. For that to hold it should be $\Pr(K=k) = (1-\beta)^{k-1}\beta$ for $k = 1,2,3,\ldots,$ and not $P(K=k)=(1-\beta)^k\beta$. – sam_rox Mar 12 '17 at 03:22

2 Answers


One approach is to use moment generating functions (mgf). The mgf of your geometric random variable $K$ is $$ \DeclareMathOperator{\E}{\mathbb{E}} M_K(t) = \E e^{tK} = \frac{\beta e^t}{1-(1-\beta) e^t}. $$ The mgf of $\beta K$ is then $M_{\beta K}(t) = M_K(\beta t) = \frac{\beta e^{\beta t}}{1-(1-\beta) e^{\beta t}}$. As $\beta$ goes to zero, both the numerator and the denominator go to zero, so you can apply L'Hôpital's rule (differentiating with respect to $\beta$); the limit as $\beta$ goes to zero is $\frac{1}{1-t}$, which is the mgf of an exponential random variable with rate 1.
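A small numerical sketch (Python with NumPy assumed; not part of the argument above) makes the limit visible: evaluate $M_{\beta K}(t)$ at a fixed $t<1$ for shrinking $\beta$ and compare with $1/(1-t)$.

```python
import numpy as np

# MGF of beta*K, where K ~ Geometric(beta) on {1, 2, 3, ...}
def mgf_beta_K(beta, t):
    return beta * np.exp(beta * t) / (1 - (1 - beta) * np.exp(beta * t))

t = 0.5
for beta in [0.1, 0.01, 0.001, 1e-4]:
    # The values should approach 1 / (1 - t) = 2 as beta -> 0
    print(beta, mgf_beta_K(beta, t), 1 / (1 - t))
```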

kjetil b halvorsen

Assuming that $K$ takes on values $1,2,3,\ldots$ with $P\{K=k\} = (1-\beta)^{k-1}\beta$, and not $(1-\beta)^{k}\beta$ as the problem states, then $P\{K > k\} = (1-\beta)^{k}$, either by

  • recognizing that $K$ is the number of repeated independent trials to have an event of probability $\beta$ occur for the first time, and so, $K > k$ if and only if the event did not occur on the first $k$ trials

or by

  • brute-force adding up \begin{align}P\{K \leq k\} &= \sum_{i=1}^k P\{K = i\}\\ &= \sum_{i=1}^k (1-\beta)^{i-1}\beta\\ &= \beta\big(1 + (1-\beta) + (1-\beta)^2 + \cdots + (1-\beta)^{k-1}\big)\\ &= \beta\cdot \frac{1-(1-\beta)^k}{1 -(1-\beta)}\\ &= 1-(1-\beta)^k \end{align} and so, $P\{K > k\} = 1 - P\{K \leq k\} = (1-\beta)^k$, as before.

$K$ is called a geometric random variable with parameter $\beta$.
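(As a quick numerical check of this formula, here is a short Python sketch, with NumPy assumed, showing that summing the pmf up to $k$ and subtracting from $1$ reproduces $(1-\beta)^k$.)

```python
import numpy as np

# Verify P(K > k) = (1 - beta)^k by summing the pmf P(K = i) = (1 - beta)^(i-1) * beta
beta, k = 0.3, 7
pmf = (1 - beta) ** np.arange(k) * beta   # P(K = 1), ..., P(K = k)
print(1 - pmf.sum())                      # P(K > k) from the pmf
print((1 - beta) ** k)                    # closed form
```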


For $n \geq 2$, let $K_n$ denote a geometric random variable with parameter $\frac 1n$ and define $X_n = \frac 1n K_n$. Note that we can think of $X_n$ as $\beta K$ where $K$ is a geometric random variable with parameter $\beta = \frac 1n$. For any fixed positive real number $x$, $$P\{X_n > x\} = P\left\{\frac 1n K_n > x\right\} = P\{K_n > nx\} = \left(1 - \frac 1n\right)^{\lfloor nx\rfloor}\approx \left(1 - \frac 1n\right)^{nx},$$ and therefore $$\lim_{n\to\infty} P\{X_n > x\} = \lim_{n\to\infty}\bigl(1 - F_{X_n}(x)\bigr)= e^{-x}.$$ The sequence of random variables $X_n$ thus converges in distribution to an exponential random variable with parameter $1$.
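To see the convergence empirically, here is a short simulation sketch (Python with NumPy assumed; the seed and sample size are arbitrary) comparing the simulated survival probability of $X_n$ with the exact value $(1 - \frac 1n)^{\lfloor nx\rfloor}$ and the limit $e^{-x}$.

```python
import numpy as np

rng = np.random.default_rng(0)

x = 1.0
for n in [10, 100, 1000, 10_000]:
    k_n = rng.geometric(p=1.0 / n, size=200_000)  # K_n on {1, 2, 3, ...}
    print(n,
          (k_n / n > x).mean(),                   # simulated P(X_n > x)
          (1 - 1 / n) ** np.floor(n * x),         # exact P(K_n > n x)
          np.exp(-x))                             # limiting value e^{-x}
```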

Dilip Sarwate