I am looking for probability inequalities for sums of unbounded random variables, and I would really appreciate any thoughts.
My problem is to find an exponential upper bound on the probability that a sum of unbounded i.i.d. random variables, each the product of two i.i.d. Gaussians, exceeds a given threshold, i.e., $\mathrm{Pr}[ X \geq \epsilon\sigma^2 N] \leq \exp(?)$, where $X = \sum_{i=1}^{N} w_iv_i$ and the $w_i$ and $v_i$ are drawn i.i.d. from $\mathcal{N}(0, \sigma^2)$.
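To make the setup concrete, here is a small Monte Carlo sketch of the tail probability in question. The specific values of `sigma`, `eps`, `N`, and the trial count are illustrative choices, not from the question:

```python
import numpy as np

# Monte Carlo sketch of Pr[X >= eps * sigma^2 * N], where
# X = sum_i w_i v_i with w_i, v_i i.i.d. N(0, sigma^2).
# sigma, eps, N, n_trials below are illustrative, not from the question.
rng = np.random.default_rng(0)
sigma, eps, N, n_trials = 1.0, 0.5, 50, 200_000

w = rng.normal(0.0, sigma, size=(n_trials, N))
v = rng.normal(0.0, sigma, size=(n_trials, N))
X = (w * v).sum(axis=1)                      # n_trials samples of X

p_hat = np.mean(X >= eps * sigma**2 * N)     # empirical tail probability
print(p_hat)
```

For these parameters the threshold sits several standard deviations out, so the empirical probability is small but still estimable with this many trials.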
I tried the Chernoff bound via the moment generating function (MGF); the resulting bound is:
$\begin{eqnarray} \mathrm{Pr}[ X \geq \epsilon\sigma^2 N] &\leq& \min\limits_{s > 0} \exp(-s\epsilon\sigma^2 N)g_X(s) \\ &=& \exp\left(-\frac{N}{2}\left(\sqrt{1+4\epsilon^2} -1 + \log(\sqrt{1+4\epsilon^2}-1) - \log(2\epsilon^2)\right)\right) \end{eqnarray}$
where $g_X(s) = \left(\frac{1}{1-\sigma^4 s^2}\right)^{\frac{N}{2}}$ is the MGF of $X$, valid for $|s| < 1/\sigma^2$. However, this bound is not very tight. The main difficulty in my problem is that the random variables are unbounded, so unfortunately I cannot use Hoeffding's inequality.
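As a sanity check on the derivation above, the following sketch evaluates the closed-form Chernoff bound, compares it against a direct grid minimization of $\exp(-s\epsilon\sigma^2 N)\,g_X(s)$ over $s \in (0, 1/\sigma^2)$, and confirms it dominates a Monte Carlo estimate of the tail. Again, `sigma`, `eps`, `N`, and the trial count are illustrative values:

```python
import numpy as np

# Check the closed-form Chernoff bound
#   exp(-(N/2)*(sqrt(1+4 eps^2) - 1 + log(sqrt(1+4 eps^2) - 1) - log(2 eps^2)))
# against (a) a grid minimization of exp(-s*eps*sigma^2*N) * g_X(s) and
# (b) a Monte Carlo estimate of Pr[X >= eps*sigma^2*N].
# sigma, eps, N and the trial count are illustrative, not from the question.
sigma, eps, N = 1.0, 0.5, 50

# (a) grid minimization of the Chernoff objective over s in (0, 1/sigma^2)
s = np.linspace(1e-4, 0.999 / sigma**2, 100_000)
g = (1.0 - sigma**4 * s**2) ** (-N / 2.0)          # MGF of X
numeric = np.min(np.exp(-s * eps * sigma**2 * N) * g)

# closed-form value quoted above
u = np.sqrt(1.0 + 4.0 * eps**2)
closed = np.exp(-(N / 2.0) * (u - 1.0 + np.log(u - 1.0) - np.log(2.0 * eps**2)))

# (b) Monte Carlo estimate of the tail probability
rng = np.random.default_rng(1)
w = rng.normal(0.0, sigma, size=(200_000, N))
v = rng.normal(0.0, sigma, size=(200_000, N))
p_hat = np.mean((w * v).sum(axis=1) >= eps * sigma**2 * N)

print(numeric, closed, p_hat)
```

In this run the grid minimum and the closed form agree, and the empirical tail probability falls well below the bound, which is consistent with the bound being valid but loose by a polynomial factor.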
I would be very happy if you could help me find a tight exponential bound.
There are results in the communications systems literature showing that, in some cases, Chernoff bounds are exponentially tight; if those results are applicable here, then the search for a tighter exponential bound than Chernoff's would be futile. – Dilip Sarwate Oct 04 '11 at 18:57