
Let's say we have random variables $\mathbf{X}_1, \mathbf{X}_2, \dots, \mathbf{X}_n$ with $P(\mathbf{X}_i\in [a, b])=1$ for each $i$, and let $\mathbf{S}_n = \mathbf{X}_1 + \mathbf{X}_2 + \dots + \mathbf{X}_n$.

If $\mathbf{X}_1, \mathbf{X}_2, \dots, \mathbf{X}_n$ are independent, I believe we have: $$ \mathbb{E}[\mathbf{S}_n - \mathbb{E}[\mathbf{S}_n]] = \sum_{i=1}^n\mathbb{E}[\mathbf{X}_i-\mathbb{E}[\mathbf{X}_i]] $$ because I saw this as one step in the proof of Hoeffding's inequality. For example, see here.

Can anyone help me understand why this equation holds?

And what if $\mathbf{X}_1, \mathbf{X}_2, \dots, \mathbf{X}_n$ are not independent? Say only the first $m$ of these $n$ variables are independent; can we then get something like: $$ \sum_{i=1}^n\mathbb{E}[\mathbf{X}_i-\mathbb{E}[\mathbf{X}_i]] \leq \mathbb{E}[\mathbf{S}_n - \mathbb{E}[\mathbf{S}_n]] \leq \sum_{i=1}^m\mathbb{E}[\mathbf{X}_i-\mathbb{E}[\mathbf{X}_i]] $$


1 Answer


Independent or not, the equality always holds, and both sides equal $0$. By linearity of expectation (and since $E[S_n]$ is just a constant):

$$E[S_n-E[S_n]]=E[S_n]-E[E[S_n]]=E[S_n]-E[S_n]=0$$

Similarly, $$\sum_{i=1}^n E[X_i-E[X_i]]=\sum_{i=1}^n (E[X_i] - E[X_i])=0$$
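As a quick sanity check, here is a minimal simulation sketch (in Python with NumPy; the distribution, $n$, and sample size are arbitrary choices) estimating both quantities with strongly dependent $X_i$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Deliberately dependent variables: every X_i is the same draw X_1,
# so independence fails completely, yet both quantities are still ~0.
n, reps = 5, 100_000
x1 = rng.uniform(0.0, 1.0, size=reps)   # X_1 ~ Uniform[0, 1]
X = np.tile(x1, (n, 1))                 # X_i = X_1 for i = 1, ..., n
S = X.sum(axis=0)                       # S_n = X_1 + ... + X_n

lhs = (S - S.mean()).mean()                               # estimates E[S_n - E[S_n]]
rhs = sum((X[i] - X[i].mean()).mean() for i in range(n))  # estimates sum of E[X_i - E[X_i]]

print(lhs, rhs)  # both ~0 (up to floating-point error)
```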

gunes
  • This is actually what I suspected, but then I don't understand why Hoeffding's inequality applies only to independent variables. This equation seems to be the only place where the proof of Hoeffding's inequality deals with independence (expanding $\mathbf{S}_n$). –  May 21 '20 at 22:03
  • I don't see your expression in the proof on Wikipedia. The expected values there contain an exponential term, and $E[e^X]\neq e^{E[X]}$ in general. – gunes May 21 '20 at 22:36
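For reference, the place where independence actually enters Hoeffding's proof is not the expansion above but the factorization of the moment generating function: $$E\left[e^{s(S_n-E[S_n])}\right]=E\left[\prod_{i=1}^n e^{s(X_i-E[X_i])}\right]=\prod_{i=1}^n E\left[e^{s(X_i-E[X_i])}\right],$$ where the last equality requires the $X_i$ to be independent. A quick numerical sketch of gunes's point that $E[e^X]\neq e^{E[X]}$ (Python with NumPy; the Bernoulli example is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.binomial(1, 0.5, size=1_000_000)  # X ~ Bernoulli(0.5)

print(np.exp(x).mean())   # E[e^X] = (1 + e)/2 ~ 1.859
print(np.exp(x.mean()))   # e^{E[X]} = e^{0.5} ~ 1.649
```

The gap between the two printed values is Jensen's inequality at work: $E[e^X] \geq e^{E[X]}$, with equality only for degenerate $X$.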