Let's say we have random variables $\mathbf{X}_1, \mathbf{X}_2, \dots, \mathbf{X}_n$ with $P(\mathbf{X}_i\in [a, b])=1$ for each $i$, and let $\mathbf{S}_n = \mathbf{X}_1 + \mathbf{X}_2 + \dots + \mathbf{X}_n$.
If $\mathbf{X}_1, \mathbf{X}_2, \dots, \mathbf{X}_n$ are independent, I believe we have: $$ \mathbb{E}[\mathbf{S}_n - \mathbb{E}[\mathbf{S}_n]] = \sum_{i=1}^n\mathbb{E}[\mathbf{X}_i-\mathbb{E}[\mathbf{X}_i]] $$ because I saw this as one step in the proof of Hoeffding's inequality (see here, for example).
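To convince myself numerically, I sketched a quick Monte Carlo check of the two sides (using numpy, with uniform variables on $[0,1]$ as an assumed example; the choice of distribution is mine, not from the proof):

```python
import numpy as np

rng = np.random.default_rng(0)

# Rows are samples, columns are X_1, ..., X_n; each X_i is supported on [0, 1].
n, trials = 5, 200_000
X = rng.uniform(0.0, 1.0, size=(trials, n))

S_n = X.sum(axis=1)

# Left side: E[S_n - E[S_n]], estimated by Monte Carlo.
lhs = np.mean(S_n - S_n.mean())

# Right side: sum over i of E[X_i - E[X_i]].
rhs = sum(np.mean(X[:, i] - X[:, i].mean()) for i in range(n))

print(lhs, rhs)  # both come out ~0 up to floating-point error
```

In my runs both sides agree (and are essentially zero), which matches the equation above, but I would like to understand the reason rather than just the simulation.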
Can anyone help me understand why this equation holds?
And what if $\mathbf{X}_1, \mathbf{X}_2, \dots, \mathbf{X}_n$ are not independent? Say only the first $m$ of these $n$ variables are independent; can we then get something like: $$ \sum_{i=1}^n\mathbb{E}[\mathbf{X}_i-\mathbb{E}[\mathbf{X}_i]] \leq \mathbb{E}[\mathbf{S}_n - \mathbb{E}[\mathbf{S}_n]] \leq \sum_{i=1}^m\mathbb{E}[\mathbf{X}_i-\mathbb{E}[\mathbf{X}_i]] $$
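I also tried simulating the dependent case to see what the three quantities look like (again a sketch under my own assumptions: I make the last $n-m$ variables exact copies of $\mathbf{X}_1$, which is just one convenient way to break independence):

```python
import numpy as np

rng = np.random.default_rng(1)
trials, m, n = 200_000, 3, 6

# First m columns independent; remaining n - m columns are copies of X_1,
# so the full collection X_1, ..., X_n is certainly not independent.
X_indep = rng.uniform(0.0, 1.0, size=(trials, m))
X_dep = np.repeat(X_indep[:, :1], n - m, axis=1)
X = np.hstack([X_indep, X_dep])

S_n = X.sum(axis=1)

left = sum(np.mean(X[:, i] - X[:, i].mean()) for i in range(n))   # sum over all n
middle = np.mean(S_n - S_n.mean())                                # E[S_n - E[S_n]]
right = sum(np.mean(X[:, i] - X[:, i].mean()) for i in range(m))  # sum over first m

print(left, middle, right)  # in this simulation, all three come out ~0
```

In this particular setup all three quantities come out essentially zero, so the proposed sandwich holds here, but I don't know whether that is an artifact of my construction or something that holds in general.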