
1. For a sequence of random variables $V_n$, and a deterministic sequence $b_n$, does

$$ \frac{V_n}{b_n} \overset{a.s.}{\to} c \quad \left( \implies \frac{V_n}{b_n} \overset{P}{\to} c \right) $$

for some deterministic constant $c$ imply in turn that

$$ \lim_{n \to \infty}\frac{\mathbb{E}V_n}{b_n} = c \,? $$

2. Does $$ \lim_{n \to \infty} \frac{\mathbb{E} V_n}{b_n} = c \quad \implies \quad \mathbb{E} V_n = b_n + \Theta(c) \,? $$

Note: The specific $V_n$ I have in mind is $V_n := \max_{1 \le i \le n} X_i$ for $X_i$ i.i.d. $\mathscr{N}(0,1)$, and the specific $b_n$ I have in mind is $b_n = \sqrt{2 \log n}$. But that doesn't seem relevant to answering the question.
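
As a sanity check on this specific example, here is a minimal Monte Carlo sketch (my own illustration, assuming NumPy; the replication count is arbitrary) that estimates $\mathbb{E}V_n / b_n$ for a few values of $n$:

```python
# Monte Carlo sketch: V_n = max of n i.i.d. N(0,1) draws, b_n = sqrt(2 log n).
# Estimates E[V_n] / b_n for a few values of n; purely illustrative, not a proof.
import numpy as np

rng = np.random.default_rng(0)
reps = 1000  # Monte Carlo replications per n (arbitrary choice)

for n in [10, 100, 1_000, 10_000]:
    samples = rng.standard_normal((reps, n))
    v_n = samples.max(axis=1)          # one realisation of V_n per replication
    b_n = np.sqrt(2 * np.log(n))
    print(n, v_n.mean() / b_n)         # Monte Carlo estimate of E[V_n] / b_n
```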

Chill2Macht

2 Answers


The answer to part 1 is no.

Note that you can just write $W_n = \frac{V_n}{b_n}$, and you are now asking whether a.s. convergence to a constant implies convergence of the expectations (a consequence of $L^1$ convergence). A classical counterexample is to take $U \sim U(0, 1)$ and $$W_n = \begin{cases} n & \text{if } U < \frac{1}{n} \\ 0 & \text{otherwise,} \end{cases}$$ so that $W_n \xrightarrow{a.s.} 0$ but $\mathbb{E}[W_n] = n \cdot \mathbb{P}\left(U < \tfrac{1}{n}\right) = 1$ for every $n$.
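
For a quick numerical illustration of this counterexample (a minimal sketch, assuming NumPy; the path count is arbitrary), one can simulate many sample paths, each sharing a single draw of $U$ across all $n$, and watch the paths collapse to $0$ while the sample mean stays near $1$:

```python
# Sketch of the counterexample W_n = n * 1{U < 1/n}, with one U ~ Uniform(0,1)
# per sample path: W_n -> 0 a.s., yet E[W_n] = n * P(U < 1/n) = 1 for every n.
import numpy as np

rng = np.random.default_rng(1)
paths = 100_000                     # number of independent sample paths
U = rng.uniform(size=paths)         # one U per path, reused for every n

for n in [10, 100, 1_000]:
    W_n = n * (U < 1 / n)           # W_n evaluated on every path
    frac_zero = np.mean(W_n == 0)   # fraction of paths already equal to 0
    print(n, W_n.mean(), frac_zero) # mean stays near 1; frac_zero tends to 1
```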

Robin Ryder
  • Thank you for the sanity check; I wanted it to be true but it felt fishy to me for some reason, and now I know why. Does the statement possibly hold if we have some concentration around the mean? I suppose it would depend on the strength of the concentration inequality. Is subGaussian enough? (Sorry, I know this should be a separate question.) – Chill2Macht Sep 22 '18 at 22:28
  • Off the top of my head, I can't come up with a good sufficient condition to make it true. It's definitely worth asking a separate question though. – Robin Ryder Sep 23 '18 at 07:35

(Community wiki)

The answer to 2 is also no, I think. The fact that $\mathbb{E}V_n$ is the expectation of a random variable is not relevant, since given any deterministic sequence $a_n$ one can find a sequence of random variables $V_n$ such that $a_n = \mathbb{E} V_n$ (e.g. take $V_n$ to be the constant $a_n$). So the question really reduces to asking whether:

$$ \lim_{n \to \infty} \frac{a_n}{ b_n} = c \quad \implies a_n = b_n + \Theta(c) \,.$$

I claim (I think this holds by definition) that $$ a_n = b_n + \Theta(c) \quad \iff \quad a_n - b_n = \Theta(c) \,. $$

So this is clearly false when $c = 0$. Take $a_n = \frac{1}{n^2}$ and $b_n = \frac{1}{n}$; then $\frac{a_n}{b_n} = \frac{1}{n} \to 0 = c$, yet for no $n_0 \in \mathbb{N}$ (and no choice of constants $k_1, k_2$) do we have $0 = k_2 \cdot 0 \le \frac{1}{n^2} - \frac{1}{n} \le k_1 \cdot 0 = 0$ for all $n \ge n_0$, since $\frac{1}{n^2} - \frac{1}{n} \neq 0$ for every $n \ge 2$.

Chill2Macht