5

Suppose $X \sim N(\mu, \sigma^2)$, is there a general way to describe the distribution (i.e. mean, variance, ...) of $Y=X^n$ for some constant $n$?

(assume $\mu \in (0,1)$ and somehow $X \in (0,1)$ most of the times if this makes it any easier)

Furthermore, how does $n$ affect the probability that $Y$ falls within a fixed interval, say $\pm\delta$, around its mean?

My intuition is that as $n$ increases, the distribution should become more concentrated and thus have smaller variance, and the probability should increase overall, since the interval then spans a larger portion of the PDF. Is that right?
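
A quick Monte Carlo sketch of this intuition (the parameter values $\mu=0.5$, $\sigma=0.1$, $\delta=0.05$ are illustrative choices, not part of the question):

```python
import numpy as np

# Illustrative parameters: X stays inside (0,1) with high probability.
mu, sigma, delta = 0.5, 0.1, 0.05
rng = np.random.default_rng(0)
x = rng.normal(mu, sigma, 1_000_000)

results = {}
for n in (1, 2, 5, 10):
    y = x ** n
    # (variance of Y, probability that Y lies within +/- delta of its mean)
    results[n] = (y.var(), np.mean(np.abs(y - y.mean()) <= delta))
    print(f"n={n:2d}  var={results[n][0]:.6f}  P(|Y - E[Y]| <= delta)={results[n][1]:.3f}")
```

For these particular parameters the variance indeed shrinks and the interval probability grows with $n$, but as the answers below explain, this depends on how much mass of $X$ lies outside $[-1,1]$.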

Grumpy Civet
  • Not sure what this means "overall the probability should increase ... ". – passerby51 Dec 21 '20 at 07:45
  • 1
    If $\mu=0$ then for even $n$ this can be described by a [generalized gamma distribution](https://en.m.wikipedia.org/wiki/Generalized_gamma_distribution). You could use rules for transforming variables to get the PDF and compare it with formulas for the mean and variance of that distribution to get to know the behaviour at larger $n$. – Sextus Empiricus Dec 21 '20 at 07:49
  • @passerby51 sorry for the confusion! Here the probability is just the probability that $Y$ falls within a fixed interval – Grumpy Civet Dec 21 '20 at 08:01
  • You could describe the moments of $X^n$ in terms of the moments of $X$. The mean of $X^n$ will be the $n$-th moment of $X$, the 2nd moment of $X^n$ will be the $2n$-th moment of $X$, and the variance will be the 2nd moment minus the square of the mean. – Sextus Empiricus Dec 21 '20 at 08:06
  • 1
    @GrumpyCivet, thanks. It depends on whether the interval contains zero or not. This is an interesting problem. Intuitively, and informally, in the limit, the distribution will be a mixture of a point mass at infinity (all the mass outside [-1,1] get mapped there) and a point mass at zero (all the mass in (-1,1) gets mapped there). I don't know how to formally describe it and it seems that formally the distribution does not converge. – passerby51 Dec 21 '20 at 08:08
  • @passerby51 thanks so much, I'm still digesting your answer. Would it help if we only look at the part within the open interval (0,1)? And the fixed interval should always be around the mean of $Y$. – Grumpy Civet Dec 21 '20 at 08:20
  • @GrumpyCivet, not sure if I understand what you mean. You can decompose $Y$ into two pieces. I will edit my response to add more details. The mean of $Y$ doesn't seem to be that central to the question. If you look at a fixed interval around the mean that does not include zero, then the probability of falling in that interval decreases. (The probability of falling within any interval not containing zero decreases.) – passerby51 Dec 21 '20 at 08:37
  • @passerby51 thanks! I meant to restrict with the assumption that $\mu$ is within (0,1) and restrict $X$ to (0,1) as well. Then wouldn't the probability of falling within such an interval increase in $n$? – Grumpy Civet Dec 21 '20 at 08:44
  • @GrumpyCivet, please see my updated response for how restricting $X$ can be used. Restricting $\mu$ doesn't seem to matter much. The same happens more or less even if $|\mu| > 1$. – passerby51 Dec 21 '20 at 09:01
  • 1
    Welcome to CV, Grumpy Civet. I like this question. – Alexis Dec 21 '20 at 18:35
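
The moment identity suggested in the comments (the mean of $X^n$ is the $n$-th moment of $X$, its variance is $\operatorname{E}[X^{2n}] - (\operatorname{E}[X^n])^2$) can be evaluated numerically, e.g. with Gauss–Hermite quadrature; `normal_moment` below is a hypothetical helper and the parameter values are illustrative:

```python
import numpy as np

def normal_moment(k, mu, sigma, deg=60):
    """E[X^k] for X ~ N(mu, sigma^2), via Gauss-Hermite quadrature:
    E[g(X)] = (1/sqrt(pi)) * sum_i w_i * g(mu + sqrt(2)*sigma*t_i)."""
    t, w = np.polynomial.hermite.hermgauss(deg)
    return float((w * (mu + np.sqrt(2.0) * sigma * t) ** k).sum() / np.sqrt(np.pi))

mu, sigma, n = 0.5, 0.1, 3
mean_yn = normal_moment(n, mu, sigma)                     # E[X^n]
var_yn = normal_moment(2 * n, mu, sigma) - mean_yn ** 2   # E[X^{2n}] - (E[X^n])^2
print(mean_yn, var_yn)
```

Since the integrand is a polynomial, the quadrature is exact (to machine precision) whenever `deg` exceeds $n$, so this gives the exact mean and variance of $Y = X^n$ without simulation.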

2 Answers

6

Let $Z_n := Z := g(X) := n X^n$. The relation between $Z$ and $Y$ is quite simple (one is a scaled version of the other), so let's figure out the distribution of $Z$. Assume for simplicity that $n$ is odd, so that $g$ is invertible. Then $g^{-1}(z) = (z/n)^{1/n}$, hence $|{g^{-1}}'(z)| = \frac{1}{n^2} (|z|/n)^{\frac1n-1}$. It follows that the density of $Z$ is $$ f_{Z_n}(z) = f_X(g^{-1}(z))\, |{g^{-1}}'(z)| = \frac{n^{-1-\frac1n}}{\sigma\sqrt{2\pi}}\, |z|^{\frac1n-1}\, e^{-((z/n)^{1/n} - \mu)^2/(2\sigma^2)}. $$ Since $n^{-1/n} \to 1$ and $|z|^{1/n} \to 1$ for all $z \neq 0$ (so that $(z/n)^{1/n} \to \operatorname{sgn}(z)$), the rescaled density converges pointwise: $$ n\, f_{Z_n}(z) \to \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(\operatorname{sgn}(z)-\mu)^2/(2\sigma^2)}\, |z|^{-1}, \quad z \neq 0, $$ which might suggest a usable approximation for large $n$ (EDIT: it turns out it is not in general! See the edit.). Note, however, that this limit is not a density, since $|z|^{-1}$ does not integrate to anything finite. (Formally, one might be able to show that the distribution of $Z_n$ converges to a point mass at zero, in "some" sense. However, there is some mass that escapes to infinity, basically everything coming from $|X| > 1$. So this needs some care and the result might not be true. Informally, the limiting distribution is a mixture of a point mass at $0$ and two point masses at $\pm\infty$ for $n$ odd.)
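
As a sanity check on the change-of-variables computation, the density of $Z = nX^n$ can be compared against a histogram, with the Jacobian $|{g^{-1}}'(z)|$ computed from first principles (parameter values are illustrative):

```python
import numpy as np

# Check the change-of-variables density for Z = n X^n, n odd.
mu, sigma, n = 0.5, 0.2, 3
rng = np.random.default_rng(0)
z = n * rng.normal(mu, sigma, 1_000_000) ** n

def f_Z(zv):
    x = np.sign(zv) * (np.abs(zv) / n) ** (1 / n)   # g^{-1}(z) for odd n
    jac = (np.abs(zv) / n) ** (1 / n - 1) / n**2    # |(g^{-1})'(z)|, from d/dz (z/n)^{1/n}
    fx = np.exp(-(x - mu) ** 2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    return fx * jac

# Empirical density on a region bounded away from the singularity at 0;
# normalize by the full sample size, not just the samples inside the range.
counts, edges = np.histogram(z, bins=50, range=(0.1, 1.5))
empirical = counts / (len(z) * (edges[1] - edges[0]))
centers = (edges[:-1] + edges[1:]) / 2
print(np.max(np.abs(empirical - f_Z(centers))))
```

The histogram and the formula agree closely away from $z=0$, where the $|z|^{1/n-1}$ factor blows up.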


EDIT: Assume $n$ is odd. To clarify the situation, let $X_1 := X\cdot 1_{\{|X| \le 1\}}$ and $X_2 := X\cdot 1_{\{|X| > 1\}}$, where $1_{\{|X| \le 1\}}$ is 1 if $|X| \le 1$ and zero otherwise, and similarly for the other indicator. We have $X = X_1 + X_2$ with $|X_1| \le 1$ almost surely, while $X_2$ is either zero or satisfies $|X_2| > 1$. Since $X_1 X_2 = 0$, the cross terms in the binomial expansion vanish, so $$ Y_n := X^n = (X_1 + X_2)^n = X_1^n + X_2^n, $$ with $|X_1^n| \le 1$ a.s. and $X_2^n$ either zero or of magnitude greater than 1. Assuming that $X$ has a continuous distribution (so that $\mathbb P(|X|=1) = 0$), it is not hard to see that $X_1^n$ converges in distribution to a point mass at 0. However, $X_2^n$ does not converge in distribution. In fact, it is straightforward to show that $$ \mathbb P(X_2^n \in (0,t)) \to 0, \quad \text{as}\; n\to \infty$$ for any finite $t > 0$. This is what can be informally described as "the mass in the distribution of $X_2^n$ escaping to infinity".

Since we also have $\mathbb P( X_1^n \in (s,\infty)) \to 0$ for any $s > 0$, it follows that $\mathbb P(Y_n \in (s,t)) \to 0$ for any $t > s > 0$. The only intervals that will have positive mass in the limit are those that contain 0. That is, if $I$ is an interval with $0 \in I$, then $$ P(Y_n \in I) \to P(|X| \le 1), \quad \text{as}\; n \to \infty. $$ Otherwise (that is, if $0 \notin I$), we have $\mathbb P(Y_n \in I) \to 0$.
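
These limits show up readily in simulation; a sketch (the values of $\mu$, $\sigma$ and the two intervals are illustrative):

```python
import numpy as np

# Probability of Y_n = X^n landing in an interval containing 0 vs. one that doesn't.
mu, sigma = 0.5, 1.0
rng = np.random.default_rng(0)
x = rng.normal(mu, sigma, 1_000_000)
p_limit = np.mean(np.abs(x) <= 1)   # the predicted limit P(|X| <= 1)

for n in (1, 11, 51):               # odd n, as in the answer
    y = x ** n
    in_I0 = np.mean((y > -0.1) & (y < 0.1))  # interval I with 0 in I
    in_I1 = np.mean((y > 0.5) & (y < 2.0))   # interval I with 0 not in I
    print(f"n={n:2d}  P(Y_n in (-0.1,0.1))={in_I0:.3f}  P(Y_n in (0.5,2))={in_I1:.3f}")

print(f"P(|X| <= 1) = {p_limit:.3f}")
```

As $n$ grows, the probability of the interval around 0 approaches $P(|X| \le 1)$, while the probability of the interval bounded away from 0 drains to zero.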

passerby51
  • So everything of $\vert X \vert$ below 1 will approach 0 and everything of $\vert X \vert$ above 1 will approach infinity (but because this part that approaches infinity spreads out the density goes to zero). – Sextus Empiricus Dec 21 '20 at 08:15
  • @SextusEmpiricus, that should be what happens. The mass in $[-1,1]$ gets pulled towards zero and the mass outside it gets pulled towards infinity. – passerby51 Dec 21 '20 at 08:18
  • 1
    I think that the problem with a formal statement is that you describe the convergence of $f_{Z_n}$. But the cumulative distribution function $F_{Z_n}$ does not converge (not uniformly), so there is no convergence in distribution. – Sextus Empiricus Dec 21 '20 at 08:25
  • @SextusEmpiricus, yes, that is the case. Convergence in distribution seems to fail. Still it would be interesting to somehow describe an approximation for large $n$ (if at all possible). The mass that escapes to infinity causes the tail of the distribution of $Z_n$ to get heavier and heavier as $n$ increases. – passerby51 Dec 21 '20 at 08:31
  • 2
    The pointwise limit of the densities is a truly terrible approximation. To see why, consider that the chance $|X|$ exceeds $1$ is positive (it's equal to $2\Phi(-1)$) and this is the limiting chance that $|X|^n$ exceeds $T$ for *any* positive threshold $T.$ That should make it obvious that much of the support of $X^n$ travels out towards infinity and that the sequence of variances diverges. – whuber Dec 21 '20 at 12:32
  • @whuber, thanks. I agree. I edited the answer to point that the approximation by the limiting density is problematic. – passerby51 Dec 21 '20 at 16:14
  • 1
    @whuber, I should add that in some cases it could be a reasonable approx., e.g., if much of the mass of X is concentrated in [-1,1], for example when $X \sim N(0.5, 0.01)$. This seems to be what OP has in mind in their problem statement. Although, there is going to be always a constant approx. error in the limit. – passerby51 Dec 21 '20 at 16:22
0

Far from a general answer, but there is a formula for $\text{E}[X^n]$ if $\mu=0$. Then we have $$ \text{E}[X^n] = \begin{cases}0\,, & n \text{ odd}\\ \sigma^n(n-1)(n-3)\cdot\ldots\cdot 1 = \sigma^n (n-1)!!\,, & n \text{ even}\end{cases} $$
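
A numerical check of this formula, with the product $(n-1)(n-3)\cdots 1$ implemented as a small double-factorial helper ($\sigma = 0.7$ is an arbitrary choice):

```python
import numpy as np

def double_factorial(k):
    """(k)!! = k * (k-2) * ... down to 1 or 2; defined as 1 for k <= 0."""
    return 1 if k <= 0 else k * double_factorial(k - 2)

sigma = 0.7
rng = np.random.default_rng(0)
x = rng.normal(0.0, sigma, 5_000_000)

for n in (2, 4, 6):
    formula = sigma**n * double_factorial(n - 1)   # sigma^n (n-1)!!
    print(f"n={n}: formula={formula:.5f}  monte carlo={np.mean(x**n):.5f}")
```

The Monte Carlo estimates match the closed form, and odd powers average to roughly zero, as the case split predicts.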

StijnDeVuyst