
Take an expectation of the form $E(f(X))$ for some univariate random variable $X$ and an entire function $f(\cdot)$ (i.e., one whose Taylor series has the whole real line as its interval of convergence).

I have a moment generating function for $X$ and hence can easily calculate its integer moments. Take a Taylor series around $\mu \equiv E(X)$ and then apply the expectation term by term to obtain a series in the central moments (the first-order term drops out because $E(x - \mu) = 0$): $$ E(f(x)) = E\left(f(\mu) + f'(\mu)(x - \mu) + f''(\mu)\frac{(x - \mu)^2}{2!} +\ldots\right) $$ $$ =f(\mu) + \sum_{n=2}^{\infty} \frac{f^{(n)}(\mu)}{n!}E\left[(x - \mu)^n\right] $$ Truncating this series gives the approximation $$ E_N(f(x)) = f(\mu) + \sum_{n=2}^{N} \frac{f^{(n)}(\mu)}{n!}E\left[(x - \mu)^n\right] $$


My question is: under what conditions on the random variable (and any additional conditions on $f(\cdot)$) does the approximation converge to the expectation as I add terms, i.e. $\lim\limits_{N\to\infty}E_N(f(x)) = E(f(x))$?

Since it does not appear to converge in my case (a Poisson random variable and $f(x) = x^{\alpha}$), are there any other tricks for approximating expectations from integer moments when these conditions fail?
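For concreteness, here is a minimal numerical sketch of the construction above and of the failure. The choices $\lambda = 1$, $\alpha = 1/2$, and the pmf cutoff `K` are purely illustrative assumptions, not part of the original problem:

```python
import math

lam, alpha = 1.0, 0.5   # illustrative: X ~ Poisson(lam), f(x) = x**alpha
K = 100                 # cutoff for summing the pmf; Poisson(1) mass beyond this is negligible

def pmf(k):
    return math.exp(-lam) * lam**k / math.factorial(k)

# "Exact" expectation E[X^alpha] by direct summation over the pmf.
exact = sum(pmf(k) * k**alpha for k in range(K))

def central_moment(n):
    return sum(pmf(k) * (k - lam)**n for k in range(K))

def f_deriv(n):
    # n-th derivative of x^alpha at x = lam: alpha*(alpha-1)*...*(alpha-n+1) * lam^(alpha-n)
    c = 1.0
    for i in range(n):
        c *= alpha - i
    return c * lam**(alpha - n)

def E_N(N):
    # truncated central-moment expansion E_N(f(x)) from the question
    return lam**alpha + sum(
        f_deriv(n) / math.factorial(n) * central_moment(n) for n in range(2, N + 1)
    )

for N in (2, 4, 8, 16, 30):
    print(N, E_N(N), exact)
```

With these choices the low-order truncations land near the exact value, but the partial sums eventually blow up: the Taylor series of $x^{1/2}$ about $\mu = 1$ has radius of convergence 1, while the Poisson puts mass on every integer $x \ge 2$ outside that radius.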

jlperla
  • see here: http://stats.stackexchange.com/questions/70490/taking-the-expectation-of-taylor-series-especially-the-remainder – Jonathan Jul 10 '15 at 01:34
  • @Jonathan Thank you. See my edits now that it has become clearer. Very helpful, though I couldn't quite crack it. From this, it appears that a sufficient condition for this to work is that my random variable is strongly concentrated? Though I am having trouble cracking exactly how to use Hoeffding's Inequality, etc. to compare to these notes. – jlperla Jul 10 '15 at 04:10
  • What do you mean "a poisson random variable and $f(x)=x^α$"? Is that one case or two, and what is the pdf? – Carl Feb 22 '18 at 04:54
  • @Carl This is a few years back, but if I remember, the variable was $x \sim Poisson(\lambda)$ for some $\lambda$ with PDF from https://en.wikipedia.org/wiki/Poisson_distribution. That $f(x)$ was the function I was taking the expectation over. i.e. $E(f(x))$ – jlperla Feb 22 '18 at 04:58
  • Not sure what you are asking. How about that the higher moments $m_k$ of the [Poisson distribution about the origin](https://en.wikipedia.org/wiki/Poisson_distribution#Higher_moments) are Touchard polynomials in $\lambda$: $$m_k = \sum_{i=0}^k \lambda^i \left\{\begin{matrix} k \\ i \end{matrix}\right\},$$where the {braces} denote Stirling numbers of the second kind? – Carl Feb 22 '18 at 05:16
  • Thank you. I think the poisson was just an example of one of the random variables where I ran into the problem – jlperla Feb 22 '18 at 05:18
  • I understand. Finding an appropriate series expansion will often not be the first one that is tried; it's a bit of an art form. – Carl Feb 22 '18 at 05:30
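Carl's Touchard-polynomial formula for the raw moments is easy to check numerically. A small sketch (the rate $\lambda = 2$ and the moment range are arbitrary choices for the check):

```python
import math
from functools import lru_cache

lam = 2.0  # arbitrary rate for the check

@lru_cache(maxsize=None)
def stirling2(k, i):
    # Stirling numbers of the second kind via the standard recurrence
    if k == i:
        return 1
    if i == 0 or i > k:
        return 0
    return i * stirling2(k - 1, i) + stirling2(k - 1, i - 1)

def touchard_moment(k):
    # m_k = sum_i lambda^i * S(k, i)
    return sum(lam**i * stirling2(k, i) for i in range(k + 1))

def direct_moment(k, K=100):
    # E[X^k] by summing the Poisson pmf directly (tail beyond K is negligible here)
    return sum(math.exp(-lam) * lam**j / math.factorial(j) * j**k for j in range(K))

for k in range(1, 6):
    print(k, touchard_moment(k), direct_moment(k))
```

The first two moments reduce to the familiar $m_1 = \lambda$ and $m_2 = \lambda + \lambda^2$.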

2 Answers


By your assumption that $f$ is real-analytic, $$ y_n = f(\mu) + f'(\mu)(x - \mu) + f''(\mu)\frac{(x - \mu)^2}{2!} + \ldots + f^{(n)}(\mu)\frac{(x - \mu)^n}{n!} $$ converges almost surely (in fact surely) to $f(x)$.

A standard condition under which a.s. convergence implies convergence of expectation, i.e. $$ E[f(x)] = E [ \lim_{n \rightarrow \infty} y_n] = \lim_{n \rightarrow \infty} E [y_n], $$ is that $|y_n| \leq y$ a.s. for some $y$ such that $E[y] < \infty$. (Dominated Convergence Theorem.)

This condition would hold if the power series converges absolutely a.s., i.e. $$ y = \sum_{n \geq 0} |f^{(n)}(\mu)| \, \frac{|x - \mu|^n}{n!} < \infty \;\; a.s. $$ and $$ E[y] < \infty. $$

Your example of a Poisson random variable with $f(x)=x^{\alpha}$, $\alpha \notin\mathbb{Z}_+$, suggests that, in general, the above criterion (integrability of the absolute limit $y$) is the weakest possible.
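One can see the criterion fail concretely in that example: the absolute series for $y$ already diverges at a fixed point carrying positive Poisson probability. A sketch, with $\mu = 1$, $\alpha = 1/2$, and the evaluation point $x = 3$ chosen for illustration:

```python
import math

alpha, mu, x = 0.5, 1.0, 3.0  # illustrative: f(x) = x**alpha, expansion point mu

def abs_term(n):
    # |f^(n)(mu)| * |x - mu|^n / n!  for f(x) = x^alpha
    c = 1.0
    for i in range(n):
        c *= abs(alpha - i)
    return c * mu**(alpha - n) * abs(x - mu)**n / math.factorial(n)

def partial_sum(N):
    return sum(abs_term(n) for n in range(N + 1))

# |x - mu| = 2 lies outside the radius of convergence (1) of the series for
# x^(1/2) about mu = 1, so these partial sums grow without bound.
for N in (10, 20, 40, 60):
    print(N, partial_sum(N))
```

Since $P(X = 3) > 0$ for any Poisson variable, $y = \infty$ on an event of positive probability, so $E[y] < \infty$ fails and the Dominated Convergence argument cannot be applied.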

Michael

The approximation will converge if the function $f(x)$ admits a power series expansion, i.e., all derivatives exist. It also will be fully achieved if derivatives of a specific threshold and above are equal to zero. You can refer to Papoulis [3-4] and Stark and Woods [4].

  • "It also will be fully achieved if derivatives of a specific threshold and above are equal to zero." If the derivatives exist and are equal to zero, isn't that another way of saying polynomial? – Acccumulation Nov 14 '18 at 19:12
  • This is not true. When "all derivatives exist" at the point of the power series expansion, the power series need not converge *anywhere.* (The standard example is the Maclaurin series of $e^{-1/x^2}.$) Another is that even when the series does converge at some point, it need not converge everywhere. A simple example is the Maclaurin series of $1/(1-x).$ When that occurs, convergence depends on the details of the random variable. For instance, suppose $X$ has any Student t distribution and consider $$1/(1-X)=1+X+X^2+\cdots+X^n+\cdots.$$ Eventually, $E(X^n)$ doesn't even exist! – whuber Nov 14 '18 at 19:34