
I have run into a problem in a proof of the bound on the rate of convergence of an empirical risk function, based on an unbounded loss, to the true model risk (Vapnik, Statistical Learning Theory, Theorem 5.4). The issue can be condensed very simply.

Let the non-negative random variable $t$ follow a distribution $F$ (not necessarily continuous) and have a finite moment of order $p>2$, so that $$\int_{[0,\infty)}t^p\,dF(t)<\infty.$$ I would like to convince myself of the truth of the statement, given in the proof, that $$\int_{[0,\infty)}t^p\,dF(t)=p\int_{[0,\infty)}t^{p-1}\bigl(1-F(t)\bigr)\,dt.$$

The catch for me is that $F$ need not have a Radon–Nikodym derivative with respect to Lebesgue measure. I am also wondering whether the result can be obtained without taking limits, working with the extended reals directly.
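Not a proof, but a quick numerical sanity check I ran (the discrete example and all names are mine): for a purely atomic $F$, which certainly has no density, both sides of the identity agree.

```python
# Check E[t^p] = p * ∫ t^(p-1) (1 - F(t)) dt for a purely discrete F:
# P(t = 1) = P(t = 3) = 1/2, with p = 3 (> 2). No density exists here.
atoms = {1.0: 0.5, 3.0: 0.5}
p = 3

# Left-hand side: the p-th moment computed directly from the atoms.
lhs = sum(t**p * w for t, w in atoms.items())

def survival(t):
    """1 - F(t) = P(T > t) for the atomic distribution above."""
    return sum(w for a, w in atoms.items() if a > t)

# Right-hand side: p * ∫ t^(p-1) (1 - F(t)) dt, by a midpoint Riemann sum
# on [0, max atom]; the integrand vanishes beyond the largest atom.
n = 200_000
h = max(atoms) / n
rhs = p * sum(((i + 0.5) * h) ** (p - 1) * survival((i + 0.5) * h) * h
              for i in range(n))

print(lhs, rhs)  # both ≈ 14.0
```

Here the left side is $\tfrac12(1^3 + 3^3) = 14$ exactly, and the right side matches it to within the discretization error of the Riemann sum.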

P.S. Formula errors are a regular feature of the book, but generally easy to troubleshoot. It would therefore be helpful to receive either answers based on modest additional assumptions (such as the existence of $dF/dt$) or corrections to the stated formula.

Marko
    This is a standard result, and the idea is [integration by parts](https://en.wikipedia.org/wiki/Lebesgue%E2%80%93Stieltjes_integration#Integration_by_parts) after expressing $dF=-d(1-F)$. There is an answer here, take $g(x)=x^p$: https://stats.stackexchange.com/questions/222478/expectation-of-a-function-of-a-random-variable-from-cdf – cwindolf Aug 15 '21 at 17:43

0 Answers