
I am interested in the expectation and the variance of the maximum of several independent, normally distributed random variables. That is, given a set of $I$ different RVs with $X_i \sim \mathcal{N}(\mu_i, \sigma_i^2)$, I want to find $$ \mathbb{E}[\max_i X_i], \\ \text{Var}[\max_i X_i]. $$

I have found Ross' "Computing Bounds on the Expected Maximum of Correlated Normal Variables", but the method given there requires numerical integration. I am interested in a closed form and would prefer a closed-form approximation over an exact iterative method.

Can anyone point me in the right direction?

bayerj
  • Related: [Covariance of INID order statistics](http://stats.stackexchange.com/questions/41438/covariance-of-inid-order-statistics). You won't find a closed form in general, even for the expectation. I'm curious why you would find a closed form approximation preferable to an iterative method that could be more accurate: your response might help guide answers to this question. – whuber Nov 20 '13 at 13:34
  • @Brendon Independent, the article happens to have information on that as well. – bayerj Nov 20 '13 at 13:35
  • @whuber I prefer a closed form so that it can be used as part of an optimization, for which I need a derivative. Thus, what I really mean is that the solution needs to be in a differentiable form. – bayerj Nov 20 '13 at 13:36
  • Are you sure you need a derivative for your optimization? There are several derivative-free methods (ones that don't substitute a finite difference approximation) available, Nelder-Mead being the most widely known for multidimensional problems. – jbowman Nov 20 '13 at 13:47
  • I haven't had time to look at it in detail, but the paper _On the Maximum of Bivariate Normal Random Variables_, by Alan P. Ker, looks like it might be a useful starting point. – Brendon Nov 20 '13 at 14:18
  • @jbowman The optimization is too complicated for derivative-free methods. – bayerj Nov 20 '13 at 15:40
  • Related: http://stats.stackexchange.com/questions/229073/variance-of-maximum-of-gaussian-random-variables – kjetil b halvorsen Jan 02 '17 at 18:38
  • If you need the derivative of the Emax, it could perhaps be obtained from the probabilities Pr(X_j = max_i X_i), so maybe what you need is rather a closed form for the choice probabilities. – Jesper for President Dec 05 '18 at 21:28

1 Answer


You can find a closed form (well, if you are willing to use special functions) for the density of $X = \max (X_1, \dots, X_n)$.

Let $F_i, f_i$ be the cdf and the density of $X_i$, for $i=1, ..., n$. The cdf of $X$ is

$$\begin{aligned} F(x) &= \mathbb P(X \le x) \\ &= \mathbb P(X_1 \le x, \dots, X_n \le x) \\ &= \mathbb P(X_1 \le x) \cdots \mathbb P( X_n \le x) \\ &= F_1(x) \cdots F_n(x). \end{aligned}$$

Its density is then obtained by differentiation: $$f(x) = F(x)\left( {f_1 \over F_1} + \cdots + {f_n \over F_n} \right).$$
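These two formulas translate directly into code. A minimal Python/SciPy sketch (the parameters $\mu_i, \sigma_i$ below are made-up examples, not from the question):

```python
import numpy as np
from scipy.stats import norm

# Made-up example parameters for three independent normals
mu = np.array([0.0, 0.5, 1.0])
sigma = np.array([1.0, 1.5, 0.5])

def max_cdf(x):
    # F(x) = F_1(x) * ... * F_n(x)
    return np.prod(norm.cdf(x, loc=mu, scale=sigma))

def max_pdf(x):
    # f(x) = F(x) * (f_1/F_1 + ... + f_n/F_n)
    Fi = norm.cdf(x, loc=mu, scale=sigma)
    fi = norm.pdf(x, loc=mu, scale=sigma)
    return np.prod(Fi) * np.sum(fi / Fi)
```

Note that `fi / Fi` can hit 0/0 far in the left tail, where the cdf underflows to zero; either evaluate on a range where all the $F_i$ are strictly positive, or expand the product rule into $\sum_i f_i \prod_{j \ne i} F_j$.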

Using this, you can compute the expected value and the variance to good accuracy, and in reasonable computing time, with numerical integration procedures (cf. `integrate` in R).
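For instance, in Python with SciPy's `quad` (parameters are arbitrary examples), writing the density in the expanded product-rule form $\sum_i f_i \prod_{j\ne i} F_j$ so that nothing divides by an underflowed cdf in the tails:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

mu = np.array([0.0, 0.5, 1.0])     # arbitrary example means
sigma = np.array([1.0, 1.5, 0.5])  # arbitrary example standard deviations

def max_pdf(x):
    # f(x) = sum_i f_i(x) * prod_{j != i} F_j(x)  (product rule, expanded)
    Fi = norm.cdf(x, loc=mu, scale=sigma)
    fi = norm.pdf(x, loc=mu, scale=sigma)
    return sum(fi[i] * np.prod(np.delete(Fi, i)) for i in range(len(mu)))

# First two moments of the max by numerical integration
mean, _ = quad(lambda x: x * max_pdf(x), -np.inf, np.inf)
second, _ = quad(lambda x: x**2 * max_pdf(x), -np.inf, np.inf)
var = second - mean**2
```

A quick Monte Carlo simulation of the maximum gives matching values, which is a convenient sanity check for this kind of integrand.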

I bet that in this case you can interchange the integral and the derivative with respect to the parameter, so you can obtain the derivatives in a similar way.
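Concretely, for independent variables, differentiating under the integral sign gives the envelope-theorem identity $\partial\, \mathbb{E}[\max_i X_i] / \partial \mu_k = \Pr(X_k = \max_i X_i)$, which is exactly the "choice probability" mentioned in the comments. A numerical check of this identity, with made-up parameters:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

mu = np.array([0.0, 0.5, 1.0])     # made-up example parameters
sigma = np.array([1.0, 1.5, 0.5])

def emax(m):
    # E[max_i X_i] for X_i ~ N(m_i, sigma_i^2), by numerical integration;
    # density of the max written as sum_i f_i * prod_{j != i} F_j
    def integrand(x):
        Fi = norm.cdf(x, loc=m, scale=sigma)
        fi = norm.pdf(x, loc=m, scale=sigma)
        return x * sum(fi[i] * np.prod(np.delete(Fi, i)) for i in range(len(m)))
    return quad(integrand, -np.inf, np.inf)[0]

# d E[max] / d mu_0 via a central finite difference ...
h = 1e-3
e = np.array([h, 0.0, 0.0])
deriv = (emax(mu + e) - emax(mu - e)) / (2 * h)

# ... should equal the choice probability P(X_0 = max_i X_i)
p0, _ = quad(lambda x: norm.pdf(x, mu[0], sigma[0])
             * norm.cdf(x, mu[1], sigma[1])
             * norm.cdf(x, mu[2], sigma[2]),
             -np.inf, np.inf)
```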

Elvis
  • Thanks, but I need a closed form; no numerical integration. – bayerj Nov 21 '13 at 10:07
  • The difference is thin. Many 'special functions' are computed through iterative procedures, even if you don't see it. I understand that you want to avoid doing lots of iterations so that you get something fast enough. In that case, just write the numerical integration as a sum, using, say, 10 trapezoids. This will produce a closed-form approximation as requested. – Elvis Nov 23 '13 at 05:54