
The generalized extreme-value distribution encompasses three classes of distributions:

  1. Fréchet, which are regularly varying and have an infinite right limit.
  2. Gumbel, which are not regularly varying and have an infinite right limit.
  3. Weibull, which are regularly varying and have a finite right limit.

Is there a fourth class, one that fits in between the Gumbel and Weibull classes: distributions that are not regularly varying and have a finite right limit?

If so, what is this class called?

And is there a generalization of the generalized extreme-value distribution that encompasses this fourth class of distributions?

Isambard Kingdom

2 Answers


No. The extreme-value theorem (Fisher/Tippett/Gnedenko) gives the possible limits of a distribution of maxima (appropriately scaled), and they divide into three groups based on whether the extreme value index parameter is positive, zero, or negative. The generalised extreme value distribution is precisely the set of possible limits, and the three named subsets correspond to the positive, zero, and negative values of the index.
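
In the usual parametrization, with extreme value index $\xi$, location $\mu$ and scale $\sigma > 0$, the GEV cdf is

$$G(x) = \exp\left\{-\left[1 + \xi\,\frac{x-\mu}{\sigma}\right]^{-1/\xi}\right\}, \qquad 1 + \xi\,\frac{x-\mu}{\sigma} > 0,$$

with $\xi = 0$ read as the limit $\exp\{-e^{-(x-\mu)/\sigma}\}$. The Fréchet class corresponds to $\xi > 0$, the Gumbel class to $\xi = 0$, and the Weibull class to $\xi < 0$, whose upper endpoint is the finite value $\mu - \sigma/\xi$.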

Thomas Lumley
  • Thomas, thank you for your reply, but my question is not about the parameterization of the GEV per se, but rather about a class of distributions that appears not to be encompassed by the GEV. – Isambard Kingdom Jan 13 '21 at 16:11
  • My answer was also not about the parametrisation of the GEV per se. The GEV is already exactly the set of distributions that can be limits of maxima; there **aren't** any others to make up an expansion set. I mean, you could always make an even more generalised family that includes whatever distributions you like, but they wouldn't be possible extreme-value limits. – Thomas Lumley Jan 14 '21 at 04:11
  • Are you saying that block-maximum samples taken from a source distribution that is both right-limited and not regularly varying will not, themselves, be both right-limited and not regularly varying? – Isambard Kingdom Jan 14 '21 at 06:34
  • Either that or the limit doesn't exist; I don't know which. That's what the theorem says: *if* there's a limit, it's one of those three classes. – Thomas Lumley Jan 14 '21 at 07:12
  • Well, I don't know either, and that is my question. I can't believe that the limit doesn't exist, and, generally (at least per my reading of the literature), the right-limit and regularity properties are preserved even after sampling. Maybe I'm wrong about that, in which case I'm happy to learn. Let's let others weigh in on this (hopefully armed with some references I can see). – Isambard Kingdom Jan 14 '21 at 14:03
  • I'm guessing that the sampling distribution I'm curious about exists as some sort of singularity tucked in between the Gumbel and Weibull. – Isambard Kingdom Jan 14 '21 at 14:30

Reading various lecture notes and corresponding with colleagues outside this forum, I now understand that, as Thomas Lumley reminded me, the extreme-value theorem (Fisher/Tippett/Gnedenko) gives only three cases (Fréchet, Gumbel, Weibull). There is no additional case tucked in between the Gumbel and Weibull that accommodates samples from source distributions that are right-limited and not regularly varying. Instead, these distributions are asymptotically like the Gumbel distribution. What continues to amaze me, however, is that while the samples might be right-limited, the "lightness" of the tail (apparently) permits description by a distribution (the Gumbel) that is not right-limited. I will continue to ponder this point, but for now I consider this question closed.
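
To build a little intuition, here is a minimal simulation sketch, assuming the reversed-Fréchet source $X = -Y$ (with $Y$ Fréchet) discussed in the comments below, and block sizes chosen arbitrarily for illustration. The source has finite upper endpoint $0$ and is not regularly varying there, yet the fitted GEV shape for its block maxima drifts toward $0$ (the Gumbel case) as the block size grows:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
alpha = 2.0  # Fréchet shape of Y; X = -Y has finite upper endpoint omega = 0


def block_maxima(n_blocks, block_size):
    """Block maxima of X = -Y, with Y ~ Fréchet(alpha)."""
    # Inverse-transform sampling: Y = (-log U)^(-1/alpha), U ~ Uniform(0, 1)
    u = rng.uniform(size=(n_blocks, block_size))
    y = (-np.log(u)) ** (-1.0 / alpha)
    return (-y).max(axis=1)


for block_size in (100, 2_000, 50_000):
    maxima = block_maxima(500, block_size)
    # scipy's genextreme shape c is the *negative* of the usual EV index xi
    c_hat, loc_hat, scale_hat = genextreme.fit(maxima)
    print(f"block size {block_size:>6}: estimated xi = {-c_hat:+.3f}")
```

In runs like this the estimated $\xi$ comes out negative (the data are, after all, bounded) but creeps toward $0$ as the block size grows, consistent with the very slow convergence mentioned in the comments below.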

Isambard Kingdom
  • Consider a reversed Fréchet distribution, with upper end-point $\omega = 0$. The density has vanishing derivatives $f^{(k)}(\omega) = 0$ for all orders $k$. This is a nice example of a distribution that belongs to the Gumbel domain, as can be proved by the von Mises conditions. It can also be checked with simulations that the convergence is *very* slow. – Yves Feb 05 '21 at 07:37
  • Right, I accept this. What I don't have is intuition as to how in the world a right-limited source distribution can be in the domain of attraction of a Gumbel. Think of it in terms of sampling. I block sample from (say) an upper-limit lognormal. This is right limited, yet somehow the samples are distributed as a Gumbel which is not right limited. I need some intuition as to how this is possible. – Isambard Kingdom Feb 06 '21 at 16:50
  • I suspect that with the discrete-time autoregression $y_t = \phi y_{t-1} + (1 - \phi) \varepsilon_t$ with $0 < \phi < 1$ and a noise $\varepsilon_t$ with standard uniform distribution, the stationary distribution is in the Gumbel domain, while its support is $(0, \, 1)$. By subsampling we get nearly independent observations and we can easily see with simulations that $y_t$ keeps far away from the upper end-point $\omega = 1$. This may provide hints on this strange behaviour which, I agree, should be mentioned in every EV textbook. – Yves Feb 06 '21 at 18:27
  • Another difficult way to look at this is the following: Consider a right-limited, non-regularly varying source distribution. We block-max sample it. Those samples are Gumbel distributed (which has no right limit). Does this mean that the parameters of the Gumbel are independent of the right-limit of the source distribution? I suspect that the parameters do depend on the right endpoint. – Isambard Kingdom Feb 08 '21 at 02:42
  • Yes they do, because they relate to the tail quantile $U(n)$; see [my answer](https://stats.stackexchange.com/a/285991/10479). The problem you cite is mentioned at the end. – Yves Feb 08 '21 at 06:31
  • But given a set of block-max data that we know are the result of a right-limited, non-regularly varying process, meaning that the block max data are Gumbel distributed, is it possible to then infer the right limit of the source distribution? I have a hard time imagining that that would be possible, given that the Gumbel domain encompasses non-regularly varying sources that are both right-limited and not-right-limited. – Isambard Kingdom Feb 10 '21 at 20:37
  • Estimating the upper end-point $\omega$ of a distribution, say $F_M$, from a sample does not require EV theory. As a general rule, when $F_M$ is within a parametric family, the ML estimate $\widehat{\omega}$ is the sample maximum. It can be "super consistent" if the density $F_M'$ is positive at $\omega$. However, if $F_M$ is in the Gumbel domain, as is the reversed Fréchet, all the derivatives of $F_M$ will vanish at $\omega$ and the estimator $\widehat{\omega}$ will be very poor, because very few data will come near $\omega$. Only a very slow convergence will occur (see the sketch after these comments). – Yves Feb 11 '21 at 14:45
  • Thank you, Yves. I've learned a lot with this exchange. – Isambard Kingdom Feb 12 '21 at 00:45
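
A rough illustration of that last point; this is a minimal sketch, with the uniform comparison, the shift of the reversed Fréchet to endpoint $\omega = 1$, and the sample size chosen only for illustration. The sample maximum sits essentially on the endpoint when the density is positive there, but stays far from it when all derivatives vanish:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000       # sample size
alpha = 2.0       # Fréchet shape used for the reversed-Fréchet example

# Case 1: Uniform(0, 1) -- density positive at omega = 1, so the sample
# maximum converges to the endpoint at rate ~ 1/n ("super consistent").
gap_uniform = 1.0 - rng.uniform(size=n).max()

# Case 2: reversed Fréchet shifted so that omega = 1 -- all derivatives of
# the density vanish at the endpoint, so the sample maximum stays far away.
y = (-np.log(rng.uniform(size=n))) ** (-1.0 / alpha)   # Y ~ Fréchet(alpha)
gap_rfrechet = 1.0 - (1.0 - y).max()                   # equals min(Y)

print(f"uniform:          omega - max = {gap_uniform:.2e}")
print(f"reversed Fréchet: omega - max = {gap_rfrechet:.2e}")
```

With $n = 10^5$ the uniform gap is on the order of $10^{-5}$, while the reversed-Fréchet gap is typically of order $(\log n)^{-1/\alpha}$, i.e. still a few tenths here.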