
The heaviest-tailed smooth, normalizable, continuous distributions that I am familiar with are those with fat power-law tails $\frac{1}{x^{1+\alpha}}$, e.g. a Pareto with $\alpha\rightarrow 0^+$ or a Student's t with $\nu\rightarrow 0^+$, but are there distributions with heavier tails? I am curious what the worst possible case is for a distribution that decreases monotonically away from a peak positive value towards a minimum of 0.

I think that the heaviest possible normalizable heavy tails are indeed those asymptotic to $\frac{k}{x}$ as $x\rightarrow\infty$ (where $k$ is some constant), but have not been able to prove it to my satisfaction nor find a clear statement of this in the literature. I wonder if my belief is either obvious to experts or wrong.
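
To make the "heavier and heavier as $\alpha\rightarrow 0^+$" behaviour concrete, here is a minimal numerical sketch of my own (it assumes scipy's parametrization of the Pareto, i.e. shape parameter $b=\alpha$ with support $x\ge 1$ and survival function $x^{-\alpha}$; the cutoff $10^6$ and the $\alpha$ values are arbitrary illustrative choices):

```python
from scipy import stats

# Tail mass P(X > x) of a Pareto with tail index alpha at a fixed large cutoff.
# In scipy's parametrization the shape parameter is the tail index and the
# support is x >= 1, so the survival function is simply x**(-alpha).
x_cut = 1e6
for alpha in [2.0, 1.0, 0.5, 0.1, 0.01]:
    tail = stats.pareto.sf(x_cut, alpha)
    print(f"alpha = {alpha:5.2f}   P(X > 1e6) = {tail:.3g}")
```

Even with the cutoff at $10^6$, a tail index of $\alpha=0.01$ leaves roughly 87% of the probability mass beyond the cutoff.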

A couple of notes:

  • A function $f(x)$ is heavier-tailed than $g(x)$ for $x>0$ if there exists some finite $x_0$ such that for all $x>x_0$, $\int_{x}^\infty f(t)\,dt > \int_{x}^\infty g(t)\, dt$. (As discussed in the answer to: Which has the heavier tail, lognormal or gamma?)
  • It does not matter that the distribution has no finite moments, just that it integrates to 1 over the range $[0,\infty)$ (one-sided) or $(-\infty,\infty)$ (two-sided).
David Bailey
  • What about symmetric distributions with right and left tails that are equally heavy like the Cauchy distribution? – Michael R. Chernick Apr 12 '17 at 23:00
  • If the tail is proportional to $1/x$ then the integral diverges, so that is too heavy – kjetil b halvorsen Apr 12 '17 at 23:13
  • @Michael Chernick. Yes, the Cauchy is a member of the Student's t family, and is the distribution that I have normally assumed to be the most extreme possible, but a Cauchy actually has only $1/x^2$ tails. – David Bailey Apr 12 '17 at 23:33
  • @kjetilbhalvorsen: I think "asymptotic to $k/x$" may mean $k / x^{1 + \epsilon}$ in the limit $\epsilon \rightarrow 0^+$. – Cliff AB Apr 12 '17 at 23:38
  • @Cliff AB Yes, that is what I mean. $1/x$ is too heavy and the tail integral diverges, but $1/x^{1+\epsilon}$ has a finite tail integral as long as $\epsilon>0$. – David Bailey Apr 12 '17 at 23:43
  • It is true that the Cauchy is a member of the t family, but there are two tails, a right tail and a left. – Michael R. Chernick Apr 13 '17 at 00:08
  • @Michael Chernick Yes, I am certainly interested in symmetric distributions, but they can always be turned into one-sided distributions in $|x|$, so to avoid repetitive wording I only referred to the positive tail in my question. My apologies if that is confusing. – David Bailey Apr 13 '17 at 00:38
  • I seem to recall from looking at this long ago that we can consider a sequence of pairs of integrable and not-integrable functions (or at least ones that behave, for large $x$, like this sequence), for small positive $\alpha$. I think it goes something like this: $1/x^{1+\alpha}$ is integrable but $1/[x\log(x)]$ is not; $1/[x(\log(x))^{1+\alpha}]$ is integrable but $1/[x\log(x)\log(\log(x))]$ is not -- and so forth, extending those pairs to more and more log(log(...)) terms. [I haven't double-checked my recollection, but it might help you locate something on what the actual sequences of things are; see the sketch right after these comments.] – Glen_b Apr 13 '17 at 01:38
  • The previous thought assumes a bunch of stuff you didn't specify (but which I assume you intend); it gives a sequence of progressively lighter-tailed functions that are in a particular sense "close to" $1/x$ but still not integrable, whereas putting a $(1+\alpha)$ power on the last term in the product in the denominator makes it integrable, assuming I correctly remembered how it went ... (Edit: a quick play in Wolfram Alpha suggests I recalled it more or less correctly) – Glen_b Apr 13 '17 at 07:04
  • Note that the characterization of heavy tails does not make sense. It needs to be modified to $$\int_{x}^\infty f(t)\,dt > \int_{x}^\infty g(t)\, dt$$ for all $x \ge x_0$. Note, too, that all distributions have finite medians and integrate to unity. – whuber Apr 13 '17 at 15:40
  • @Glen_b Thanks. I should have remembered recursive logs from long-ago discussions about whether there is a slowest growing function. The answer is no, [there is always a slower one](https://math.stackexchange.com/questions/1196691/is-there-a-slowest-divergent-function), which implies that for any heavy-tailed function $\frac{1}{x\,f_{slow}(x)}$, there is always a heavier-tailed one of the form $\frac{1}{x\,f_{slower}(x)}$. – David Bailey Apr 13 '17 at 17:00
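
To make the hierarchy Glen_b describes concrete, here is a small sympy sketch (my own illustration; the antiderivatives are written out by hand and only verified by differentiation, so nothing depends on sympy being able to evaluate the integrals itself):

```python
import sympy as sp

x, a = sp.symbols('x alpha', positive=True)

# Antiderivative F for each candidate tail density f, valid on (e, oo).
# The tail integral of f is finite exactly when F stays bounded as x -> oo.
cases = [
    (1/x**(1 + a),             -1/(a*x**a),          "bounded   -> integrable"),
    (1/x,                       sp.log(x),           "unbounded -> not integrable"),
    (1/(x*sp.log(x)**(1 + a)), -1/(a*sp.log(x)**a),  "bounded   -> integrable"),
    (1/(x*sp.log(x)),           sp.log(sp.log(x)),   "unbounded -> not integrable"),
]

for f, F, verdict in cases:
    assert sp.simplify(sp.diff(F, x) - f) == 0  # confirm that F' == f
    print(f"{sp.sstr(f):32s} {verdict}")
```

The same pattern continues: $1/[x\log(x)\log(\log(x))]$ has antiderivative $\log(\log(\log(x)))$ and is not integrable, while $1/[x\log(x)(\log(\log(x)))^{1+\alpha}]$ is integrable, and so on.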

2 Answers


There is no heaviest-tailed distribution: for any distribution, one can construct another with strictly heavier tails.

Proof:

Assume $f$ is any PDF and $F$ is its CDF. We can always construct another distribution $$G(x) = 1 - \sqrt{1 - F(x)}, \qquad g(x) = \frac{f(x)}{2\sqrt{1 - F(x)}},$$ which has heavier tails, since: $$\int_x^\infty f(t)\, dt = 1 - F(x) < \sqrt{1 - F(x)} = 1 - G(x) = \int_x^\infty g(t) \, dt$$

for every $x$ with $0 < F(x) < 1$.
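
For concreteness, a small numerical sketch of the construction (my own illustration, starting from a standard exponential for $F$; nothing here is specific to that choice):

```python
import numpy as np
from scipy import stats

# Apply the construction to a standard exponential F:
#   G(x) = 1 - sqrt(1 - F(x)),  g(x) = f(x) / (2*sqrt(1 - F(x))),
# and compare the two survival functions at a few points.
base = stats.expon()

def sf_G(x):
    """Survival function 1 - G(x) = sqrt(1 - F(x)) of the constructed distribution."""
    return np.sqrt(base.sf(x))

for x in (1.0, 5.0, 20.0):
    print(f"x = {x:4.0f}   1 - F(x) = {base.sf(x):.3e}   1 - G(x) = {sf_G(x):.3e}")
```

Note that if $1-F(x)=x^{-\alpha}$ (a Pareto tail), then $1-G(x)=x^{-\alpha/2}$: the construction halves the tail index but stays within the power-law class, which is the point of whuber's comment below.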

Kodiologist
Lucas
  • Although this is a good observation, it doesn't seem to conform to the spirit of the question, which evidently concerns *classes* of tail behavior rather than individual distributions. Your construction converts a distribution with $O(x^{-(1+\alpha)})$ tail behavior into another with $O(x^{-(1+\beta)})$ tail behavior, for $\beta \ne \alpha$, but it doesn't seem to get you anything new. – whuber Apr 13 '17 at 18:27
  • Thanks. This answer and the answer to a similar Math Stack Exchange question that I somehow missed earlier ([What is the largest function whose integral still converges?](https://math.stackexchange.com/questions/1249899)) are nice proofs that there is no single heaviest-tailed function. The answer to a related Math SE question ([Is there a slowest divergent function?](https://math.stackexchange.com/questions/1196691)) makes it clear that there is not even a countable heaviest-tailed collection, such as a recursive sequence of functions. – David Bailey Apr 14 '17 at 04:35
  • @whuber In particular I still wonder if $k/x$ is indeed the asymptotic bound for any monotonically decreasing heavy tail. The question [What is the largest function whose integral still converges?](https://math.stackexchange.com/questions/1249899) presented a different class of functions that converge on $1/x$ (using recursive logs instead of a small extra power as I did), but the answers did not address whether $1/x$ is a general boundary between convergent and divergent extremely heavy tails. – David Bailey Apr 14 '17 at 05:04
  • It *has* to be. Consider any function $f$ for which there exists an $x\gt 0$ and $C\gt 0$ such that $f(t)\ge C/t$ for all $t\ge x$. Since $$\int_x^\infty f(t)dt \ge C\int_x^\infty \frac{dt}{t}=C\lim_{y\to\infty}\bigl(\log(y)-\log(x)\bigr)$$ diverges, $f$ could not be a density. – whuber Apr 14 '17 at 13:57
  • @whuber: Yes, but don't we also need to show that there is no function $f$ of a different class whose tail lies below $C/x$ yet whose tail integral still diverges? I argue that for large enough $x$ such a divergent tail would have to lie above $1/x^{1+\alpha}$, so since $\alpha$ can be made arbitrarily small, $f(x)$ must be indistinguishable from $1/x$. I think this is obvious for any smooth, continuous, monotonically decreasing $f$, since that excludes the weird possibilities I can think of (e.g. [singular](https://en.wikipedia.org/wiki/Singular_function) functions), but wondered if there was any loophole in my reasoning. – David Bailey Apr 15 '17 at 03:53

Great question! As you point out, the Cauchy has a power-law tail, so on a log-log scale its complementary CDF is asymptotically linear.

But the only constraint on the function is that it never increases and goes to $-\infty$ in the limit. So you could swap the linear function out for a negative log, or even cook up an extreme example by inverting the increasing part of the gamma function.

[Figure: log of the complementary CDF of a standard Cauchy, plotted against $\log(x_0)$]
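
A minimal sketch of my own (scipy + matplotlib) that reproduces this kind of plot, i.e. the log of the complementary CDF of a standard Cauchy against $\log(x_0)$:

```python
import numpy as np
from scipy import stats
import matplotlib.pyplot as plt

# log P(X > x0) for a standard Cauchy, plotted against log(x0).  Because the
# survival function behaves like 1/(pi*x0) for large x0, the curve approaches
# a straight line with slope -1 on these axes.
x0 = np.logspace(0, 4, 200)
plt.plot(np.log(x0), stats.cauchy.logsf(x0))
plt.xlabel(r"$\log(x_0)$")
plt.ylabel(r"$\log P(X > x_0)$")
plt.title("Cauchy log-ccdf")
plt.show()
```

Swapping the asymptotically linear curve for something like $-\log(\log(x_0))$, so that $P(X>x_0)=1/\log(x_0)$ for large $x_0$, gives a tail heavier than any power law, along the lines suggested above.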

Chad Scherrer
  • Could you explain what you are referring to by "the function" and "the limit"? The function cannot be the PDF, obviously, but it cannot be the log PDF either, because there are many more constraints than the "only" one you claim. – whuber Jul 15 '19 at 15:23
  • "The function" is the function in the plot, the log of the complementary CDF as a function of log(x₀). As x₀→∞, this is required to go to -∞ (similar to the behavior shown for a Cauchy distribution). – Chad Scherrer Jul 15 '19 at 16:51
  • That will teach me to read the axis labels more carefully! (+1). – whuber Jul 15 '19 at 17:23