
The Student's t-distribution arises:

  • when estimating the mean of a normally-distributed population
  • when the sample size is small
  • and the population's standard deviation is unknown

I need to clarify my hypothesis H and question Q:

H: In general, you can control the tails of a normal distribution with its standard deviation; the bigger it is, the fatter the tails, so I can make the tails as fat as I like.

Q: Is this comparison based on the normal distribution used to produce the t-distribution?

It looks to me like the answer to the previous question is yes, but maybe I am missing something important.


In general, and I understood this for the first time thanks to @whuber, according to Wikipedia "a fat-tailed distribution" is a probability distribution that exhibits a large skewness or kurtosis, relative to that of either a normal distribution or an exponential distribution.

So it looks like my hypothesis H is wrong from the very start, because you cannot alter the normal distribution's "fat tail" status with the standard deviation parameter.
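As a quick numerical check of this (a minimal sketch, assuming `scipy` is available), the probability of falling beyond $\mu \pm 3\sigma$ is the same for every normal distribution, no matter what $\sigma$ is:

```python
# Sketch: the normal tail probability beyond mu +/- 3*sigma does not depend
# on sigma, so changing sigma cannot make the tails relatively "fatter".
from scipy.stats import norm

for mu, sigma in [(0, 1), (0, 10), (5, 0.5)]:
    tail = norm.cdf(mu - 3 * sigma, loc=mu, scale=sigma) \
         + norm.sf(mu + 3 * sigma, loc=mu, scale=sigma)
    print(f"mu={mu}, sigma={sigma}: P(beyond 3 sigma) = {tail:.6f}")
# Every line prints the same value, about 0.002700.
```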

It is pretty unclear to me why both the normal distribution and the exponential distribution are considered here, because their skewness and kurtosis differ: for the exponential distribution the skewness and excess kurtosis are 2 and 6, while for the normal distribution both are zero. So I wonder why Wikipedia uses this definition of being "fat".
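To double-check those numbers (a small sketch, assuming `scipy` is available; note that scipy reports *excess* kurtosis):

```python
# Sketch: skewness and excess kurtosis of the exponential vs. the normal.
from scipy.stats import expon, norm

for name, dist in [("exponential", expon), ("normal", norm)]:
    skew, kurt = dist.stats(moments="sk")
    print(f"{name}: skewness = {float(skew)}, excess kurtosis = {float(kurt)}")
# exponential: skewness = 2.0, excess kurtosis = 6.0
# normal:      skewness = 0.0, excess kurtosis = 0.0
```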


In my original idea of a tail, I would consider the regions $(-\infty, \mu - 3\sigma]$ and $[\mu + 3\sigma, \infty)$, and a distribution with a "fatter tail" would be one with a larger area under the curve in those tail regions.
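Under that definition of a tail, a quick comparison (again a sketch assuming `scipy`) shows the t-distribution does put more area beyond 3 standard units than the normal does, with the gap shrinking as the degrees of freedom grow:

```python
# Sketch: area beyond +/- 3 standard units for the normal vs. the t-distribution.
from scipy.stats import norm, t

print(f"normal: {2 * norm.sf(3):.6f}")              # about 0.002700
for df in [3, 10, 30, 100]:
    print(f"t (df={df}): {2 * t.sf(3, df):.6f}")    # larger, shrinking toward the normal value
```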

Easy Points
  • I cannot understand this question because I don't know what this "standard deviation" is that you might be varying to "control the tails." A Normal distribution is a Normal distribution, period. Changing its parameters does not change the "fatness" of its tails in any meaningful or conventional sense. What "comparison" are you carrying out, then? – whuber Feb 25 '21 at 13:34
  • OK, I see what you meant. Thanks. – Easy Points Feb 25 '21 at 14:27
  • I altered the question, hope this helps a little bit. – Easy Points Feb 25 '21 at 14:57
  • Many threads here on CV discuss "fat," "heavy," or "long" tailed distributions. See [this search](https://stats.stackexchange.com/search?tab=votes&q=heav*%20tail%20distribution%20defin*) for links to some of the most relevant to the concept you express at the end. The second paragraph of my post at https://stats.stackexchange.com/a/86503/919 gives one simple intuitive definition. Please note that moments, like skewness or kurtosis, are not good for characterizing tail behavior. – whuber Feb 25 '21 at 16:00
  • @whuber, not a bad paragraph, but the phrase "eventually F has more probability at large values than G" is unclear, as we don't know what these large values are. It may be true for some large value but not for some **larger** value. The problem on my side is that your answer raised about 5 more questions (which is good), and I will not hesitate to upvote you because of that. -- Here it would be helpful to define the tails first. I described what a tail is (inside the question). Creating another question based on your answer would be my approach. :) Thanks. – Easy Points Feb 25 '21 at 16:43
  • I agree the English may be unclear--but the definition I gave is mathematical, rigorous, and absolutely clear. – whuber Feb 25 '21 at 16:59
  • https://stats.stackexchange.com/questions/511206/question-based-on-the-whuber-detailed-answer-on-fat-tailed-term – Easy Points Feb 25 '21 at 17:35

1 Answer


Your "title" question seems more clear to me than your detailed question. So I ll try to answer the former.

Why is the t-distribution fatter? Because there is more uncertainty. Since we do not know the population standard deviation and must estimate it, there is more uncertainty in the t-statistic. However, the larger the sample, the more accurate the standard deviation estimate, and thus the t-distribution converges to the normal distribution.
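Here is a small simulation sketch to make that concrete (assuming `numpy` is available; the sample size `n = 5` is just an illustrative choice): the statistic built with the *estimated* standard deviation lands in the far tails much more often than the one built with the known standard deviation.

```python
# Sketch: with a small sample, the t-statistic (estimated sd) exceeds +/-1.96
# far more often than the nominal normal 5%, i.e. its tails are fatter.
import numpy as np

rng = np.random.default_rng(0)
n, reps = 5, 100_000
x = rng.normal(loc=0.0, scale=1.0, size=(reps, n))

t_stats = np.sqrt(n) * x.mean(axis=1) / x.std(axis=1, ddof=1)  # estimated sd
z_stats = np.sqrt(n) * x.mean(axis=1) / 1.0                    # known sd

print("P(|t| > 1.96):", np.mean(np.abs(t_stats) > 1.96))  # well above 0.05
print("P(|z| > 1.96):", np.mean(np.abs(z_stats) > 1.96))  # close to 0.05
```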

Now, you are partially correct. In a normal distribution, the larger the variance, the farther the tails spread. However, a normal distribution has a fixed "ratio" of dispersion in the tails to concentration around the mean. In other words, once you standardize a variable (so that it has mean zero and standard deviation 1), if that variable follows a normal distribution it will always look the same.

The t-distribution's "ratio" is larger than the normal distribution's. So even if you standardize the variable, its shape will still depend on the degrees of freedom: lower degrees of freedom means more uncertainty and fatter tails; higher degrees of freedom means less uncertainty, and the t-distribution looks more and more like a normal distribution.
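You can see that convergence directly in the critical values (a sketch, assuming `scipy` is available):

```python
# Sketch: the two-sided 5% critical value of the t-distribution shrinks toward
# the normal's 1.96 as the degrees of freedom grow.
from scipy.stats import norm, t

print("normal:", round(norm.ppf(0.975), 3))            # 1.96
for df in [1, 2, 5, 10, 30, 100, 1000]:
    print(f"t, df={df}:", round(t.ppf(0.975, df), 3))
# 12.706, 4.303, 2.571, 2.228, 2.042, 1.984, 1.962
```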

HTH F

Fcold
  • But fatness of tails refers to extremity of potentially observable data. Larger variance does not imply fatter tails: You will get points that are farther from the mean, yes, but they are not necessarily extreme. – BigBendRegion Feb 25 '21 at 12:46
  • True, I should say the larger the variance, the longer the tails. That is why I tried to keep the idea of a "ratio" between dispersion and concentration. I corrected this in my answer. – Fcold Feb 25 '21 at 13:26