I read that the Gaussian distribution is defined by two moments, i.e., the mean and the variance, and can be defined only by these moments. Are there distributions defined by just the mean? For that matter, are there distributions defined by the first three moments, or by infinitely many moments? Can someone give examples of such distributions?
-
4"..gaussian... defined by two moments.... and can _only_ be defined by these moments." The word _only_ makes that last part sheer nonsense. The Gaussian distribution is defined by two parameters, but they don't _have_ to be mean and variance; they could be mean and, for example, the fourth moment. – Dilip Sarwate Aug 23 '20 at 22:06
3 Answers
There are several distributions that are defined by only one parameter. One example is the Rayleigh distribution, defined by a single parameter $\sigma$, which is related to the mean by $\mu=\sigma\sqrt{\pi/2}$. Another example is the exponential distribution, defined by the rate parameter $\lambda$; its mean and variance are $\mu=1/\lambda$ and $\sigma^2=1/\lambda^2$, respectively. Another well-known distribution defined by a single parameter (its degrees of freedom) is the chi-squared distribution.
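As a quick sanity check of these relationships, here is a minimal SciPy sketch (the values $\sigma=2$ and $\lambda=0.5$ are arbitrary):

```python
# Minimal check: the one-parameter families above and their moment formulas.
import numpy as np
from scipy import stats

sigma = 2.0
rayleigh = stats.rayleigh(scale=sigma)               # SciPy's scale is sigma
print(rayleigh.mean(), sigma * np.sqrt(np.pi / 2))   # both ~2.5066

lam = 0.5
expo = stats.expon(scale=1 / lam)                    # SciPy's scale is 1/lambda
print(expo.mean(), 1 / lam)                          # both 2.0
print(expo.var(), 1 / lam**2)                        # both 4.0
```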
There are also several distributions that are defined by their mean and variance, like the Gaussian. One example is the uniform distribution. Its lower and upper bounds $a$ and $b$ are related to the mean and variance by $a=\mu-\sqrt{3}\sigma$ and $b=\mu+\sqrt{3}\sigma$, respectively. There are many more examples of distributions defined by two parameters, for instance the Gamma distribution.
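A sketch of the same idea for the uniform distribution, recovering $a$ and $b$ from a target mean and standard deviation (the values are arbitrary):

```python
# Recover the uniform bounds from a target mean and standard deviation.
import numpy as np
from scipy import stats

mu, sigma = 5.0, 1.5
a = mu - np.sqrt(3) * sigma
b = mu + np.sqrt(3) * sigma
u = stats.uniform(loc=a, scale=b - a)  # SciPy's uniform lives on [loc, loc + scale]
print(float(u.mean()), mu)             # both 5.0
print(float(u.std()), sigma)           # both 1.5
```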
Note that the moments of a probability distribution need not necessarily be well-defined. For instance, the Cauchy distribution is defined by only two parameters, just like the Gaussian distribution, but neither its mean nor its variance exist, because the corresponding integrals do not converge.
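One can see the non-existence of the Cauchy mean empirically: running sample means never settle down, no matter how much data you draw (a small illustration, with an arbitrary seed):

```python
# Running sample means of Cauchy draws keep jumping: no law of large numbers.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_cauchy(1_000_000)
for n in (100, 10_000, 1_000_000):
    print(n, x[:n].mean())  # does not converge as n grows
```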

To reframe the question a bit, let us recall that the (raw or absolute) moments of a distribution $p$ are quantities related to expectations (when defined), $E_p[(x-a)^\alpha]$ or $E_p[|x-a|^\alpha]$, and they can be centered, standardized, etc.
The range of $\alpha$ can be restricted to integers, or more often to positive integers $n$.
For the Gaussian, once the first two moments (the mean $\mu$ and the variance $\sigma^2$) are known, the other integer central moments are determined directly: the odd ones are zero, and the even ones are $(n-1)!!\,\sigma^n$. So the Gaussian moments are determined by the first two moments (more at: Moments and Absolute Moments of the Normal Distribution). But I would not say that they are _defined_ by them, because you can only derive the others by knowing that the distribution IS Gaussian.
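A numerical check of this closed form, comparing quadrature against $(n-1)!!\,\sigma^n$ (a sketch; $\mu = 1$ and $\sigma = 2$ are arbitrary):

```python
# Gaussian central moments: odd ones vanish, even ones are (n-1)!! * sigma^n.
import numpy as np
from scipy import integrate, stats
from scipy.special import factorial2

mu, sigma = 1.0, 2.0
pdf = stats.norm(loc=mu, scale=sigma).pdf
for n in range(1, 9):
    numeric, _ = integrate.quad(lambda x: (x - mu) ** n * pdf(x), -np.inf, np.inf)
    closed = 0.0 if n % 2 else factorial2(n - 1) * sigma**n
    print(n, round(numeric, 6), closed)  # the two columns agree
```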
Given a sequence of moments $\mu_n$, can we characterize the underlying $p$, and vice versa? This is one of the questions underlying the problem of moments.
Given a sequence of moments, is a distribution uniquely defined? No: a counter-example is provided by a lognormal distribution and a periodically "perturbed" lognormal distribution, $p(x) := \frac{1}{x\sqrt{2\pi}} \exp\left(-\frac{(\log x)^2}{2}\right)$ and $q(x) := p(x)\,(1+\sin(2\pi\log x))$; see the numerical check after the links below, and:
- in StackOverflow: When are probability distributions completely determined by their moments?
- in SE.stats: How is the kurtosis of a distribution related to the geometry of the density function?
- in SE.maths: Do moments define distributions?
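As promised, here is a numerical check of the counter-example. Substituting $u = \log x$ turns every moment integral into a Gaussian one, which is easy for quadrature, and the perturbation term integrates to zero for every integer $n$:

```python
# p and q share all integer raw moments E[X^n] = e^{n^2/2}: after u = log x,
# the extra sin(2*pi*u) term integrates to zero against the Gaussian weight.
import numpy as np
from scipy import integrate

def gauss_weight(u, n):
    # integrand of E_p[X^n] after the change of variables u = log x
    return np.exp(n * u - u**2 / 2) / np.sqrt(2 * np.pi)

for n in range(5):
    m_p, _ = integrate.quad(lambda u: gauss_weight(u, n), -np.inf, np.inf)
    m_q, _ = integrate.quad(
        lambda u: gauss_weight(u, n) * (1 + np.sin(2 * np.pi * u)), -np.inf, np.inf
    )
    print(n, m_p, m_q, np.exp(n**2 / 2))  # all three columns agree
```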
Under which additional properties a distribution is uniquely determined by its moments seems to remain an open problem to me, even under additional conditions (positivity, etc.). In When are probability distributions completely determined by their moments?, they say:
Roughly speaking, if the sequence of moments doesn't grow too quickly, then the distribution is determined by its moments. One sufficient condition is that if the moment generating function of a random variable has positive radius of convergence, then that random variable is determined by its moments.
One can check: The moment problem for references and an introduction to the classical moment problem on the real line with special focus on the indeterminate case.

-
+1. The example you allude to, where the moments do not define the distribution, is illustrated in my post here on CV at https://stats.stackexchange.com/a/84213/919. – whuber Aug 24 '20 at 16:33
It is insightful to note that the Gaussian distribution is a maximum entropy distribution: of all possible distributions with fixed first and second moments, the Gaussian maximizes the entropy with respect to a flat reference measure. In other words, the Gaussian is the least informative distribution for any fixed first and second moments.
With that in mind, let’s consider your question about whether any distribution exists that is defined by the first moment. Applying max ent in this case only works if we restrict the domain, e.g., to the positive reals, in which case the answer is the exponential distribution, $$ p(x) = \lambda e^{-\lambda x} $$ with mean $1/\lambda$. A small numerical illustration follows below.
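A sketch of this claim (an illustration, not a proof): tune a few positive-support distributions to the same mean and compare their differential entropies; the exponential comes out on top. The particular competitors here are arbitrary choices.

```python
# Among positive-support distributions with mean m, the exponential
# maximizes differential entropy (here: a few hand-picked competitors).
import numpy as np
from scipy import stats

m = 2.0  # the fixed first moment
candidates = {
    "exponential": stats.expon(scale=m),                           # mean m
    "gamma(k=3)": stats.gamma(a=3, scale=m / 3),                   # mean a * scale = m
    "half-normal": stats.halfnorm(scale=m * np.sqrt(np.pi / 2)),   # mean scale * sqrt(2/pi) = m
    "lognormal(s=1)": stats.lognorm(s=1, scale=m * np.exp(-0.5)),  # mean scale * e^{1/2} = m
}
for name, dist in candidates.items():
    print(f"{name:15s} mean = {dist.mean():.3f}  entropy = {float(dist.entropy()):.4f}")
# the exponential wins with entropy 1 + log(m) ~ 1.6931 nats
```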
Other constraints lead to other max ent distributions; see, e.g., wiki for examples.
It should be understood that max ent distributions are the maximum-entropy ones among all distributions satisfying the constraints; they are not the unique distributions satisfying the constraints, and no such unique distribution exists.

-
Max ent will work in many other cases: the support needn't be positive. Moreover, (among many possible examples) *any* one-dimensional location family of distributions can be parameterized by the mean, so max ent looks rather like a red herring: an interesting incidental fact that misses the main point. – whuber Mar 01 '22 at 22:31
-
Max ent with only a constraint on the first moment won't work if the support is not restricted. What is a red herring and what is the main point seems to be a matter of taste. – innisfree Mar 02 '22 at 05:20