Questions tagged [moments]

Moments are summaries of random variables' characteristics (e.g., location, scale). Use this tag also for fractional moments.

Moments are summaries of random variables' characteristics. Specifically, the $j$th moment of a random variable $X$ is defined as $$ \mu_j^{'} = {\rm E} (X^j), \quad j = 1, 2, \ldots $$ and the $j$th central moment of $X$ is $$ \mu_j = {\rm E} [(X -\mu_1^{'})^j], \quad j = 1, 2, \ldots . $$

The first moment $\mu_1^{'}$ is the expectation of $X$, often denoted $\mu = {\rm E} (X)$, and the second central moment is the variance, often denoted $\sigma^2 = {\rm var}(X) = {\rm E} [(X - \mu_1^{'})^2]$.

Analogous definitions hold for batches of data where the expectation is taken with respect to the empirical distribution function. Equivalently, "$E$" is replaced by averaging over the data. When the batch is a sample (of a population or process) these are known as "sample moments."
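The sample-moment definitions above can be illustrated numerically. A minimal sketch (assuming NumPy; the normal distribution and its parameters are chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=3.0, size=100_000)

def sample_moment(x, j):
    """j-th sample moment: the average of x**j."""
    return np.mean(x ** j)

def sample_central_moment(x, j):
    """j-th sample central moment: the average of (x - xbar)**j."""
    return np.mean((x - np.mean(x)) ** j)

print(sample_moment(x, 1))          # close to mu = 2
print(sample_central_moment(x, 2))  # close to sigma^2 = 9
```

With a large sample, the first sample moment approaches the mean and the second central moment approaches the variance, exactly as in the definitions above.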

A notable use of moments is the method of moments, a procedure for statistical inference that estimates the parameters of a population distribution by equating its theoretical moments to the corresponding sample moments.
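A minimal method-of-moments sketch (assuming NumPy; the Gamma distribution is chosen for illustration, using its moment equations $\mathrm{E}(X) = k\theta$ and $\mathrm{var}(X) = k\theta^2$):

```python
import numpy as np

rng = np.random.default_rng(1)
# Simulate a Gamma(shape=5, scale=2) sample; mean = k*theta, var = k*theta^2.
x = rng.gamma(shape=5.0, scale=2.0, size=200_000)

m1 = x.mean()   # first sample moment
m2c = x.var()   # second sample central moment

# Solve mean = k*theta and var = k*theta^2 for (k, theta).
theta_hat = m2c / m1
k_hat = m1 ** 2 / m2c

print(k_hat, theta_hat)  # close to (5, 2)
```

Matching the first two sample moments to their theoretical counterparts yields closed-form estimators here; for other families the resulting system may require numerical solution.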

525 questions
77 votes · 4 answers

What's so 'moment' about 'moments' of a probability distribution?

I KNOW what moments are and how to calculate them and how to use the moment generating function for getting higher order moments. Yes, I know the math. Now that I need to get my statistics knowledge lubricated for work, I thought I might as well ask…
PhD
41 votes · 1 answer

Existence of the moment generating function and variance

Can a distribution with finite mean and infinite variance have a moment generating function? What about a distribution with finite mean and finite variance but infinite higher moments?
Mgf
28 votes · 3 answers

Proof that moment generating functions uniquely determine probability distributions

Wackerly et al.'s text states this theorem: "Let $m_x(t)$ and $m_y(t)$ denote the moment-generating functions of random variables X and Y, respectively. If both moment-generating functions exist and $m_x(t) = m_y(t)$ for all values of t, then X and Y…
26 votes · 1 answer

Link between moment-generating function and characteristic function

I am trying to understand the link between the moment-generating function and characteristic function. The moment-generating function is defined as: $$ M_X(t) = E(\exp(tX)) = 1 + \frac{t E(X)}{1} + \frac{t^2 E(X^2)}{2!} + \dots + \frac{t^n…
26 votes · 2 answers

Bias of moment estimator of lognormal distribution

I am doing some numerical experiment that consists in sampling a lognormal distribution $X\sim\mathcal{LN}(\mu, \sigma)$, and trying to estimate the moments $\mathbb{E}[X^n]$ by two methods: Looking at the sample mean of the $X^n$ Estimating $\mu$…
user29918
24 votes · 2 answers

Compute approximate quantiles for a stream of integers using moments?

migrated from math.stackexchange. I'm processing a long stream of integers and am considering tracking a few moments in order to be able to approximately compute various percentiles for the stream without storing much data. What's the simplest way…
jonderry
23 votes · 2 answers

How would you explain the Moment Generating Function (MGF) in layman's terms?

What is a Moment Generating Function (MGF)? Can you explain it in layman's terms, along with a simple and easy example? Please limit formal math notation as far as possible.
user366312
22 votes · 2 answers

Non-normal distributions with zero skewness and zero excess kurtosis?

Mostly a theoretical question: are there any examples of non-normal distributions whose first four moments equal those of the normal? Could they exist in theory?
21 votes · 1 answer

Error in normal approximation to a uniform sum distribution

One naive method for approximating a normal distribution is to add together perhaps $100$ IID random variables uniformly distributed on $[0,1]$, then recenter and rescale, relying on the Central Limit Theorem. (Side note: There are more accurate…
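The naive approximation described in this question can be sketched directly (a minimal illustration assuming NumPy; a sum of $n$ iid Uniform$[0,1]$ variables has mean $n/2$ and variance $n/12$):

```python
import numpy as np

rng = np.random.default_rng(2)
n, reps = 100, 50_000

# Sum 100 iid Uniform(0, 1) draws, many times over.
s = rng.random((reps, n)).sum(axis=1)

# Recenter by n/2 and rescale by sqrt(n/12) to get approximately N(0, 1).
z = (s - n / 2) / np.sqrt(n / 12)

print(z.mean(), z.std())  # close to (0, 1)
```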
21 votes · 3 answers

Moments of a distribution - any use for partial or higher moments?

It is usual to use second, third and fourth moments of a distribution to describe certain properties. Do partial moments or moments higher than the fourth describe any useful properties of a distribution?
Eduardas
20 votes · 4 answers

What exactly are moments? How are they derived?

We are typically introduced to method of moments estimators by "equating population moments to their sample counterpart" until we have estimated all of the population's parameters; so that, in the case of a normal distribution, we would only need…
20 votes · 1 answer

Physical/pictorial interpretation of higher-order moments

I'm preparing a presentation about parallel statistics. I plan to illustrate the formulas for distributed computation of the mean and variance with examples involving center of gravity and moment of inertia. I'm wondering if there is a physical…
James Koppel
20 votes · 2 answers

Intuition for moments about the mean of a distribution?

Can someone provide an intuition on why the higher moments of a probability distribution $p_X$, like the third and fourth moments, correspond to skewness and kurtosis respectively? Specifically, why does the deviation about the mean raised to the…
19 votes · 1 answer

Whether distributions with the same moments are identical

The following are similar to, but different from, previous posts here and here. Given two distributions which admit moments of all orders, if all the moments of the two distributions are the same, then are they identical distributions a.e.? Given two…
18 votes · 1 answer

Second moment method, Brownian motion?

Let $B_t$ be a standard Brownian motion. Let $E_{j, n}$ denote the event$$\left\{B_t = 0 \text{ for some }{{j-1}\over{2^n}} \le t \le {j\over{2^n}}\right\},$$and let$$K_n = \sum_{j = 2^n + 1}^{2^{2n}} 1_{E_{j,n}},$$where $1$ denotes indicator…