
I have a normal distribution with a mean of 1. This distribution needs to have:

P(X ≤ 0) = 0.005
P(X ≤ 2) = 0.995

I've calculated the standard deviation required to achieve this as ~0.3882. I believe this is correct: when I back-calculate P(X ≤ 0) and P(X ≤ 2) for a normal distribution with a mean of 1 and a standard deviation of 0.3882, I get 0.005 and 0.995, which is what I expect.
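
For reference, this is roughly the calculation, sketched here with SciPy (the exact tool doesn't matter):

```python
from scipy.stats import norm

# With mean 1 and P(X <= 0) = 0.005, the z-score of 0 must be norm.ppf(0.005),
# i.e. (0 - 1) / sigma = norm.ppf(0.005), so sigma = 1 / norm.ppf(0.995).
sigma = 1 / norm.ppf(0.995)
print(sigma)                             # ~0.3882

# Back-calculation check:
print(norm.cdf(0, loc=1, scale=sigma))   # ~0.005
print(norm.cdf(2, loc=1, scale=sigma))   # ~0.995
```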

When I plot a normal distribution with a mean of 1 and a standard deviation of 0.3882, I get probabilities greater than 1. For example, P(X = 1) = 1.028.

This seems to be saying that X = 1 has a probability greater than 1, i.e. beyond certain.

Is this correct? It does not intuitively feel right.

The PDF and CDF are below:

[Plot: CDF and PDF for a normal distribution with a mean of 1 and a standard deviation of 0.3882]
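
Here is a minimal sketch (again assuming SciPy) of where the 1.028 comes from; it is the height of the plotted curve at x = 1:

```python
from scipy.stats import norm

sigma = 0.3882
# The value I am reading as P(X = 1) is the height of the plotted curve at x = 1:
print(norm.pdf(1, loc=1, scale=sigma))   # ~1.028
print(norm.pdf(0, loc=1, scale=sigma))   # ~0.037, where the curve is low
```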

  • You have to be careful here: in the case that the random variable has finitely many values, the density function really is the probability (i.e. $P[X=k]$ is the density function $f_X(k)$), but for random variables with a continuous value space (like normally distributed variables: they map into the real numbers) this is not the case: $P[X=r]=0$ for each single $r \in \mathbb{R}$, and this is certainly not equal to the density function $f_X(r)$. So the density is not a probability any more and can attain values bigger than one. – Fabian Werner Jul 06 '21 at 16:07
  • Actually it is even worse: when you make the std smaller and smaller, the spike around $0$ has to become bigger and bigger (because the integral, i.e. the area under the density function, needs to stay exactly $1$), hence for each value $r$, regardless of how big, there is a normal distribution (with a potentially very small std) such that $f(0) > r$... i.e. normal densities can attain arbitrarily big values :-) – Fabian Werner Jul 06 '21 at 16:10
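
To put some numbers on the point made in the comments, here is a rough sketch (assuming SciPy; the specific standard deviations are arbitrary): as the standard deviation shrinks, the peak of the density grows without bound while the area under the curve stays at 1.

```python
from scipy.stats import norm
from scipy.integrate import quad

# Peak height of the density vs. total area under it, for shrinking standard deviations.
for sigma in (0.3882, 0.1, 0.01):
    peak = norm.pdf(1, loc=1, scale=sigma)   # height at the mean: 1 / (sigma * sqrt(2 * pi))
    area, _ = quad(norm(loc=1, scale=sigma).pdf, 1 - 10 * sigma, 1 + 10 * sigma)
    print(sigma, round(peak, 3), round(area, 6))
# peaks: ~1.028, ~3.989, ~39.894 -- arbitrarily large, yet the area stays ~1.0
```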
