I have the following code in MATLAB, which I believe calculates the probability of a certain point (p) under a normal distribution. I know sigma (the standard deviation; the formula squares it to get the variance) and mu (the mean) from earlier calculations.
f = (1/sqrt(2 * pi * (sigma(x, y))^2)) * exp(-((p - mu(x, y))^2)/(2 * (sigma(x, y))^2));
if f >= P
    cloud = false;
else
    cloud = true;
end
My question is this: if p = 0.1 and P = 0.9, does my result mean there is at least a 90% chance that the value will be less than or equal to p = 0.1?
This is for cloud cover, so I want to know whether there is at least a 90% chance that cloud cover will be 10% or less.
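For context on what the expression computes: the formula above is the normal density (PDF) evaluated at p, i.e. the height of the bell curve, not a cumulative probability. The chance that the value is less than or equal to p is the CDF. A minimal sketch, using made-up placeholder values for mu and sigma (the real ones would come from the calculations described above):

```matlab
% Hypothetical values for illustration only.
mu_xy    = 0.3;   % mean cloud cover at (x, y)
sigma_xy = 0.15;  % standard deviation at (x, y)
p = 0.1;
P = 0.9;

% Density (what f in the snippet computes): height of the curve at p.
f = (1 / sqrt(2 * pi * sigma_xy^2)) * exp(-(p - mu_xy)^2 / (2 * sigma_xy^2));

% Cumulative probability Pr(X <= p): area under the curve up to p.
% Base MATLAB via erf; with the Statistics Toolbox this is
% normcdf(p, mu_xy, sigma_xy).
F = 0.5 * (1 + erf((p - mu_xy) / (sigma_xy * sqrt(2))));

% Comparing the CDF (not the density) against P answers the
% "at least a 90% chance of <= 10% cover" question.
cloud = F < P;
```

Note that the density f can exceed 1 when sigma is small, so comparing f directly against a probability threshold like 0.9 mixes two different quantities.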