I understand that we can calculate the probability density function (PDF) by computing the derivative of the cumulative distribution function (CDF), since the CDF is the antiderivative of the PDF.
I get the intuition for that (the integral gives the area under the curve, which for a continuous distribution is the accumulated probability).
I'm just wondering how to derive the CDF from the PDF of the Gaussian distribution, which is $$ f(x) = \frac{1}{\sigma \sqrt{2 \pi}} e^{- \frac{1}{2} \left(\frac{x - \mu}{\sigma} \right)^2} $$
I'm guessing we integrate this from negative infinity up to some value $x$, i.e. $$ F(x) = \int_{-\infty}^{x} \frac{1}{\sigma \sqrt{2 \pi}} e^{- \frac{1}{2} \left(\frac{t - \mu}{\sigma} \right)^2} dt, $$ but how do we carry out the whole process? I've seen the answer, and for some reason it seems to involve the "error" function, which is this:
$$ \text{erf} (z) = \frac{2}{\sqrt{\pi}} \int_0^z e^{-t^2} dt $$
If someone could explain how we derive the CDF from the PDF of the Gaussian distribution, AND how the resulting CDF is related to the error function, I would be so grateful!
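For what it's worth, the answer I've seen states the identity $\Phi(x) = \frac{1}{2}\left[1 + \operatorname{erf}\left(\frac{x - \mu}{\sigma \sqrt{2}}\right)\right]$, and I did verify numerically (in Python, using `math.erf` and a simple trapezoidal integration of the PDF) that it appears to hold; I just don't see how to derive it:

```python
import math

# Claimed identity (the one I've seen but can't derive):
# CDF(x) = 1/2 * (1 + erf((x - mu) / (sigma * sqrt(2))))
def normal_cdf_via_erf(x, mu=0.0, sigma=1.0):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# The Gaussian PDF from the question
def normal_pdf(x, mu=0.0, sigma=1.0):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

# Direct numerical integration of the PDF from (effectively) -infinity to x,
# using the trapezoidal rule with lower cutoff lo
def normal_cdf_numeric(x, mu=0.0, sigma=1.0, lo=-10.0, n=100_000):
    h = (x - lo) / n
    total = 0.5 * (normal_pdf(lo, mu, sigma) + normal_pdf(x, mu, sigma))
    total += sum(normal_pdf(lo + i * h, mu, sigma) for i in range(1, n))
    return total * h

# The two agree closely at several points, so the identity seems right
for x in (-1.0, 0.0, 1.0, 2.0):
    assert abs(normal_cdf_via_erf(x) - normal_cdf_numeric(x)) < 1e-6
```

So the identity checks out numerically; what I'm missing is the derivation connecting the integral of the PDF to $\operatorname{erf}$.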