If I have a variable $X$ whose Gaussian distribution is known and let $f$ be a known function, is there a way to compute the distribution of $f(X)$ i.e. the resulting Gaussian distribution from this?
Is the result actually a Gaussian distribution?
Addressing the last question in particular --
1) Consider $X$ being standard Gaussian (mean 0, variance 1), and $U = \Phi(X)$ where $\Phi()$ is the standard normal cdf. Does $U$ have a Gaussian distribution? Let's simulate (in R in this case):
u <- pnorm(rnorm(100000L))
hist(u,n=300)
... nope.
In fact you can work out that it must be standard uniform.
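One way to see it (a short sketch using the probability integral transform): for $u \in (0,1)$,
$$P(U\leq u) = P(\Phi(X)\leq u) = P(X\leq \Phi^{-1}(u)) = \Phi(\Phi^{-1}(u)) = u\,,$$
which is the cdf of a standard uniform.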
2) Consider $X$ being standard Gaussian (mean 0, variance 1), and $Y = f(X) = X^2$.
Does $Y$ have a Gaussian distribution? Let's simulate (in R):
y <- rnorm(100000L)^2
hist(y,n=300)
and what do we see?
... nope.
In fact you can work out that it's chi-squared(1).
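A sketch of that calculation, using the symmetry of the standard normal: for $y>0$,
$$P(Y\leq y) = P(-\sqrt y\leq X\leq \sqrt y) = 2\Phi(\sqrt y) - 1\,,$$
and differentiating gives $f_Y(y) = \frac{1}{\sqrt{2\pi y}}e^{-y/2}$, which is the $\chi^2_1$ density.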
In particular circumstances (essentially, when $f$ is close to linear over the region where $X$ has nearly all its probability) you can obtain an approximately Gaussian distribution... but it's not the general case. For example, if the mean is many standard deviations from 0 (e.g. $X\sim N(5,0.1^2)$), then $Y=X^2$ is at least approximately normal:
y <- rnorm(100000L,5,.1)^2
hist(y,n=300)
... and similarly $\exp(X)$, $X^{1/3}$, $\log(X^2+\sqrt\pi)$, and a whole menagerie of other transformations will also give approximately Gaussian distributions in this case.
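One quick way to check any of those is a normal Q-Q plot; here's a sketch for $\exp(X)$, reusing the same $N(5,0.1^2)$ setup:
x <- rnorm(100000L, 5, .1)
qqnorm(exp(x)); qqline(exp(x))   # points hug the line, so exp(X) is close to normal here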
Edit:
You do get exact normality if you apply a linear transformation to a Gaussian random variable: as @DilipSarwate points out, if $X$ is Gaussian then $Y = a+bX$ is Gaussian (the case $b=0$, discussed by @gung and @whuber in comments, is 'degenerate', but is usually still counted as Gaussian when considering the whole location-scale family of Gaussians).
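In the same simulation style (a sketch with arbitrary $a=3$, $b=2$):
y <- 3 + 2*rnorm(100000L)
qqnorm(y); qqline(y)   # a straight line: still Gaussian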
It's not quite the case that you can get to a normal only by linear transformation, though if you are restricted to monotonic transformations, I think this will be the case.
A nonlinear transformation of a normal that yields a normal:
Let $X$ be standard Gaussian. Let $F_1$ be the cdf of a $\chi^2_1$ random variable and $\Phi$ be the cdf of a standard normal. Then $\Phi^{-1}(F_1(X^2))$ is a nonlinear (and non-monotonic) transformation of $X$, and the resulting random variable is Gaussian: $X^2$ is $\chi^2_1$, so $F_1(X^2)$ is standard uniform by the probability integral transform, and applying $\Phi^{-1}$ to a standard uniform gives a standard normal.
y <- qnorm(pchisq(rnorm(100000L)^2,1))   # chi-squared(1) cdf of X^2, then the standard normal quantile
hist(y,n=200)
In respect of the first question: if $Y=h(X)$ you can in principle work out the distribution of $Y$. If $X$ is continuous and $h$ is invertible and increasing, it goes like this:
If $F_X$ is the cdf of $X$, then $P(Y\leq y) = P(h(X)\leq y) = P(X\leq h^{-1}(y)) = F_X(h^{-1}(y))$.
From there one can work out the density by differentiation, leading to the standard result (the absolute value below also covers a decreasing $h$):
$$f_Y(y)= f_X(h^{-1}(y)) \left|\frac{d h^{-1}(y)}{dy}\right|$$
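As a quick numerical illustration of that formula, here's a sketch taking $h(x)=e^x$, so that $h^{-1}(y)=\log y$ and $|d h^{-1}(y)/dy| = 1/y$:
x <- rnorm(100000L)
y <- exp(x)                       # Y = exp(X) is lognormal
hist(y, n = 300, freq = FALSE, xlim = c(0, 10))
curve(dnorm(log(x))/x, from = 0.01, to = 10, add = TRUE, col = "red")   # f_X(h^{-1}(y)) * |d h^{-1}(y)/dy|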
In other cases things are more complicated, but in some cases may still be doable.
I think you are talking about change of variables.
If $X$ is a continuous r.v. with pdf $f_X(x)$ and sample space $S$, and $g: S \rightarrow T$ is an invertible transformation with differentiable inverse $h = g^{-1}$, then $Y = g(X)$ is a continuous r.v. with pdf given by
$f_Y(y) = f_X(h(y)) \cdot |h'(y)|$.
For example, suppose $X \sim \text{Exp}(\lambda)$; what is the distribution of $Y = X^2$?
Well, here we have $g: (0,\infty) \rightarrow (0,\infty)$ defined by $g(x) = x^2$. The inverse is $h(y) = g^{-1}(y) = \sqrt y$.
Then $h'(y) = 0.5 y^{-1/2}$.
The density of $X$ is $f_X(x) = \lambda e^{-\lambda x}$, so $f_X(h(y)) = \lambda e^{-\lambda \sqrt y}$.
And finally multiplying by the derivative gives $$ f_Y(y) = f_X(h(y)) \cdot |h'(y)| = \lambda e^{-\lambda \sqrt y} \cdot 0.5 y^{-1/2} $$
This simplifies to $$f_Y(y) = \frac{\lambda e^{-\lambda \sqrt y}}{2\sqrt y}$$
There is your density for a function of a random variable.
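A quick simulation check of that result (a sketch, picking $\lambda = 2$ arbitrarily):
lambda <- 2
y <- rexp(100000L, rate = lambda)^2
hist(y, n = 300, freq = FALSE, xlim = c(0, 1))
curve(lambda * exp(-lambda * sqrt(x)) / (2 * sqrt(x)), from = 0.005, to = 1, add = TRUE, col = "red")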