Suppose that $X_1$ and $X_2$ are i.i.d. random variables, each with a uniform distribution on the interval $[0,1]$. Find the probability density function of $Y = X_1 + X_2$.
I understand that the joint density is $f(x_1,x_2) = 1$ for $0 < x_1 < 1$ and $0 < x_2 < 1$.
Given that: $Pr[Y \le y]=Pr[X_1+X_2 \le y] = Pr[X_1 \le y - X_2]$
Where do I go from here?
I know that $y - x_2$ must lie within the interval $0 < y - x_2 < 1$.
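One way I can think of to continue, conditioning on $X_2$ (this is only a sketch and may not be the intended route): for $0 < y < 1$, $\Pr[X_1 \le y - x_2]$ equals $y - x_2$ when $x_2 < y$ and $0$ otherwise, so
$$F_Y(y) = \int_0^1 \Pr[X_1 \le y - x_2]\,dx_2 = \int_0^y (y - x_2)\,dx_2 = \frac{y^2}{2},$$
and differentiating would give $g(y) = y$ on $0 < y < 1$.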
Using the convolution formula for the sum of two independent random variables:
$g(y) = \int_{-\infty}^{\infty}f(y-z, z)dz$
I know that $f(y-z, z) = 1$ on its support because the joint density is uniform. When computing the limits of integration, do I go from $0$ to $y-x$, or from $0$ to $y$, and why?
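My current understanding of the limits (assuming $z$ is the integration variable from the formula above): the integrand is $1$ only where both uniform constraints hold,
$$0 < z < 1 \quad\text{and}\quad 0 < y - z < 1,$$
i.e. $\max(0,\,y-1) < z < \min(1,\,y)$. For $0 < y < 1$ this reduces to $0 < z < y$, so
$$g(y) = \int_0^y 1\,dz = y.$$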
Having a quick look at the solution to this exercise, it seems to have two parts. The first is what I am looking for, and the second computes the integral when $1 < y < 2$, though I cannot understand where these boundaries come from.
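If I am reading the two-part solution correctly, the second part comes from the same limits: for $1 < y < 2$ we have $\max(0,\,y-1) = y-1$ and $\min(1,\,y) = 1$, so
$$g(y) = \int_{y-1}^{1} 1\,dz = 2 - y,$$
which together with the first case would give the triangular density $g(y) = y$ for $0 \le y \le 1$, $g(y) = 2 - y$ for $1 < y \le 2$, and $g(y) = 0$ otherwise.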
Update: Found this YouTube link that also explains this clearly.