The proof is as follows: (1) remember that the characteristic function of a sum of independent random variables is the product of their individual characteristic functions; (2) obtain the characteristic function of a gamma random variable (derived in the note below); (3) do the simple algebra.
To get some intuition beyond this algebraic argument, check whuber's comment.
Note: The OP asked how to compute the characteristic function of a gamma random variable. If $X\sim\mathrm{Exp}(\lambda)$, then (for the purposes of this computation, you can treat $i$ as an ordinary constant)
$$\psi_X(t)=\mathrm{E}\left[e^{itX}\right]=\int_0^\infty e^{itx} \lambda\,e^{-\lambda x}\,dx = \frac{1}{1-it/\lambda}\, .$$
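A quick numerical sanity check of this integral, approximating it on a truncated grid with the trapezoidal rule (the values of $\lambda$ and $t$ below are arbitrary choices for illustration):

```python
import numpy as np

# Check psi_X(t) = 1 / (1 - i t / lambda) for X ~ Exp(lambda)
# by trapezoidal integration of e^{itx} * lambda * e^{-lambda x}.
lam, t = 2.0, 1.5
x = np.linspace(0.0, 50.0, 200_001)  # e^{-lam x} is negligible beyond x = 50
dx = x[1] - x[0]
integrand = np.exp(1j * t * x) * lam * np.exp(-lam * x)
numeric = ((integrand[:-1] + integrand[1:]) / 2 * dx).sum()  # trapezoid rule
closed_form = 1 / (1 - 1j * t / lam)
print(numeric, closed_form)  # the two complex numbers should agree closely
```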
Now use whuber's tip: if $Y\sim\mathrm{Gamma}(k,\theta)$, then $Y=X_1+\dots+X_k$, where the $X_i$'s are independent $\mathrm{Exp}(\lambda = 1/\theta)$ random variables. Therefore, using property (1), we have
$$
\psi_Y(t) = \left( \frac{1}{1-it\theta}\right)^k \, .
$$
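The same decomposition can be checked by Monte Carlo: build gamma draws as sums of $k$ independent exponentials and compare the empirical characteristic function $\frac{1}{n}\sum_j e^{itY_j}$ against the closed form (the values of $k$, $\theta$, $t$, and the sample size below are arbitrary illustrative choices):

```python
import numpy as np

# Check psi_Y(t) = (1 - i t theta)^{-k} for Y ~ Gamma(k, theta),
# constructing Y as a sum of k independent Exp(lambda = 1/theta) draws.
# Note: NumPy's `scale` parameter is theta = 1/lambda.
rng = np.random.default_rng(0)
k, theta, t = 3, 2.0, 0.7
y = rng.exponential(scale=theta, size=(500_000, k)).sum(axis=1)
empirical = np.exp(1j * t * y).mean()          # empirical characteristic function
closed_form = (1 / (1 - 1j * t * theta)) ** k  # formula above
print(empirical, closed_form)  # should agree up to Monte Carlo error
```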
Tip: you won't learn these things by staring at results and proofs: stay hungry, compute everything, try to find your own proofs. Even if you fail, your appreciation of somebody else's answer will be at a much higher level. And, yes, failing is OK: nobody is looking! The only way to learn mathematics is by fist-fighting for each concept and result.