These quantities that you've noticed are called the factorial moments, so named for the falling factorial $(n)_k \equiv n(n-1)\cdots(n-k+1)$. The interpretation of the falling factorial is that $(n)_k$ gives the number of $k$-permutations of a set with $n$ elements---ordered arrangements of $k$ distinct elements drawn from the set. Using this interpretation, we see that, in your notation, $G^{(n)}(1) = E[(X)_n]$ is the expected number of $n$-permutations of a set containing a random number $X \sim p(x)$ of elements.
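To make this concrete, here is a quick sanity check in Python with SymPy (the Poisson pmf is my choice of example, not anything from your question): for $X \sim \text{Poisson}(\lambda)$ the pgf is $G(z) = e^{\lambda(z-1)}$, and each derivative at $z = 1$ should equal the factorial moment $E[(X)_k] = \lambda^k$.

```python
import sympy as sp

z, lam = sp.symbols('z lambda', positive=True)

# pgf of a Poisson(lambda) random variable: G(z) = E[z^X] = exp(lambda*(z - 1))
G = sp.exp(lam * (z - 1))

# The k-th derivative at z = 1 should be the k-th factorial moment,
# which for a Poisson is E[(X)_k] = lambda^k.
for k in range(1, 5):
    factorial_moment = sp.diff(G, z, k).subs(z, 1)
    assert sp.simplify(factorial_moment - lam**k) == 0
```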
To answer your question (or, I suppose, its title) more generally, probability generating functions are incredibly useful. Let us consider the function
$$
G(z) = \sum_{n \geq 0} p_n z^n
$$
as an analytic function of $z \in \mathbb{C}$. The property that you've noted above is certainly true---and now you know how to interpret it! Here are two easy ways to convert these factorial moments into ordinary moments $E[X^n]$.
First, there is a nice formula that relates $n$-permutations to counting the number of distinct ways to partition an $n$-element set.
Note that $X^n$ is the number of functions mapping from $A_n$,
a set with $n$ elements, into $B_X$, a set with $X$ elements, where here $X \sim p(x)$ again.
Now classify each function $f\colon A_n \to B_X$ by its level sets: the preimages $f^{-1}(b)$, for $b$ in the image of $f$, partition $A_n$ into some number $k$ of disjoint, nonempty blocks, and $f$ then amounts to an injection from those $k$ blocks into $B_X$. The number of injections from the $k$ blocks of a partition of $A_n$ into $B_X$ is $(X)_k$ from above, so we can write
$$
X^n = \sum_{k=0}^n a_{n,k} (X)_k,
$$
where the numbers $a_{n,k} \equiv \text{number of partitions of a set with $n$ elements into $k$ nonempty blocks}$ are called the Stirling numbers of the second kind and are denoted $a_{n,k} \equiv {n\brace k}$.
Then, taking expectations, the above combinatorial relation becomes
$$
\begin{aligned}
E[X^n]&= E\left[ \sum_{k=0}^n {n\brace k}(X)_k \right]\\
&=\sum_{k=0}^n {n\brace k}E[(X)_k],
\end{aligned}
$$
a way for you to calculate ordinary moments!
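As a quick numerical check, here is a short Python sketch (the three-point pmf is an arbitrary toy example of mine) that computes $E[X^n]$ both directly and via the Stirling-number expansion of the factorial moments:

```python
from functools import lru_cache

# A toy pmf, chosen arbitrarily: P(X=0)=0.2, P(X=1)=0.5, P(X=2)=0.3
pmf = {0: 0.2, 1: 0.5, 2: 0.3}

@lru_cache(maxsize=None)
def stirling2(n, k):
    """Stirling numbers of the second kind via S(n,k) = k*S(n-1,k) + S(n-1,k-1)."""
    if n == k:
        return 1
    if k == 0 or k > n:
        return 0
    return k * stirling2(n - 1, k) + stirling2(n - 1, k - 1)

def falling(x, k):
    """Falling factorial (x)_k = x(x-1)...(x-k+1)."""
    out = 1
    for j in range(k):
        out *= x - j
    return out

def moment_direct(n):
    """E[X^n] computed straight from the pmf."""
    return sum(p * x**n for x, p in pmf.items())

def moment_via_factorial(n):
    """E[X^n] = sum_k {n brace k} E[(X)_k]."""
    return sum(stirling2(n, k) * sum(p * falling(x, k) for x, p in pmf.items())
               for k in range(n + 1))

for n in range(1, 6):
    assert abs(moment_direct(n) - moment_via_factorial(n)) < 1e-12
```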
This is great if you like combinatorics. Personally, I am not a huge fan, so here is another thing to notice. Let us consider the function $G(z)$ defined above as an analytic function---since it is infinitely (complex) differentiable, let us see what happens when we operate on it with the operator $\theta(z) \equiv z \frac{d}{dz}$:
$$
\begin{aligned}
\theta(z)G(z) &= z \frac{d}{dz}\sum_{n \geq 0} p_n z^n\\
&= z \sum_{n \geq 1}np_n z^{n-1} = \sum_{n \geq 1}n p_n z^n,
\end{aligned}
$$
so that $\theta(z)G(z)\big|_{z = 1} = E[X]$. How interesting! You will soon prove to yourself that $E[X^n] = \theta^n(z)G(z)\big|_{z=1}$, which is very helpful.
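Here is that claim checked symbolically in Python with SymPy (the Poisson pgf is again just my example): applying $\theta = z\,\frac{d}{dz}$ a total of $n$ times and setting $z = 1$ should reproduce the moments $E[X^n]$.

```python
import sympy as sp

z, lam = sp.symbols('z lambda', positive=True)
G = sp.exp(lam * (z - 1))  # pgf of Poisson(lambda), as an example

def theta(f):
    """The operator theta = z * d/dz."""
    return z * sp.diff(f, z)

# Known Poisson moments for comparison:
# E[X] = lam, E[X^2] = lam^2 + lam, E[X^3] = lam^3 + 3*lam^2 + lam.
expected = [lam, lam**2 + lam, lam**3 + 3*lam**2 + lam]

f = G
for target in expected:
    f = theta(f)              # theta applied one more time
    assert sp.simplify(f.subs(z, 1) - target) == 0
```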
There are many more properties that I could share with you, but I will show you just two more. First is a general property of multiplicative-group frequency-space transforms, of which the probability generating function is an example. Suppose that $X_1,\dots,X_N$ are independent rvs with respective generating functions $G_k$. What is the generating function of $X = \sum_k X_k$? Well, just note that $G_k(z) = \sum_n p^{(k)}_n z^n = E[z^{X_k}]$ by definition, so that we can write
$$
\begin{aligned}
G_X(z) &= E\left[ z^X \right]\\
&= E\left[z^{\sum X_k}\right]\\
&= E\left[ \prod z^{X_k}\right]\\
&= \sum_{n_1,...,n_N} \left[ \prod_k p^{(k)}_{n_k}z^{n_k} \right]\\
&= \prod_k \left[ \sum_{n_k}p^{(k)}_{n_k}z^{n_k}\right]\\
&= \prod_k G_k(z),
\end{aligned}
$$
where we can jump from sums to products (from joint distributions to factorization) by the independence assumption. You can adapt the details and replicate this proof for the Laplace transform (moment-generating function) and the Fourier transform (characteristic function) if you want.
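For a finite-support illustration in Python with NumPy (the Bernoulli example is mine): multiplying pgf polynomials is the same as convolving pmfs, so the coefficient vector of $\prod_k G_k$ is the pmf of the sum. Three independent Bernoulli$(1/2)$ variables should give the Binomial$(3, 1/2)$ pmf.

```python
import numpy as np

# pmf of a Bernoulli(1/2) rv as a coefficient vector: index n holds P(X = n),
# so the pgf is the polynomial G(z) = 0.5 + 0.5*z.
bernoulli = np.array([0.5, 0.5])

# Multiplying pgf polynomials = convolving pmfs, so the pmf of
# X_1 + X_2 + X_3 is the threefold convolution.
pmf_sum = np.convolve(np.convolve(bernoulli, bernoulli), bernoulli)

# This should be the Binomial(3, 1/2) pmf: (1/8, 3/8, 3/8, 1/8).
assert np.allclose(pmf_sum, [1/8, 3/8, 3/8, 1/8])
```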
The last thing to tell you is that, when we consider $G$ as an analytic function, there are many beautiful techniques from complex analysis that allow us to calculate expectations. Suppose that you know $G$ exists and is analytic in a region about $z^* \in \mathbb{C}$. Then Cauchy's integral formula says that
$$
\frac{d^n}{dz^n}G(z)\bigg|_{z = z^*} = \frac{n!}{2\pi i}\oint_{\gamma}\frac{G(z)\ dz}{(z - z^*)^{n + 1}},
$$
where $\gamma$ is a closed curve encircling $z^*$ within the region of analyticity.
If you know how to compute this integral, you can be in business without taking a derivative (and, if you're like me, without messing up all the algebra that comes with it...).
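To illustrate, here is a numerical sketch in Python with NumPy (the Poisson pgf, contour radius, and grid size are all my choices): evaluating the Cauchy integral on a small circle around $z^* = 1$ with the trapezoidal rule recovers $G'(1) = E[X] = \lambda$ without any symbolic differentiation.

```python
from math import factorial

import numpy as np

lam = 2.0                                # Poisson rate, an arbitrary example
G = lambda z: np.exp(lam * (z - 1.0))    # pgf of Poisson(lambda)

def derivative_via_cauchy(n, z_star=1.0, radius=0.5, points=256):
    """n-th derivative of G at z_star via the Cauchy integral formula,
    integrating over a circle of the given radius with the trapezoidal rule."""
    t = np.linspace(0.0, 2.0 * np.pi, points, endpoint=False)
    z = z_star + radius * np.exp(1j * t)
    dz = 1j * radius * np.exp(1j * t)                 # dz/dtheta
    integrand = G(z) / (z - z_star) ** (n + 1) * dz
    integral = integrand.mean() * 2.0 * np.pi         # trapezoid on a periodic grid
    return factorial(n) / (2.0j * np.pi) * integral

# G'(1) = E[X] = lambda, and G''(1) = E[X(X-1)] = lambda^2 for a Poisson.
assert abs(derivative_via_cauchy(1) - lam) < 1e-10
assert abs(derivative_via_cauchy(2) - lam**2) < 1e-10
```

The trapezoidal rule converges spectrally fast on periodic integrands, so even a modest grid gives essentially machine-precision answers here.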