Let's double-check what the delta method says:
Roughly, if there is a sequence of random variables $X_n$ satisfying
$$
{\sqrt{n}[X_n-\theta]\,\xrightarrow{D}\,\mathcal{N}(0,\sigma^2)},
$$
where $\theta$ and $\sigma^2$ are finite valued constants and $\xrightarrow{D}$ denotes convergence in distribution, then
$$
{\sqrt{n}[g(X_n)-g(\theta)]\,\xrightarrow{D}\,\mathcal{N}(0,\sigma^2[g'(\theta)]^2)}
$$
for any function $g$ such that $g'(\theta)$ exists and is nonzero.
Okay, so showing that the first condition already holds in your setting should be easy, right?
Then let $g(x) = x^2$ and you're set: just apply the theorem.
At the end, undo the $\sqrt{n}$ scaling and the centering (a linear transformation, which affects the variance and mean) so that you end up with a statement about what $g(X_n)$ is approximately distributed as.
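For concreteness, with $g(x) = x^2$ we have $g'(\theta) = 2\theta$, so the theorem and the back-transformation give
$$
\sqrt{n}\,[X_n^2 - \theta^2]\,\xrightarrow{D}\,\mathcal{N}(0,\,4\theta^2\sigma^2),
\qquad\text{hence}\qquad
X_n^2 \,\overset{\text{approx.}}{\sim}\, \mathcal{N}\!\left(\theta^2,\;\frac{4\theta^2\sigma^2}{n}\right)
$$
for large $n$ (assuming $\theta \neq 0$, so that $g'(\theta)$ is nonzero).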
So in short:
i) State the theorem.
ii) Explain/show why the first condition holds.
iii) State $g$.
iv) Apply the theorem.
v) Infer the approximate distribution of $g(X_n)$.
That's pretty much the recipe any time you want to use it.
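If you want to sanity-check the conclusion numerically, here's a minimal simulation sketch. It assumes (purely for illustration) that $X_n$ is the sample mean of $n$ i.i.d. $\mathcal{N}(\theta, \sigma^2)$ draws, so the first condition holds exactly, and compares the simulated mean and variance of $g(X_n) = X_n^2$ against the delta-method prediction $\mathcal{N}(\theta^2,\, 4\theta^2\sigma^2/n)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumptions (not from the question): X_n is the sample mean
# of n i.i.d. N(theta, sigma^2) draws, and g(x) = x^2 so g'(theta) = 2*theta.
theta, sigma = 2.0, 1.5
n, reps = 1_000, 5_000

# Simulate `reps` independent copies of X_n and apply g.
x_bar = rng.normal(theta, sigma, size=(reps, n)).mean(axis=1)
g_vals = x_bar ** 2

# Delta-method prediction: g(X_n) ~approx~ N(theta^2, sigma^2 * (2*theta)^2 / n).
pred_mean = theta ** 2
pred_var = sigma ** 2 * (2 * theta) ** 2 / n

print(f"simulated mean {np.mean(g_vals):.4f} vs predicted {pred_mean:.4f}")
print(f"simulated var  {np.var(g_vals):.5f} vs predicted {pred_var:.5f}")
```

The simulated moments should match the predictions closely; the small upward bias in the mean (of order $\sigma^2/n$) is exactly the second-order term the first-order delta method ignores.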