The multiple linear regression model is given by $$ \mathbf{y} = \mathbf{X} \boldsymbol{\beta} + \boldsymbol{\epsilon}, \qquad \boldsymbol{\epsilon} \sim N(\mathbf{0}, \sigma^2 \mathbf{I}) $$
The ordinary least-squares estimate of $\boldsymbol{\beta}$ is $$ \hat{\boldsymbol{\beta}} = (\mathbf{X}^{\prime} \mathbf{X})^{-1}\mathbf{X}^{\prime} \mathbf{y} $$
Hence, since $\textrm{Var}(\mathbf{y}) = \sigma^2 \mathbf{I}$, $$ \textrm{Var}(\hat{\boldsymbol{\beta}}) = (\mathbf{X}^{\prime} \mathbf{X})^{-1} \mathbf{X}^{\prime} \; \sigma^2 \mathbf{I} \; \mathbf{X} (\mathbf{X}^{\prime} \mathbf{X})^{-1} = \sigma^2 (\mathbf{X}^{\prime} \mathbf{X})^{-1} $$
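As a sanity check (not part of the proof), here is a minimal Monte Carlo sketch in Python with NumPy; the dimensions, design matrix, coefficients, and $\sigma$ are arbitrary choices of mine. It simulates repeated draws of $\mathbf{y}$ and confirms that the sampling covariance of $\hat{\boldsymbol{\beta}}$ matches $\sigma^2 (\mathbf{X}^{\prime}\mathbf{X})^{-1}$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, sigma = 200, 4, 1.5                  # arbitrary dimensions and noise level
X = rng.normal(size=(n, p))                # a fixed, arbitrary design matrix
beta = rng.normal(size=p)                  # arbitrary true coefficients
XtX_inv = np.linalg.inv(X.T @ X)

# Repeatedly draw y = X beta + eps and compute the OLS estimate each time
draws = np.array([
    XtX_inv @ X.T @ (X @ beta + sigma * rng.normal(size=n))
    for _ in range(20_000)
])

# Empirical covariance of beta-hat should match sigma^2 (X'X)^{-1}
# up to Monte Carlo error
print(np.abs(np.cov(draws.T) - sigma**2 * XtX_inv).max())
```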
Let $\mathbf{x}_j$ denote the $j^{\text{th}}$ column of $\mathbf{X}$, and let $\mathbf{X}_{-j}$ denote $\mathbf{X}$ with its $j^{\text{th}}$ column removed.
I am told that the variance of the scalar coefficient estimate $\hat{\beta}_j$ can therefore be expressed as $$ \textrm{Var}(\hat{\beta}_j) = \sigma^2 \left[\mathbf{x}_j^{\prime} \mathbf{x}_j - \mathbf{x}_j^{\prime} \mathbf{X}_{-j} (\mathbf{X}_{-j}^{\prime} \mathbf{X}_{-j})^{-1} \mathbf{X}_{-j}^{\prime} \mathbf{x}_j\right]^{-1} $$
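A quick numerical check convinces me the identity does hold; a minimal sketch (again Python/NumPy, with dimensions and $\mathbf{X}$ chosen arbitrarily by me) comparing the $(j,j)$ entry of $\sigma^2 (\mathbf{X}^{\prime}\mathbf{X})^{-1}$ with the claimed expression for every $j$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, sigma = 50, 5, 2.0                   # arbitrary dimensions and noise level
X = rng.normal(size=(n, p))

XtX_inv = np.linalg.inv(X.T @ X)
for j in range(p):
    xj = X[:, j]
    X_mj = np.delete(X, j, axis=1)         # X with the j-th column removed
    # (j, j) entry of sigma^2 (X'X)^{-1} ...
    lhs = sigma**2 * XtX_inv[j, j]
    # ... versus sigma^2 [x_j'x_j - x_j'X_{-j}(X_{-j}'X_{-j})^{-1}X_{-j}'x_j]^{-1}
    schur = xj @ xj - xj @ X_mj @ np.linalg.solve(X_mj.T @ X_mj, X_mj.T @ xj)
    assert np.isclose(lhs, sigma**2 / schur)
print("identity holds for every j")
```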
Can anyone shed some light on how to prove this identity?
Hints would suffice.