Assume $X_1,\dots,X_n$ are i.i.d. $N_p(\mu,\Sigma)$ with $\mu$ unknown and $n>p$. Then an unbiased estimator of $\Sigma$ is the sample covariance matrix
$$S=\frac1{n-1}\sum_{i=1}^n (X_i-\overline X)(X_i-\overline X)'$$
And that $(n-1)S$ has a Wishart distribution with $n-1$ degrees of freedom:
$$(n-1)S\sim W_p(n-1,\Sigma)$$
Provided $n>p+2$, the mean of $((n-1)S)^{-1}$ is known to be
$$E\left[\frac{S^{-1}}{n-1}\right]=\frac{\Sigma^{-1}}{n-p-2} \tag{$\star$}$$
So an unbiased estimator of $\Sigma^{-1}$ is
$$\widehat{\Sigma^{-1}}=\left(\frac{n-p-2}{n-1}\right)S^{-1}$$
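As a quick numerical sanity check (not part of the argument), here is a Monte Carlo sketch in NumPy: it draws many samples of size $n$ from a normal distribution with an arbitrary illustrative $\Sigma$ (my own choice, not from the post), averages the scaled estimator $\frac{n-p-2}{n-1}S^{-1}$, and compares the average to $\Sigma^{-1}$:

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, reps = 3, 20, 20000        # n > p + 2, as required for unbiasedness

# An arbitrary positive-definite Sigma for the check (illustrative only)
A = rng.standard_normal((p, p))
Sigma = A @ A.T + p * np.eye(p)
Sigma_inv = np.linalg.inv(Sigma)

acc = np.zeros((p, p))
for _ in range(reps):
    X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
    S = np.cov(X, rowvar=False)  # ddof=1 by default: the unbiased S above
    acc += (n - p - 2) / (n - 1) * np.linalg.inv(S)

est = acc / reps                 # should be close to Sigma_inv
err = np.max(np.abs(est - Sigma_inv))
print(err)                       # small Monte Carlo error
```

Note that `np.cov` already divides by $n-1$, matching the definition of $S$ above; without the $\frac{n-p-2}{n-1}$ correction, the average of $S^{-1}$ would overshoot $\Sigma^{-1}$ noticeably at this sample size.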
To prove $(\star)$, we use the following theorem:
If $A\sim W_p(n,\Sigma)$ where $n$ is a positive integer and $n\ge p$, then for any fixed non-zero vector $a\in \mathbb R^p$, $$\frac{a'\Sigma^{-1}a}{a'A^{-1}a}\sim \chi^2_{n-p+1}$$
Using the above, we have for every fixed non-zero $a$,
\begin{align}
E\left[a'A^{-1}a\right]&=\left(a'\Sigma^{-1}a\right) E\left[\frac1{\chi^2_{n-p+1}}\right]
\\&=\frac{a'\Sigma^{-1}a}{n-p-1} \quad,\quad\quad \small\text{ if }n>p+1,
\end{align}
since $E\left[1/\chi^2_k\right]=1/(k-2)$ whenever $k>2$.
Since this holds for all $a$, and both $E\left[A^{-1}\right]$ and $\Sigma^{-1}$ are symmetric, the two matrices must agree (take $a=e_i$ for the diagonal entries and $a=e_i+e_j$ for the off-diagonal ones). Hence,
$$E\left[A^{-1}\right]=\frac{\Sigma^{-1}}{n-p-1} \quad,\quad\quad \small\text{ if }n>p+1$$
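This identity can also be checked numerically by constructing Wishart draws directly as $A=Z'Z$, where the rows of $Z$ are i.i.d. $N_p(0,\Sigma)$. A short sketch (with an illustrative $\Sigma$ of my own choosing):

```python
import numpy as np

rng = np.random.default_rng(1)
p, m, reps = 3, 15, 20000            # m > p + 1, as the theorem requires

# An illustrative positive-definite Sigma (an assumption for this check)
Sigma = np.array([[2.0, 0.5, 0.2],
                  [0.5, 1.5, 0.3],
                  [0.2, 0.3, 1.0]])
L = np.linalg.cholesky(Sigma)
target = np.linalg.inv(Sigma) / (m - p - 1)

acc = np.zeros((p, p))
for _ in range(reps):
    Z = rng.standard_normal((m, p)) @ L.T  # rows are i.i.d. N_p(0, Sigma)
    acc += np.linalg.inv(Z.T @ Z)          # Z'Z ~ W_p(m, Sigma)

err = np.max(np.abs(acc / reps - target))
print(err)                           # small Monte Carlo error
```

The average of $A^{-1}$ should match $\Sigma^{-1}/(m-p-1)$ up to simulation noise.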
For these results and much more on the estimation of the precision matrix under normality, see *Aspects of Multivariate Statistical Theory* by R. J. Muirhead.