There are two parts to this question:
Should the asymptotic variance-covariance matrix be positive definite? The answer is yes, although in some situations this can probably be weakened to positive semi-definite. If the asymptotic VCE has negative eigenvalues, then the asymptotic distribution of the test statistic places positive probability on the negative half-line -- in other words, the test statistic can take negative values, so none of the $\chi^2$ results can apply. If the asymptotic VCE has non-negative eigenvalues, some of which are zero, this problem does not bite you, but you face another one: figuring out the degrees of freedom (= the number of strictly positive eigenvalues). If the spectrum looks like $\{4, 1, 0.01, 10^{-5}\}$, is that last eigenvalue converging to a valid positive limit, is it computer round-off error away from zero, or is it genuinely non-zero in this finite sample but converging to a zero eigenvalue eventually?
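To see how slippery the "count the strictly positive eigenvalues" step is in practice, here is a minimal sketch with a made-up matrix whose spectrum matches the $\{4, 1, 0.01, 10^{-5}\}$ example above. The tolerance choices are assumptions, not a recommendation; the point is that the numerical rank depends on them.

```python
import numpy as np

# Hypothetical estimated VCE difference (made up for illustration),
# with the spectrum {4, 1, 0.01, 1e-5} discussed in the text.
D = np.diag([4.0, 1.0, 0.01, 1e-5])

eigvals = np.linalg.eigvalsh(D)  # ascending order for symmetric matrices

# A tight cutoff in the spirit of numpy.linalg.matrix_rank's default:
tol_tight = eigvals.max() * D.shape[0] * np.finfo(float).eps
df_tight = int((eigvals > tol_tight).sum())

# A looser, also arbitrary, relative cutoff:
df_loose = int((eigvals > 1e-4 * eigvals.max()).sum())

print("eigenvalues:", eigvals)
print("degrees of freedom, tight tolerance:", df_tight)
print("degrees of freedom, loose tolerance:", df_loose)
```

The tight tolerance keeps the $10^{-5}$ eigenvalue (4 degrees of freedom); the loose one discards it (3 degrees of freedom). Nothing in the finite sample tells you which answer is right.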
In finite samples, little is guaranteed. Sometimes you will have a positive definite matrix in the middle of the Hausman test, and things will be fine. Sometimes you will get a non-pd matrix when you subtract two variance estimators; this could be a small-sample effect, or it could indicate that your model is misspecified, so that what you believe is an asymptotically efficient estimator may not actually be one.
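A small numerical sketch of this failure mode, with all numbers invented for illustration: when the difference of the two VCEs has a negative eigenvalue, the Hausman quadratic form itself can come out negative, which is exactly the symptom described above.

```python
import numpy as np

# Hedged sketch: a Hausman-type statistic when V_robust - V_efficient
# fails to be positive definite in a finite sample. All values made up.
b_robust = np.array([1.2, 0.8])
b_efficient = np.array([1.0, 1.0])
V_robust = np.array([[0.30, 0.05],
                     [0.05, 0.20]])
V_efficient = np.array([[0.25, 0.04],
                        [0.04, 0.25]])

q = b_robust - b_efficient
D = V_robust - V_efficient           # = [[0.05, 0.01], [0.01, -0.05]]
eig = np.linalg.eigvalsh(D)
print("eigenvalues of the difference:", eig)  # one is negative here

# The Moore-Penrose pseudo-inverse handles rank deficiency, but a
# negative eigenvalue still poisons the statistic:
H = q @ np.linalg.pinv(D) @ q
df = np.linalg.matrix_rank(D)
print("Hausman statistic:", H, "with", df, "degrees of freedom")
```

Here the statistic is negative, so comparing it to a $\chi^2$ critical value is meaningless; the negative eigenvalue is the warning sign.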
In linear regression settings, including some of the instrumental variable models, you can push the linear algebra of the relevant matrices far enough to establish that the resulting matrix is positive definite by construction. The requirement is still there, but you avoid the guesswork of figuring out what is going on with that matrix.
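As an illustration of that algebra, consider the classical homoskedastic OLS-versus-2SLS comparison: since the projection onto the instruments satisfies $X'P_Z X \le X'X$ in the psd ordering, $(X'P_Z X)^{-1} - (X'X)^{-1}$ is positive semi-definite by construction. A simulated check (data-generating process and sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
Z = rng.normal(size=(n, 3))                 # instruments
x = Z @ np.array([1.0, 0.5, 0.2]) + rng.normal(size=n)
X = np.column_stack([np.ones(n), x])        # constant + one regressor
Zf = np.column_stack([np.ones(n), Z])       # instruments include the constant

# Homoskedastic VCEs up to the common sigma^2 factor:
V_ols = np.linalg.inv(X.T @ X)
Pz = Zf @ np.linalg.solve(Zf.T @ Zf, Zf.T)  # projection onto the instruments
V_2sls = np.linalg.inv(X.T @ Pz @ X)

eig = np.linalg.eigvalsh(V_2sls - V_ols)
print("smallest eigenvalue of the difference:", eig.min())
```

The smallest eigenvalue is non-negative up to round-off, so the $\chi^2$ machinery is safe here without any eigenvalue guesswork.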