Consider a regression problem $Y_i = X_i'\beta+e_i$ with $\beta \in \mathbb{R}^p$, where the $e_i$'s are i.i.d. with $E(e_i)=0$ and $Var(e_i)=\sigma^2<\infty$.
My question concerns the (asymptotic) variance of the estimator of $\sigma^2$, namely $\hat{\sigma}^2=\frac{1}{n}\sum_{i=1}^n(Y_i - \hat{Y}_i)^2=\frac{1}{n}RSS$, where $\hat{Y}_i = X_i'\hat{\beta}$ are the fitted values.
Under the normality assumption, i.e. $e_i\sim N(0,\sigma^2)$, we know that $\frac{RSS}{\sigma^2} \sim \chi^2_{n-p}$ (for example, see an explanation). Since $Var(\chi^2_{n-p})=2(n-p)$, this means that $Var(\hat{\sigma}^2)=\frac{1}{n^2}Var(RSS)=\frac{2\sigma^4(n-p)}{n^2}$, which is $O\left( \frac{1}{n} \right)$.
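As a quick sanity check (a sketch, not part of the question itself), the formula $Var(\hat{\sigma}^2)=\frac{2\sigma^4(n-p)}{n^2}$ can be verified by simulation under normal errors. The design matrix, coefficient vector, sample size, and replication count below are all arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, sigma2, reps = 50, 3, 1.0, 20000

X = rng.standard_normal((n, p))      # arbitrary fixed design matrix
beta = np.ones(p)                    # arbitrary true coefficients

sigma2_hats = np.empty(reps)
for r in range(reps):
    e = rng.standard_normal(n) * np.sqrt(sigma2)   # normal errors
    y = X @ beta + e
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta_hat) ** 2)
    sigma2_hats[r] = rss / n                       # hat{sigma}^2 = RSS / n

empirical = sigma2_hats.var()
theoretical = 2 * sigma2**2 * (n - p) / n**2
print(empirical, theoretical)
```

With these settings the empirical variance of $\hat{\sigma}^2$ across replications should land close to the theoretical value $2(n-p)/n^2 = 0.0376$, up to Monte Carlo error.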
Question
Under what condition(s) can we ensure that $Var(\hat{\sigma}^2)=O\left( \frac{1}{n} \right)$ without assuming normality of the $e_i$'s?
A related problem
For the i.i.d. case, this is true under a finite fourth moment. Let the $Z_i$'s be i.i.d. and $S^2=\frac{1}{n-1}\sum_{i=1}^n(Z_i - \bar{Z})^2$. Then $$\mbox{Var}(S^2)={\mu_4\over n}-{\sigma^4\,(n-3)\over n\,(n-1)}=O\left( \frac{1}{n} \right),$$where $\sigma^2=Var(Z)$ and $\mu_4=\mathbb{E}[(Z-E(Z))^4]$. (The $\frac{1}{n}$-divisor version differs only by the factor $\left(\frac{n-1}{n}\right)^2$, so it is also $O\left( \frac{1}{n} \right)$.) For example, see an explanation.
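This formula can likewise be checked numerically. The sketch below uses $Z_i \sim \mathrm{Exp}(1)$, a deliberately non-normal choice for which $\sigma^2 = 1$ and $\mu_4 = 9$, together with the divisor-$(n-1)$ sample variance, for which the displayed formula is exact; the sample size and replication count are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 100, 30000

# Z_i ~ Exp(1): sigma^2 = 1, mu_4 = E[(Z - 1)^4] = 9
sigma2, mu4 = 1.0, 9.0

Z = rng.exponential(1.0, size=(reps, n))
s2 = Z.var(axis=1, ddof=1)     # unbiased sample variance, one per replication

empirical = s2.var()
theoretical = mu4 / n - sigma2**2 * (n - 3) / (n * (n - 1))
print(empirical, theoretical)
```

For these settings the theoretical value is $\frac{9}{100} - \frac{97}{9900} \approx 0.0802$, and the empirical variance across replications should agree with it up to Monte Carlo error.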