Suppose I have the following regression model:
$Y = b_1 \times T + b_2 \times Z + b_3 \times T\times Z + \epsilon$
where $T$ is a randomly assigned treatment condition and $Z$ is some covariate. I want to test the hypothesis that $b_1 = 0$, using the coefficient covariance matrix
$s^2_{b} = \sigma^2 (X'X)^{-1}$
and taking the diagonal element corresponding to $b_1$ for $s^2_{b_1}$.
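(For concreteness, here's roughly the calculation I have in mind, sketched in NumPy on made-up, fully observed data; note I've added an intercept column, which isn't written in the model above:)

```python
import numpy as np

# Toy sketch of the test if X were fully observed (made-up data).
rng = np.random.default_rng(0)
n = 200
T = rng.integers(0, 2, size=n).astype(float)    # randomized treatment
Z = rng.normal(size=n)                          # covariate
X = np.column_stack([np.ones(n), T, Z, T * Z])  # design matrix (intercept added)
Y = 0.5 * Z + 0.25 * T * Z + rng.normal(size=n)

XtX_inv = np.linalg.inv(X.T @ X)
b_hat = XtX_inv @ X.T @ Y                       # OLS estimates
resid = Y - X @ b_hat
sigma2_hat = resid @ resid / (n - X.shape[1])   # estimate of sigma^2
se_b1 = np.sqrt(sigma2_hat * XtX_inv[1, 1])     # s_{b1}: SE of the T coefficient
t_b1 = b_hat[1] / se_b1                         # t statistic for H0: b1 = 0
```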
However, the design matrix $X$ has missing data, and the missingness is a function of $Z$. But I do have a corrected variance–covariance matrix $\Sigma$ (obtained via the Pearson–Lawley correction formula). I know there's a relationship between $X'X$ and $\Sigma$; if I remember right, it's
$\Sigma = \frac{1}{n}\Big[X-\frac{1}{n}ee'X\Big]'\Big[X-\frac{1}{n}ee'X\Big]$
where $e$ is an $n \times 1$ vector of ones. But alas, I have the corrected version of $\Sigma$, not $X$. Is there any way to go from $\Sigma$ back to $X$ (or at least to $X'X$)?
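(In case it's useful, here's a quick NumPy check I did on toy, fully observed data to convince myself that the expression above is just the usual $1/n$-divisor covariance of the columns of $X$:)

```python
import numpy as np

# Check that Sigma = (1/n) [X - (1/n) e e' X]' [X - (1/n) e e' X]
# equals the usual (1/n-divisor) covariance matrix of the columns of X.
rng = np.random.default_rng(1)
n, p = 100, 3
X = rng.normal(size=(n, p))
e = np.ones((n, 1))

Xc = X - (e @ e.T @ X) / n                      # column-centered X
Sigma = Xc.T @ Xc / n
assert np.allclose(Sigma, np.cov(X, rowvar=False, bias=True))
```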