Using the method of moments, one can try to approximate a sum of scaled $\chi_{r}^{2}$ variables, $\sum a_{i}Y_{i}$, by equating the $n$-th moments of the sample with the $n$-th moments of the population, and "solving" for the parameters this way. However, I am stuck on the derivation of the Satterthwaite approximation. The authors (Casella & Berger) suggest the following (page 314):
"..to do this we must match second moments, and we need $$\mathrm{E}\left(\sum^{k}_{i=1}a_{i}Y_{i}\right)^{2}=\mathrm{E}\left(\frac{\chi_{v}^{2}}{v}\right)^{2}=\frac{2}{v}+1$$
Applying the method of moments, we can drop the first expectation and solve for $v$, yielding $$\hat{v}=\frac{2}{\left(\sum^{k}_{i=1}a_{i}Y_i\right)^{2}-1}$$"
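To see what is going on numerically, here is a small sketch of the moment match (the weights $a_i$ and degrees of freedom $r_i$ are arbitrary choices of mine, purely for illustration). With $Y_i=\chi^2_{r_i}/r_i$ independent and $\sum a_i=1$, the second moment is exactly $\mathrm{E}\left(\sum a_iY_i\right)^2 = 1 + 2\sum a_i^2/r_i$, so matching it to $1+2/v$ gives $v = 1/\sum(a_i^2/r_i)$; dropping the expectation instead plugs in a single realization:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (hypothetical) setup: Y_i = chi2(r_i)/r_i, weights a_i sum to 1
r = np.array([5, 10, 20])
a = np.array([0.5, 0.3, 0.2])

# Exact second moment: E(sum a_i Y_i)^2 = 1 + 2*sum(a_i^2 / r_i),
# so matching to 1 + 2/v gives v = 1 / sum(a_i^2 / r_i).
nu_exact = 1.0 / np.sum(a**2 / r)

# Monte Carlo version of the expectation (the "E" the book drops):
n = 200_000
Y = rng.chisquare(r, size=(n, 3)) / r   # each row: independent (Y_1, Y_2, Y_3)
T = Y @ a                               # realizations of sum a_i Y_i
nu_mc = 2.0 / (np.mean(T**2) - 1.0)     # close to nu_exact for large n

# Method of moments with a *single* realization (expectation dropped):
nu_hat = 2.0 / (T[0]**2 - 1.0)          # noisy, and can even be negative
print(nu_exact, nu_mc, nu_hat)
```

So the "dropped" expectation is really replaced by an empirical moment based on one observation of the statistic, which is why the resulting $\hat{v}$ can misbehave (e.g. come out negative), as the book itself notes afterwards.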
My naive question is: why can we drop the expectation at all? This is not clear from the authors' description of the method of moments, in which one merely equates $$m_{j}=\frac{1}{n}\sum^{n}_{i=1}X_{i}^{j}\text{ with } \mathrm{E}X^{j},$$ and it seems clear to me that the expectation sign cannot simply be dropped. Similarly, in the last step of the derivation of the approximation formula, the authors suggest:
"..substituting this expression for the variance and removing the expectations, we obtain...."(page 315)
Can anyone give a hint? Sorry if the question is really basic.
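For comparison, here is the textbook version of the method of moments that I do understand, in a toy example of my own (estimating the degrees of freedom of a single $\chi^2_{v}$ variable from its first moment, $\mathrm{E}X = v$):

```python
import numpy as np

rng = np.random.default_rng(1)
nu_true = 7

# Method of moments for X ~ chi2(nu): E X = nu, so nu_hat = sample mean.
x = rng.chisquare(nu_true, size=10_000)
nu_hat_many = x.mean()   # many observations: close to nu_true

# With a single observation, the "sample mean" is just that observation:
nu_hat_one = x[0]        # the n = 1 case, where equating m_1 with EX
                         # amounts to "dropping" the expectation sign
print(nu_hat_many, nu_hat_one)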
Edit:
A fellow here suggested that the method of moments assumes $E(Z)=Z$ because one equates the two moments. I do not think this follows directly from the definition. Even when $j=1$, one equates $\frac{1}{n}\sum^{n}_{i=1}X_{i}$ with $\mathrm{E}X^{1}$. I do not think this implies $E(Z)=Z$ in general, in a way that would let one take $Z=\sum a_{i}Y_{i}$.