
The Welch–Satterthwaite equation (Wikipedia) gives approximate degrees of freedom when the populations do not have equal variances; the result is obtained by matching moments. However, according to Scheffé [H. Scheffé, The Analysis of Variance, 1959] and some more recent texts, degrees of freedom can and should be understood as the dimension of the space spanned by canonical estimable functions (linear parametric functions with unbiased estimators).
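In its usual two-sample form (writing $s_i^2$ for the sample variances and $n_i$ for the sample sizes, notation not fixed above), the approximation is
$$\nu \;\approx\; \frac{\left(s_1^2/n_1 + s_2^2/n_2\right)^2}{\dfrac{\left(s_1^2/n_1\right)^2}{n_1-1} + \dfrac{\left(s_2^2/n_2\right)^2}{n_2-1}}.$$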

My question is: how can the degrees of freedom given by the Welch–Satterthwaite approximation be interpreted in terms of a basis of a vector space? The value is usually non-integral, so it cannot be the dimension of any vector space, which seems very unnatural. Is there a reference that treats this problem?


1 Answer


Your assertions about interpretation of degrees of freedom will hold under certain conditions. Those conditions don't apply here.

We aren't actually dealing with a t-distribution for the statistic at all; the thing under the square root in the denominator isn't a scaled chi-square.
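To spell that out in the usual two-sample notation (the symbols below aren't fixed in the question itself): the Welch statistic is
$$t^* = \frac{\bar X_1 - \bar X_2}{\sqrt{s_1^2/n_1 + s_2^2/n_2}},$$
and under normality the quantity under the square root is distributed as
$$\frac{\sigma_1^2}{n_1(n_1-1)}\,\chi^2_{n_1-1} + \frac{\sigma_2^2}{n_2(n_2-1)}\,\chi^2_{n_2-1},$$
a weighted sum of independent chi-squares, which is not itself a multiple of a single chi-square unless the weights happen to coincide.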

You do have a distribution for that term whose shape is affected by the relative sizes of the variances and by the sample sizes. However, to a rough approximation we can treat it as a (scaled) chi-square distribution whose shape parameter is approximated by the degrees of freedom given by the formula (which is to say there are two levels of approximation at this point: first in saying "it's sort of like a chi-square", and second in saying "with degrees of freedom computed this way").
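If you want to see the approximation numerically, here's a minimal sketch in Python (using NumPy and SciPy; the helper name `welch_satterthwaite_df` and the particular sample sizes are just illustrative). It computes the moment-matched degrees of freedom for two simulated samples, which is generally non-integral, and `scipy.stats.ttest_ind` with `equal_var=False` applies the same approximation:

```python
import numpy as np
from scipy import stats

def welch_satterthwaite_df(s1_sq, n1, s2_sq, n2):
    """Moment-matched (Welch-Satterthwaite) approximate degrees of freedom."""
    num = (s1_sq / n1 + s2_sq / n2) ** 2
    den = (s1_sq / n1) ** 2 / (n1 - 1) + (s2_sq / n2) ** 2 / (n2 - 1)
    return num / den

# Two hypothetical samples with unequal variances and sample sizes
rng = np.random.default_rng(0)
n1, n2 = 8, 15
x = rng.normal(loc=0.0, scale=1.0, size=n1)
y = rng.normal(loc=0.0, scale=3.0, size=n2)

nu = welch_satterthwaite_df(x.var(ddof=1), n1, y.var(ddof=1), n2)
print(nu)  # typically non-integral, between min(n1, n2) - 1 and n1 + n2 - 2

# SciPy's Welch t-test uses the same approximation for its p-value
print(stats.ttest_ind(x, y, equal_var=False))
```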

  • Yes, we can treat it as approximately a scaled chi-square, but I am curious under what conditions Scheffé's interpretation holds. Earlier today I was thinking about a hierarchy of vector spaces in which to place this interpretation, and I do not expect the approximation alone to explain it. Is there a deeper relation, or does the interpretation only work when we regard this approximate degrees of freedom as belonging to an assumed (chi-square) distribution? And thanks for your edit and revision. :) – Henry.L Nov 02 '15 at 02:49
  • To be more precise, what are those conditions you mentioned? Is there a more recent reference? Thanks – Henry.L Nov 02 '15 at 02:50
  • "I am curious under what conditions Scheffé's interpretation holds" ... presumably in the circumstances he was discussing, in the context in which he was using the term. (Was he talking about degrees of freedom for sums of squares in some i.i.d. normal situation, say?) You might like to refer to whuber's answer [here](http://stats.stackexchange.com/questions/16921/how-to-understand-degrees-of-freedom/17148#17148) for some discussion of degrees of freedom. – Glen_b Nov 02 '15 at 02:57
  • I will have to refer to Kendall's volumes mentioned in whuber's answer before I can form my own judgement. Another example that may support your view is the use of the Satterthwaite approximation in Tukey's HSD when the equal-variance assumption fails to hold. Thanks – Henry.L Nov 02 '15 at 04:21