
Suppose I have a data frame and performed a linear regression analysis like this:

df <- data.frame(
    y  = rnorm(100, 2, 3),
    x1 = rbinom(100, 1, 0.3),
    x2 = sample(1:3, 100, replace = TRUE),
    x3 = rnorm(100, 40, 3)
)

require(car)

mod <- lm(y ~ x1 + x2 + x3, data = df)

t tests

coef(summary(mod))
            Estimate Std. Error  t value Pr(>|t|)
(Intercept)  1.75062     3.9944  0.43827 0.662176
x1           0.02640     0.7141  0.03697 0.970583
x2           0.93436     0.3453  2.70619 0.008055
x3          -0.05201     0.1010 -0.51501 0.607731

F tests

Anova(mod)
Anova Table (Type II tests)

Response: y
          Sum Sq Df F value Pr(>F)   
x1             0  1    0.00 0.9706   
x2            59  1    7.32 0.0081 **
x3             2  1    0.27 0.6077   
Residuals    778 96                  
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

As expected, the F values are the squared t values and the p values are the same. I'm wondering whether the t test from lm is the same as the Type II one-degree-of-freedom F test. Are there any differences between them?

David Z

1 Answer


A one-degree-of-freedom F statistic is equal to the square of the corresponding t statistic. However, in general the t tests from lm match the Type III sums of squares, not Type II. When the model has no interactions, though, Type II and Type III are the same (as you can see from your results).

Instead of writing out the proof of the equivalence, see the answer here: Prove F test is equal to T test squared
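You can also verify the equivalence numerically. A minimal sketch (the seed is arbitrary, added only for reproducibility; otherwise this reuses the model from the question, which has no interactions, so Type II and Type III coincide):

```r
# Rebuild the data and model from the question
set.seed(1)
df <- data.frame(
    y  = rnorm(100, 2, 3),
    x1 = rbinom(100, 1, 0.3),
    x2 = sample(1:3, 100, replace = TRUE),
    x3 = rnorm(100, 40, 3)
)
mod <- lm(y ~ x1 + x2 + x3, data = df)

# t statistics for the predictors (drop the intercept row)
t_vals <- coef(summary(mod))[-1, "t value"]

# Type II F statistics for the same three terms
f_vals <- car::Anova(mod)[1:3, "F value"]

# Each F equals the corresponding squared t
all.equal(unname(t_vals^2), f_vals)  # TRUE
```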

le_andrew