I would like to know: when we standardize the data and then apply linear regression or Bayesian regression, do we need an intercept or not? Or does this have nothing to do with standardization?
https://stats.stackexchange.com/questions/201919/utility-of-the-frisch-waugh-theorem – Christoph Hanck Feb 11 '22 at 10:21
1 Answer
You don't need it, no. You can see this yourself in R:
# Build a linear regression on the mtcars dataset built into R
lm_unscaled_data <- lm(mpg ~ ., data = mtcars)
summary(lm_unscaled_data)
Looking at the summary, the intercept has an estimate of 12.3 with a standard error of 18.7.
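If you prefer to pull those numbers out programmatically rather than reading the printed summary, the coefficient table of the summary object can be indexed directly (a small convenience sketch, not part of the original answer):

# estimate, std. error, t value and p value for the intercept
summary(lm_unscaled_data)$coefficients["(Intercept)", ]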
Now we z-score all the columns so they each have a mean of 0 and a standard deviation of 1. (The reported mean may print as something like 1e-17 rather than exactly 0, but that is just floating-point rounding.)
# z-score every column, rebuild the data frame, and refit
mtcars_scaled <- data.frame(lapply(mtcars, scale))
lm_scaled_data <- lm(mpg ~ ., data = mtcars_scaled)
summary(lm_scaled_data)
The intercept estimate is now numerically zero and its p-value is essentially 1. That is expected: the least-squares intercept equals the mean of the response minus the fitted slopes times the means of the predictors, and after standardizing all of those means are 0.
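As an extra check (my addition, not part of the original answer, using a hypothetical object name lm_scaled_no_intercept), you could refit the scaled model without an intercept and confirm that the slope estimates match the intercept model up to floating-point error:

# refit without an intercept (the "+ 0" term drops it) and compare slopes
lm_scaled_no_intercept <- lm(mpg ~ . + 0, data = mtcars_scaled)
all.equal(coef(lm_scaled_no_intercept), coef(lm_scaled_data)[-1])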

Evolving_Richie