
I was reading the book An Introduction to Statistical Learning with R (http://www-bcf.usc.edu/~gareth/ISL/data.html), and came across an expression that I haven't seen before. Can anyone tell me what this means:

$$ {\bar x^{2} \over \sum_{i=1}^n(x_i - \bar x)^2} $$

The full equation (where I saw it) is: $$ SE(\hat \beta)^2 = \sigma^2 \left[{1 \over n} + {\bar x^{2} \over \sum_{i=1}^n(x_i - \bar x)^2}\right] $$

FYI, this is the equation for calculating the standard error of the coefficients of linear regression (ML).

Cam.Davidson.Pilon

2 Answers


Your expression is a formula for the standard error of the estimated intercept $\beta$ in terms of the sample mean and the observations.

$\sigma$ stands for the standard deviation of the errors $\epsilon$, assuming a model of the form $y=\beta + \beta_1 x + \epsilon$.

$\epsilon$ is assumed to follow a normal distribution $N(0, \sigma^2)$.

The part you highlight is just the square of the sample mean divided by $n \cdot \mathrm{Var}(x)$, where $\mathrm{Var}(x)$ is the (biased) sample variance $\frac{1}{n}\sum_{i=1}^n (x_i - \bar x)^2$.
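A quick numerical sketch (in Python/NumPy rather than R, on simulated data) can confirm both points: the denominator equals $n \cdot \mathrm{Var}(x)$ with the biased variance, and the quoted formula for $SE(\hat\beta)^2$ agrees with the intercept entry of the general OLS covariance $\hat\sigma^2 (X^\top X)^{-1}$.

```python
import numpy as np

# Simulated data (values here are arbitrary, just for illustration)
rng = np.random.default_rng(0)
n = 50
x = rng.normal(5.0, 2.0, size=n)
y = 2.0 + 3.0 * x + rng.normal(0.0, 1.0, size=n)

xbar = x.mean()
Sxx = np.sum((x - xbar) ** 2)        # sum of squared deviations

# The denominator is n * Var(x), using the biased (1/n) variance:
assert np.isclose(Sxx, n * x.var())

# OLS fit and the usual unbiased estimate of sigma^2 from residuals
X = np.column_stack([np.ones(n), x])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat
sigma2_hat = resid @ resid / (n - 2)

# SE(beta_0_hat)^2 from the quoted formula...
se_b0_sq = sigma2_hat * (1.0 / n + xbar ** 2 / Sxx)

# ...matches the intercept entry of sigma^2 * (X'X)^{-1}
cov = sigma2_hat * np.linalg.inv(X.T @ X)
assert np.isclose(se_b0_sq, cov[0, 0])
```

The agreement follows algebraically from $\sum x_i^2 = S_{xx} + n\bar x^2$, which makes the $(0,0)$ entry of $(X^\top X)^{-1}$ equal to $\frac{1}{n} + \frac{\bar x^2}{S_{xx}}$.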

David

This is an expression that most statistics students are taught to derive when learning linear regression. I'm not sure that the term has a great interpretation on its own; it simply falls out of the derivation. I would take a look at this thread to see how it pops up: Derive Variance of regression coefficient in simple linear regression
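In brief, the derivation in that thread goes roughly as follows: since $\hat\beta_0 = \bar y - \hat\beta_1 \bar x$ and $\bar y$ is uncorrelated with $\hat\beta_1$,

$$ \mathrm{Var}(\hat\beta_0) = \mathrm{Var}(\bar y) + \bar x^2\,\mathrm{Var}(\hat\beta_1) = \frac{\sigma^2}{n} + \bar x^2 \cdot \frac{\sigma^2}{\sum_{i=1}^n (x_i - \bar x)^2} = \sigma^2\left[\frac{1}{n} + \frac{\bar x^2}{\sum_{i=1}^n (x_i - \bar x)^2}\right] $$

so the term the question asks about is exactly the $\bar x^2\,\mathrm{Var}(\hat\beta_1)/\sigma^2$ contribution.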

Emma Jean