
I am new to statistics and am trying to prove that in simple linear regression, $SS_{\text{Regression}}/\sigma^2$ can be expressed as the square of a standard normal variable. (Here $SS_{\text{Regression}} = \sum_{i=1}^n(\hat Y_i - \bar Y)^2$ is the sum of squares due to regression.)

I tried looking up Cochran's theorem, but its heavy matrix theory is too much for me right now. Is there a simpler proof for the case of simple linear regression?

Help is deeply appreciated!

MathMan
  • https://stats.stackexchange.com/q/362590/119261 – StubbornAtom Mar 27 '21 at 06:56
  • @StubbornAtom This link seems to show that $MS_{\text{Residuals}}$ follows a chi-square distribution with $n-2$ degrees of freedom, whereas my question is about the distribution of $MS_{\text{Regression}}$. – MathMan Mar 27 '21 at 16:04

1 Answer


Consider the model $$y_i=\alpha+\beta x_i+\varepsilon_i\quad,\,i=1,2,\ldots,n$$

The least squares estimators of $\alpha$ and $\beta$ are then given by

$$\hat\alpha=\bar y-\hat\beta \bar x\qquad,\qquad\hat\beta=\frac{s_{xy}}{s_{xx}}$$

where $$s_{xy}=\sum_{i=1}^n (x_i-\bar x)(y_i-\bar y) \quad \text{ and } \quad s_{xx}=\sum_{i=1}^n (x_i-\bar x)^2$$

When the errors $\varepsilon_i$ are distributed as independent $N(0,\sigma^2)$ variables, we have

$$\frac{\hat\beta \sqrt{s_{xx}}}{\sigma}\sim N\left(\frac{\beta \sqrt{s_{xx}}}{\sigma},1 \right) \tag{1}$$
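For completeness, here is the step behind $(1)$, spelled out from the definitions already given: since

$$\hat\beta=\frac{s_{xy}}{s_{xx}}=\sum_{i=1}^n \frac{x_i-\bar x}{s_{xx}}\,y_i \qquad \left(\text{using } \textstyle\sum_i (x_i-\bar x)=0\right)$$

is a linear combination of the independent normal variables $y_i$, it follows that $\hat\beta\sim N\left(\beta,\sigma^2/s_{xx}\right)$, and multiplying by $\sqrt{s_{xx}}/\sigma$ standardizes the variance to $1$, giving $(1)$.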

Now, substituting $\hat\alpha=\bar y-\hat\beta\bar x$,

$$\hat y_i=\hat\alpha+\hat\beta x_i=\bar y+\hat\beta(x_i-\bar x)$$

Therefore, $$\frac{SSR}{\sigma^2}=\frac1{\sigma^2}\sum_{i=1}^n (\hat y_i-\bar y)^2=\frac{\hat\beta^2 s_{xx}}{\sigma^2}=\left(\frac{\hat\beta \sqrt{s_{xx}}}{\sigma}\right)^2$$

From $(1)$, it follows that under normality of the errors, $SSR/\sigma^2$ has a noncentral $\chi^2$ distribution with $1$ degree of freedom and noncentrality parameter $\beta^2 s_{xx}/\sigma^2$. So it is not true in general that $SSR/\sigma^2$ can be expressed as the square of a standard normal variable; that holds only when $\beta=0$, in which case the noncentrality vanishes and $SSR/\sigma^2\sim\chi^2_1$.
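Not part of the original answer, but here is a quick Monte Carlo sketch in Python (all parameter values below are arbitrary illustrative choices) that checks $SSR/\sigma^2$ against the first two moments of a noncentral $\chi^2_1$ with noncentrality $\beta^2 s_{xx}/\sigma^2$:

```python
import numpy as np

# Monte Carlo check (illustrative parameters, not from the answer):
# SSR / sigma^2 should match a noncentral chi-square with 1 df and
# noncentrality lam = beta^2 * s_xx / sigma^2.
rng = np.random.default_rng(seed=42)
alpha, beta, sigma = 1.0, 0.5, 2.0        # hypothetical true parameters
x = np.linspace(0.0, 10.0, 20)            # fixed design points
xc = x - x.mean()
s_xx = np.sum(xc**2)
lam = beta**2 * s_xx / sigma**2           # theoretical noncentrality

n_sims = 100_000
y = alpha + beta * x + rng.normal(0.0, sigma, size=(n_sims, x.size))
y_bar = y.mean(axis=1, keepdims=True)
beta_hat = ((y - y_bar) @ xc) / s_xx      # least-squares slope, per replication
y_hat = y_bar + np.outer(beta_hat, xc)    # fitted values
ssr = np.sum((y_hat - y_bar) ** 2, axis=1)

z = ssr / sigma**2
# A noncentral chi-square with k df and noncentrality lam has
# mean k + lam and variance 2*(k + 2*lam); here k = 1.
print(f"mean: empirical {z.mean():.3f} vs theory {1 + lam:.3f}")
print(f"var:  empirical {z.var():.3f} vs theory {2 * (1 + 2 * lam):.3f}")
```

Setting `beta = 0.0` makes the noncentrality vanish, and the empirical moments should then approach $1$ and $2$, the mean and variance of a central $\chi^2_1$, i.e. the square of a standard normal.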

StubbornAtom