I am comparing linear regression with and without intercept for the general sampling case. For this, I have $n$ i.i.d. samples of two jointly normal random variables $X \sim N\left(0,\sigma_X^2\right)$ and $Y \sim N\left(0, \sigma_Y^2\right)$ with correlation $\rho$.
For these random samples, I fit the linear regression models with and without intercept,
(1) $y_i=\alpha_0+\alpha_1 x_i+\epsilon_{1,i}$ and
(2) $y_i=\beta_1 x_i+\epsilon_{2,i}$.
Using numerical experiments, I have found that $E[\hat\alpha_1] = E[\hat\beta_1]$, which seems logical to me. However, I have also found that $\text{Var}(\hat\alpha_1) \neq \text{Var}(\hat\beta_1)$, which I am currently trying to understand.
In another question of mine, I have found that for the general sampling case $\text{Var}(\hat\alpha_1) = \frac{\sigma_Y^2}{\sigma_X^2} \frac{1-\rho^2}{n-3}$ for the model with intercept, and I am now trying to find $\text{Var}(\hat\beta_1)$.
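For concreteness, a minimal sketch of such a numerical experiment could look like the following (Python/NumPy; the parameter values, seed, and sample sizes are arbitrary choices for illustration):

```python
import numpy as np

# Minimal simulation sketch: draw many samples of size n from the bivariate
# normal described above, fit both models by least squares, and compare the
# slope estimates (parameter values are arbitrary).
rng = np.random.default_rng(0)
n, n_sim = 20, 200_000
sigma_x, sigma_y, rho = 1.0, 2.0, 0.5

cov = [[sigma_x**2,              rho * sigma_x * sigma_y],
       [rho * sigma_x * sigma_y, sigma_y**2             ]]
xy = rng.multivariate_normal([0.0, 0.0], cov, size=(n_sim, n))
x, y = xy[..., 0], xy[..., 1]

# Model (1): slope of the regression with intercept (centred cross-products)
xc = x - x.mean(axis=1, keepdims=True)
yc = y - y.mean(axis=1, keepdims=True)
alpha1 = (xc * yc).sum(axis=1) / (xc**2).sum(axis=1)

# Model (2): slope of the regression through the origin
beta1 = (x * y).sum(axis=1) / (x**2).sum(axis=1)

print(alpha1.mean(), beta1.mean())   # both close to rho * sigma_y / sigma_x
print(alpha1.var(), sigma_y**2 / sigma_x**2 * (1 - rho**2) / (n - 3))  # matches the formula above
print(beta1.var())                   # noticeably different from alpha1.var()
```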
Overall, I am therefore trying to find $\text{Var}(\hat\beta_1)=\text{Var}\left(\frac{\sum x_iy_i}{\sum x_i^2}\right)$.
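For completeness, this expression is just the least-squares solution of the no-intercept model, obtained from its normal equation:
$$-2\sum_{i=1}^n x_i\left(y_i-\beta_1 x_i\right)=0 \quad\Longrightarrow\quad \hat\beta_1=\frac{\sum_{i=1}^n x_i y_i}{\sum_{i=1}^n x_i^2}.$$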
The denominator is clearly gamma distributed, since $\sum x_i^2/\sigma_X^2 \sim \chi^2_n$. However, the distribution of the numerator, a sum of products of normally distributed random variables, is much harder to handle, let alone the distribution of the ratio.
Calculating $\text{Var}(\hat\beta_1)=E[\hat\beta_1^2] - E[\hat\beta_1]^2$ directly does not seem much easier to me.
After spending hours in the local university library and searching research papers, I am turning to CrossValidated for help (again).
Does somebody know a way to calculate the variance in question?