Let $Y_1,Y_2,\ldots,Y_n$ be independent random variables such that $Y_i\sim\mathcal N(\alpha+\beta x_i,\sigma^2)$ for all $i=1,\ldots,n$. If $\hat\alpha$ and $\hat\beta$ are the least squares estimates of $\alpha$ and $\beta$ respectively, what is the joint distribution of $(\hat\alpha,\hat\beta)$?
We consider the model $y=\alpha+\beta x+\epsilon$ where $y$ is stochastic and $x$ is non-stochastic.
We have the paired observations $(x_i,y_i)$ and we assume that the errors $\epsilon_i\stackrel{\text{i.i.d.}}{\sim}\mathcal{N}(0,\sigma^2)$ for all $i$.
Define $s_{xx}=\sum (x_i-\bar x)^2,\,s_{yy}=\sum(y_i-\bar y)^2$ and $s_{xy}=\sum(x_i-\bar x)(y_i-\bar y)$.
From the normal equations we have $\hat\alpha=\bar y-\hat\beta\bar x$ and $\hat\beta=\dfrac{s_{xy}}{s_{xx}}$.
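As a quick numerical sanity check of these closed-form estimates (a sketch with numpy on simulated data; the design points and parameter values are arbitrary choices of mine, not from the question), they agree with a generic least-squares fit:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical fixed (non-stochastic) design and true parameters
x = np.linspace(0.0, 10.0, 50)
alpha, beta, sigma = 2.0, 0.5, 1.0
y = alpha + beta * x + rng.normal(0.0, sigma, size=x.size)

# Closed-form least squares estimates from the normal equations
x_bar, y_bar = x.mean(), y.mean()
s_xx = np.sum((x - x_bar) ** 2)
s_xy = np.sum((x - x_bar) * (y - y_bar))
beta_hat = s_xy / s_xx
alpha_hat = y_bar - beta_hat * x_bar

# Cross-check against numpy's generic degree-1 least-squares fit
b_np, a_np = np.polyfit(x, y, deg=1)  # returns (slope, intercept)
assert np.isclose(beta_hat, b_np)
assert np.isclose(alpha_hat, a_np)
```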
Transform $\mathbf Y=(Y_1,\ldots,Y_n)^\top$ to $\mathbf Z=(Z_1,\ldots,Z_n)^\top=\mathbf{A\,Y}$, where $\mathbf A$ is an orthogonal matrix whose first two rows are $\left(\frac{1}{\sqrt n},\ldots,\frac{1}{\sqrt n}\right)$ and $\left(\frac{x_1-\bar x}{\sqrt{s_{xx}}},\ldots,\frac{x_n-\bar x}{\sqrt{s_{xx}}}\right)$, so that $Z_1=\sqrt n\,\bar y$ and $Z_2=\sqrt{s_{xx}}\,\hat\beta$.
From the distribution of the $Z_i$'s, one can show that $\bar y\sim\mathcal N\left(\alpha+\beta\bar x,\frac{\sigma^2}{n}\right)$ and $\hat\beta\sim\mathcal N\left(\beta,\frac{\sigma^2}{s_{xx}}\right)$, and that they are independent of each other.
From this, one gets $\hat\alpha\sim\mathcal{N}\left(\alpha,\sigma^2\left(\frac{1}{n}+\frac{\bar x^2}{s_{xx}}\right)\right)$.
So each of the two least squares estimates has a univariate normal distribution. They are certainly not independent in general; I have found their correlation to be $\text{Corr}(\hat\alpha,\hat\beta)=-\frac{\sqrt{n}\,\bar x}{\sqrt{\sum x_i^2}}$.
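These distributional claims (the variance of $\hat\alpha$, the independence of $\bar y$ and $\hat\beta$, and the correlation formula) can be checked by simulation. This is a Monte Carlo sketch with numpy; the design points, parameter values, and replication count are all arbitrary choices of mine:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical fixed design and true parameters (arbitrary choices)
x = np.linspace(0.0, 10.0, 30)
alpha, beta, sigma = 2.0, 0.5, 1.0
n = x.size
x_bar = x.mean()
s_xx = np.sum((x - x_bar) ** 2)

# Simulate many replications of the model and refit each one
n_sim = 200_000
Y = alpha + beta * x + rng.normal(0.0, sigma, size=(n_sim, n))
y_bar = Y.mean(axis=1)
# s_xy / s_xx per replication; centering Y is unnecessary since
# the weights (x_i - x_bar) sum to zero
beta_hat = Y @ (x - x_bar) / s_xx
alpha_hat = y_bar - beta_hat * x_bar

# y_bar and beta_hat should be (nearly) uncorrelated
assert abs(np.corrcoef(y_bar, beta_hat)[0, 1]) < 0.01

# Var(alpha_hat) should match sigma^2 (1/n + x_bar^2 / s_xx)
var_alpha_theory = sigma**2 * (1.0 / n + x_bar**2 / s_xx)
assert abs(alpha_hat.var() / var_alpha_theory - 1.0) < 0.02

# Corr(alpha_hat, beta_hat) should match -sqrt(n) x_bar / sqrt(sum x_i^2)
corr_theory = -np.sqrt(n) * x_bar / np.sqrt(np.sum(x**2))
corr_mc = np.corrcoef(alpha_hat, beta_hat)[0, 1]
assert abs(corr_mc - corr_theory) < 0.01
```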
But how can I find the joint distribution of $(\hat\alpha,\hat\beta)$ from this? Marginal normality alone does not let me conclude that they are jointly normal. They probably are jointly normal, but how does that follow?
The following related posts came up during a search, but they don't quite give the answer I am looking for:
Edit.
The joint distribution of $(\hat\alpha,\hat\beta)=(\bar y-\hat\beta\bar x,\hat\beta)$ is bivariate normal simply because $\bar y$ and $\hat\beta$ are independent normal variables (details here). Independent normal variables are jointly normal, and different linear combinations of jointly normal variables are themselves jointly normal; $(\hat\alpha,\hat\beta)$ is exactly such a pair of linear combinations of $(\bar y,\hat\beta)$.
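Combining this with the moments derived above gives the joint distribution explicitly. The off-diagonal entry follows from $\operatorname{Cov}(\hat\alpha,\hat\beta)=\operatorname{Cov}(\bar y-\hat\beta\bar x,\hat\beta)=-\bar x\operatorname{Var}(\hat\beta)$, using the independence of $\bar y$ and $\hat\beta$:
$$(\hat\alpha,\hat\beta)^\top\sim\mathcal N_2\left(\begin{pmatrix}\alpha\\ \beta\end{pmatrix},\;\sigma^2\begin{pmatrix}\dfrac{1}{n}+\dfrac{\bar x^2}{s_{xx}} & -\dfrac{\bar x}{s_{xx}}\\[1ex] -\dfrac{\bar x}{s_{xx}} & \dfrac{1}{s_{xx}}\end{pmatrix}\right)$$
Dividing the covariance by the two standard deviations recovers the correlation $-\frac{\sqrt n\,\bar x}{\sqrt{\sum x_i^2}}$ stated earlier.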