I suspect the answer is yes, but I'd appreciate help in proving it (even though we know the estimator is BLUE, so it should probably hold).
For context: inverse-variance weighting arises when we have several estimators $y_i$ of some parameter (say, $\mu$) and we know the variance $\sigma_i^2$ of each. In that case, the minimum-variance unbiased linear combination is the following weighted estimator:
$$ \hat{y} = \frac{\sum_i y_i / \sigma_i^2}{\sum_i 1/\sigma_i^2}$$
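(For concreteness, here is a minimal numerical sketch of that estimator; the data and variances below are made up purely for illustration.)

```python
import numpy as np

# Hypothetical example: three unbiased estimates of the same mu,
# each with a known variance sigma_i^2 (all values made up).
y = np.array([10.2, 9.8, 10.5])     # estimates y_i
var = np.array([1.0, 0.25, 4.0])    # known variances sigma_i^2

# Inverse-variance weighting: weights proportional to 1/sigma_i^2.
w = 1.0 / var
y_hat = np.sum(w * y) / np.sum(w)

# The combined variance, 1 / sum(1/sigma_i^2), is no larger than
# the smallest individual variance.
var_hat = 1.0 / np.sum(w)
print(y_hat, var_hat)
```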
At the same time, in simple linear regression, the intuition for the slope is that it is actually a weighted average of per-point slope estimates, as follows:
$$ \hat \beta = \frac{ \sum_{i=1}^n (x_i - \bar{x})(y_i - \bar{y}) }{ \sum_{i=1}^n (x_i - \bar{x})^2 } = \frac{ \sum_{i=1}^n (x_i - \bar{x})^2\frac{(y_i - \bar{y})}{(x_i - \bar{x})} }{ \sum_{i=1}^n (x_i - \bar{x})^2 } = \sum_{i=1}^n \frac{ (x_i - \bar{x})^2}{ \sum_{i=1}^n (x_i - \bar{x})^2 } \frac{(y_i - \bar{y})}{(x_i - \bar{x})} $$
That is, each observation gets the weight $w_i = \frac{ (x_i - \bar{x})^2}{ \sum_{i=1}^n (x_i - \bar{x})^2 }$.
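(A quick numerical check of this identity, using a simulated dataset of my own choosing:)

```python
import numpy as np

# Simulated data (arbitrary coefficients, for illustration only).
rng = np.random.default_rng(0)
n = 50
x = rng.normal(size=n)
y = 2.0 + 3.0 * x + rng.normal(size=n)

xc, yc = x - x.mean(), y - y.mean()

# Standard OLS slope.
beta_ols = np.sum(xc * yc) / np.sum(xc**2)

# The same slope as a weighted average of per-point slopes
# (y_i - ybar)/(x_i - xbar), with weights w_i from above.
w = xc**2 / np.sum(xc**2)
beta_weighted = np.sum(w * yc / xc)

print(beta_ols, beta_weighted)  # identical up to floating-point error
```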
Hence the question: is this weighting in fact an inverse-variance weighting (under the usual normality assumption), and how can we show it exactly?
Asymptotic result
If we let the sample size go to infinity, then $\bar y \rightarrow \mu$. Hence:
$$ \operatorname{Var}\left(\frac{y_i - \bar{y}}{x_i - \bar{x}}\right) = \frac{1}{(x_i - \bar{x})^2} \operatorname{Var}(y_i - \mu) = \frac{\sigma^2}{(x_i - \bar{x})^2} $$
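(A Monte Carlo sketch of this variance claim, under an assumed fixed design and Gaussian errors; for large $n$ the simulated variance of the per-point slope should be close to $\sigma^2/(x_i - \bar{x})^2$:)

```python
import numpy as np

# Assumed setup: fixed design x, y_i = b0 + b1*x_i + N(0, sigma^2) noise.
rng = np.random.default_rng(1)
n, sigma, reps = 200, 1.5, 20_000
x = np.linspace(-3.0, 3.0, n)
xc = x - x.mean()

i = 10  # track one arbitrary, fixed observation
slopes_i = np.empty(reps)
for r in range(reps):
    y = 1.0 + 2.0 * x + rng.normal(scale=sigma, size=n)
    slopes_i[r] = (y[i] - y.mean()) / xc[i]

print(slopes_i.var())           # simulated Var of the per-point slope
print(sigma**2 / xc[i]**2)      # the asymptotic value above
```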
This means we can think of the inverse-variance weight as:
$$w_i = \frac{ (x_i - \bar{x})^2 \frac{1}{\sigma^2} }{ \sum_{i=1}^n (x_i - \bar{x})^2 \frac{1}{\sigma^2}} = \frac{ (x_i - \bar{x})^2 }{ \sum_{i=1}^n (x_i - \bar{x})^2 } $$
So this shows that, in the $n \to \infty$ case, the slope estimator is indeed an inverse-variance weighted average. But I'm not sure how to handle finite samples, since there is a dependency structure between $y_i$ and $\bar{y}$. It probably requires going through the multivariate case (see here), but I'd love some help on how to do it.
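For what it's worth, the exact finite-sample moments behind that dependency are standard: with a fixed design and i.i.d. errors,
$$ \operatorname{Var}(y_i - \bar{y}) = \sigma^2\left(1 - \frac{1}{n}\right), \qquad \operatorname{Cov}(y_i - \bar{y},\, y_j - \bar{y}) = -\frac{\sigma^2}{n} \quad (i \neq j), $$
so the per-point slopes are correlated across observations, which is presumably why the multivariate route is needed.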
Thanks :)