An exercise asked me to derive properties of the linear model $$E[y_i]=\beta x_i,\qquad i=1,\dots,n,$$
where $Var[y_i]=\sigma^2$. In one of its parts, we had to calculate an estimator for $\beta$ using the least squares method, and I got
$$\hat{\beta}=\frac{\sum_{i=1}^n x_iy_i}{\sum_{i=1}^n x_i^2}$$
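For reference, this comes from the usual least-squares minimization; a quick sketch of the step that leads there:
$$S(\beta)=\sum_{i=1}^n (y_i-\beta x_i)^2,\qquad \frac{dS}{d\beta}=-2\sum_{i=1}^n x_i(y_i-\beta x_i)=0 \;\Longrightarrow\; \hat{\beta}=\frac{\sum_{i=1}^n x_iy_i}{\sum_{i=1}^n x_i^2}.$$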
which is quite similar to the estimator $\hat{\beta_1}$ obtained with the same method for the linear model $E[y_i]=\beta_0 +\beta_1x_i$, that is
$$\hat{\beta_1}=\frac{\sum_{i=1}^n y_i(x_i-\bar{x})}{\sum_{i=1}^n(x_i-\bar{x})^2},$$ since it is the same expression, except that the term $\bar{x}$ does not appear in $\hat{\beta}$. The same thing happened when I was asked to compute $Var[\hat{\beta}]$ and when finding the pivotal quantity for a confidence interval for $\beta$.
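To make the resemblance concrete, here is a quick numerical check I put together (a minimal sketch assuming NumPy; the data and variable names are made up for illustration). It verifies that $\hat{\beta_1}$ coincides with the through-origin formula applied to the centered data:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, size=50)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=50)

# Through-origin estimator: beta_hat = sum(x_i y_i) / sum(x_i^2)
beta_no_intercept = np.sum(x * y) / np.sum(x**2)

# Slope estimator for the model with an intercept:
# beta1_hat = sum(y_i (x_i - xbar)) / sum((x_i - xbar)^2)
xc = x - x.mean()
beta1 = np.sum(y * xc) / np.sum(xc**2)

# beta1 equals the through-origin formula applied to centered x and y,
# because sum((y_i - ybar)(x_i - xbar)) = sum(y_i (x_i - xbar)).
beta1_centered = np.sum((y - y.mean()) * xc) / np.sum(xc**2)

print(beta_no_intercept)         # differs in general
print(beta1, beta1_centered)     # these two agree up to floating point
```

So the with-intercept slope is literally the through-origin estimator after replacing $x_i$ by $x_i-\bar{x}$ (and $y_i$ by $y_i-\bar{y}$), which is exactly the pattern I am asking about. My question is then the following: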
Is there an intuitive reason that explains why this happens? That is, could I have known that the term $\bar{x}$, which appears when there is an intercept $\beta_0$, would vanish, just from knowing that the regression line is forced to pass through the origin?
This is not homework; I already solved the exercise, but I'm trying to gain some intuition about why these things are the way they are. Any help or answer will be highly appreciated.