Suppose the true model is $y_i = \beta_0 + \beta_1 x_i + e_i$, but I instead estimate $y_i = \beta_1 x_i + u_i$ by OLS. That is, I ignore the intercept. Working out the algebra (based on this post), we have
$$\hat{\beta}_1 = \beta_1 + \beta_0 \frac{\sum_{i=1}^n x_i}{\sum_{i=1}^n x_i^2} + \frac{\sum_{i=1}^n x_i e_i}{\sum_{i=1}^n x_i^2}$$
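For reference, this decomposition follows from substituting the true model into the no-intercept OLS estimator (which is also why the error term above is the true model's $e_i$, not the misspecified model's $u_i$):

$$\hat{\beta}_1 = \frac{\sum_{i=1}^n x_i y_i}{\sum_{i=1}^n x_i^2} = \frac{\sum_{i=1}^n x_i (\beta_0 + \beta_1 x_i + e_i)}{\sum_{i=1}^n x_i^2} = \beta_1 + \beta_0 \frac{\sum_{i=1}^n x_i}{\sum_{i=1}^n x_i^2} + \frac{\sum_{i=1}^n x_i e_i}{\sum_{i=1}^n x_i^2}.$$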
I know how to take care of the third term, but can you help me verify whether the following is correct?
I can write
$$ \frac{\sum_{i=1}^n x_i}{\sum_{i=1}^n x_i^2} = \frac{\frac{1}{n}\sum_{i=1}^n x_i}{\frac{1}{n} \sum_{i=1}^n x_i^2} $$
By the weak law of large numbers, the numerator converges in probability to $E[x_i]$ and the denominator to $E[x_i^2]$; the continuous mapping theorem then gives that the ratio converges in probability to $E[x_i]/E[x_i^2]$, provided $0 < E[x_i^2] < \infty$.
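As a numerical sanity check on that limit, here is a minimal simulation sketch; the choice $x_i \sim N(2, 1)$ is an illustrative assumption, giving $E[x_i]/E[x_i^2] = 2/5$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Check that sum(x) / sum(x^2) approaches E[x] / E[x^2] as n grows.
# Illustrative DGP: x ~ N(2, 1), so E[x] = 2 and E[x^2] = Var(x) + E[x]^2 = 5.
for n in [100, 10_000, 1_000_000]:
    x = rng.normal(loc=2.0, scale=1.0, size=n)
    print(n, x.sum() / (x ** 2).sum(), "vs plim", 2.0 / 5.0)
```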
Suppose I demean the regressor, i.e., replace $x_i$ with $x_i - \bar{x}$. Then the demeaned regressor has mean zero (in fact $\sum_{i=1}^n (x_i - \bar{x}) = 0$ holds exactly in sample, not just in the limit), so the second term drops out. Does that mean $\hat{\beta}_1$ is consistent even if I ignore the intercept, as long as I demean $x_i$ but not $y_i$? A simulation sketch for checking this is below.
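Here is a minimal Monte Carlo sketch one could run to check this empirically; all DGP parameters ($\beta_0 = 3$, $\beta_1 = 2$, $x_i \sim N(2, 1)$, $e_i \sim N(0, 1)$) are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)
beta0, beta1 = 3.0, 2.0  # illustrative true parameters

for n in [100, 10_000, 1_000_000]:
    x = rng.normal(loc=2.0, scale=1.0, size=n)
    e = rng.normal(size=n)
    y = beta0 + beta1 * x + e

    # No-intercept OLS on raw x: picks up the beta0 * sum(x) / sum(x^2) term.
    b_raw = (x * y).sum() / (x ** 2).sum()

    # No-intercept OLS with demeaned x, leaving y untouched.
    xd = x - x.mean()
    b_demeaned = (xd * y).sum() / (xd ** 2).sum()

    print(f"n={n:>9}: raw={b_raw:.4f}, demeaned={b_demeaned:.4f}, true beta1={beta1}")
```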