In Weighted Least Squares we assume that $Var(ε_i|x_i)=σ^2 x_i$. We then divide the regression model $y_i=a+b_1x_i+ε_i$ by $\sqrt{x_i}$, so that the "normalized" model has constant variance: $Var(ε_i/\sqrt{x_i}\,|\,x_i)=\frac{1}{x_i}Var(ε_i|x_i)=\frac{1}{x_i}\, σ^2 x_i=σ^2$.
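A minimal numeric sketch of this rescaling (the variable names and the chosen design points are my own, not from the question): simulate errors with $Var(ε_i|x_i)=σ^2 x_i$, divide by $\sqrt{x_i}$, and check that the transformed errors have constant variance $σ^2$.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0
x = np.array([1.0, 4.0, 9.0])   # fixed design points (assumed for illustration)
n = 200_000                     # repeated samples per design point

# eps[j, i] ~ N(0, sigma^2 * x_i): heteroscedastic errors
eps = rng.normal(0.0, sigma * np.sqrt(x), size=(n, len(x)))

# Rescale by 1/sqrt(x_i); Var(aX) = a^2 Var(X) with a = 1/sqrt(x_i)
eps_tilde = eps / np.sqrt(x)

print(eps.var(axis=0))        # roughly sigma^2 * x = [4, 16, 36]
print(eps_tilde.var(axis=0))  # roughly [4, 4, 4] = sigma^2
```

The sample variances of the rescaled errors should all be close to $σ^2 = 4$ regardless of $x_i$.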
Why do we treat $x_i$ as if it were constant here and apply the property $Var(aX)=a^2\,Var(X)$? Is it because of the linear regression assumption that $x_i$ is fixed in repeated sampling, hence non-stochastic, hence constant?
Is it for the same reason that in linear regression $x_i$ is treated as constant, so that there is no covariance between $x_i$ and $ε_i$? \begin{align} Var(y_i)& =Var(a+b_1x_i+ε_i) &\\ & =Var(a)+Var(b_1x_i+ε_i) &\\ & =Var(a)+b_1^2Var(x_i)+Var(ε_i)+2\,Cov(b_1x_i,ε_i) &\\ & =0+0+Var(ε_i)+0 &\\ & =σ^2 \end{align}
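The derivation above can also be checked numerically (a sketch with values I picked for illustration): with $x_i$ held fixed, $a+b_1x_i$ is just a constant shift, and adding a constant does not change variance, so $Var(y_i)=Var(ε_i)=σ^2$.

```python
import numpy as np

rng = np.random.default_rng(1)
sigma, a, b1, xi = 2.0, 1.5, 0.7, 3.0   # assumed example values
eps = rng.normal(0.0, sigma, size=500_000)

# x_i fixed -> a + b1*xi is constant, and there is no Cov(x_i, eps_i) term
y = a + b1 * xi + eps

print(np.var(y), np.var(eps))   # both roughly sigma^2 = 4
```

The two sample variances agree exactly (up to floating point), since shifting a sample by a constant leaves its variance unchanged.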