
In a regression model with random regressors

$$y = a + bx + e$$

can I change the equation to

$$x = (-a/b) + (1/b)y + (-1/b)e$$

and consistently estimate $(1/b)$ with OLS?
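A quick simulation makes the problem concrete. This is a minimal NumPy sketch, not part of the original question; the sample size and the values of $a$, $b$, and the error variance are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000            # large n so sampling noise is negligible
a, b = 1.0, 2.0        # assumed true intercept and slope

x = rng.normal(0.0, 1.0, n)   # random regressor
e = rng.normal(0.0, 1.0, n)   # error, independent of x
y = a + b * x + e

# OLS slope of y on x: cov(x, y) / var(x)  -> consistent for b
slope_yx = np.cov(x, y)[0, 1] / np.var(x, ddof=1)

# OLS slope of x on y: cov(x, y) / var(y)
# plim = b*var(x) / (b^2*var(x) + var(e)), which is NOT 1/b unless var(e) = 0
slope_xy = np.cov(x, y)[0, 1] / np.var(y, ddof=1)

print(f"b   = {b},   slope of y on x = {slope_yx:.3f}")   # ~ 2.0
print(f"1/b = {1/b}, slope of x on y = {slope_xy:.3f}")   # ~ 0.4, not 0.5
```

With these assumed values, the reverse regression converges to $b\,\mathrm{Var}(x)/(b^2\mathrm{Var}(x)+\mathrm{Var}(e)) = 0.4$ rather than $1/b = 0.5$, so the answer to the question is no.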

  • This doesn't work. The two lines from regressing x on y and y on x are *different* unless the data lie exactly on the line. Search on *regression to the mean*. Note that with standardized variables, both slopes will be $\leq 1$; they cannot be reciprocals of each other unless they're exactly $1$ (when the points are collinear). There are a number of posts on site about this already -- e.g. see https://stats.stackexchange.com/questions/22718/what-is-the-difference-between-linear-regression-on-y-with-x-and-x-with-y – Glen_b Jan 09 '20 at 03:21
  • You did very well by including the error $e$ in your statement of the model, because you have shown explicitly what fails in the new equation: the variance of the error $-e/b$ changes according to the unknown slope $1/b.$ Thus you would need to weight the data by the values of $b^2$ -- but you don't know $b.$ It's especially dicey when $b$ is close to $0$ (which is a standard null hypothesis in the original model). – whuber Jan 09 '20 at 20:26
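To illustrate the point in the first comment about standardized slopes, here is a small sketch (same simulated, assumed data-generating setup as above). After standardizing, both fitted slopes equal the sample correlation $r$, so each is at most $1$ in absolute value and they can only be reciprocals when $|r| = 1$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)

# standardize both variables
xs = (x - x.mean()) / x.std(ddof=1)
ys = (y - y.mean()) / y.std(ddof=1)

slope_y_on_x = np.cov(xs, ys)[0, 1] / np.var(xs, ddof=1)   # equals the sample correlation r
slope_x_on_y = np.cov(xs, ys)[0, 1] / np.var(ys, ddof=1)   # also equals r, not 1/r

print(slope_y_on_x, slope_x_on_y, np.corrcoef(x, y)[0, 1])  # all three agree, |value| <= 1
```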

0 Answers