
Apologies if this question is too simplistic, but I really couldn't find an answer for it on Stats or Math Stack Exchange.

Consider simple linear regression (ordinary least squares) for bivariate data.

We can get a simple linear regression by one of two methods:

  1. Considering x as an independent variable and y as dependent. (i.e. Y on X)
  2. Considering y as an independent variable and x as dependent. (i.e. X on Y)

Since the simple linear regression of Y on X can be expressed in the following form,

$y=\alpha_{yx}+\beta_{yx} x$, where $\alpha_{yx}$ and $\beta_{yx}$ minimize $\sum_{i=1}^n (y_i - \alpha_{yx} - \beta_{yx} x_i)^2$,

and similarly for X on Y,

$x=\alpha_{xy}+\beta_{xy}y$, where $\alpha_{xy}$ and $\beta_{xy}$ minimize $\sum_{i=1}^n (x_i - \alpha_{xy} - \beta_{xy} y_i)^2$.
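To make the two fits concrete, here is a minimal NumPy sketch (with made-up illustrative data, not from the question) computing both slopes from the usual closed-form least-squares solutions:

```python
import numpy as np

# Illustrative data (invented for this sketch).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 3.7, 5.2, 5.9])

sxx = np.sum((x - x.mean()) ** 2)
syy = np.sum((y - y.mean()) ** 2)
sxy = np.sum((x - x.mean()) * (y - y.mean()))

# Y on X: minimize vertical residuals.
beta_yx = sxy / sxx
alpha_yx = y.mean() - beta_yx * x.mean()

# X on Y: minimize horizontal residuals.
beta_xy = sxy / syy
alpha_xy = x.mean() - beta_xy * y.mean()

# The product of the two slopes is r^2, so they are reciprocals
# only when the correlation is exactly +/-1.
print(beta_yx * beta_xy, np.corrcoef(x, y)[0, 1] ** 2)
```

Note that the two minimization problems penalize different residuals, which is why the two fitted lines differ in general.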

I interpreted the second case, with y as the independent variable, as being the same as plotting the values of y on the horizontal axis and x on the vertical axis. (If this is wrong, then all that follows is wrong, so please correct me.)

Would it then be correct to say that the simple linear regression line of Y on X can be transformed into that of X on Y by taking the direction vector of the line of best fit and first rotating it 90° counterclockwise, then reflecting it across the vertical axis?

I have tried to illustrate this with the following diagram: [image]

This process can also be achieved by simply multiplying each coordinate vector by $ \begin{bmatrix} 0 & 1\\ 1 & 0 \end{bmatrix}$, since the reflection applied after the 90° counterclockwise rotation composes to exactly this axis-swap matrix.
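The composition claim, and the effect of the swap on a regression fit, can be checked numerically. This is a sketch using NumPy and invented data (the points here are placeholders, not from the question); it verifies that reflect-after-rotate equals the swap matrix, and that regressing the swapped "y" on the swapped "x" reproduces the X-on-Y slope of the original data:

```python
import numpy as np

P = np.array([[0.0, 1.0],
              [1.0, 0.0]])  # swaps the two coordinates

# Rotate 90 degrees counterclockwise, then reflect across the vertical axis.
rot90 = np.array([[0.0, -1.0],
                  [1.0, 0.0]])
flip_vertical = np.array([[-1.0, 0.0],
                          [0.0, 1.0]])
assert np.array_equal(flip_vertical @ rot90, P)

# Swapping the coordinates of every data point and then regressing the
# new vertical coordinate on the new horizontal one is, by construction,
# the same computation as regressing x on y in the original data.
pts = np.array([[1.0, 2.1], [2.0, 2.9], [3.0, 3.7], [4.0, 5.2], [5.0, 5.9]])
swapped = pts @ P.T  # columns exchanged
beta_after_swap = np.polyfit(swapped[:, 0], swapped[:, 1], 1)[0]
beta_xy = np.polyfit(pts[:, 1], pts[:, 0], 1)[0]
assert np.isclose(beta_after_swap, beta_xy)
```

Whether the swapped Y-on-X *line* itself coincides with the X-on-Y regression line is a separate question, which is exactly what I am asking below.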

Is this way of thinking correct, or have I misunderstood something about simple linear regression? I wasn't able to explain this doubt clearly to my professor, and it's been nagging me ever since.

Would greatly appreciate any and all input.

Bhoris Dhanjal
  • I found the duplicate by searching [regression x y](https://stats.stackexchange.com/search?q=regression+x+y), where you can find additional threads on this subject. – whuber Mar 03 '21 at 16:53
  • Yes, I should probably have mentioned that I already read this post https://stats.stackexchange.com/questions/22718/what-is-the-difference-between-linear-regression-on-y-with-x-and-x-with-y, as it was the first result on Google. But I don't feel it adequately answered the question as I have asked it, in terms of a vector transformation. – Bhoris Dhanjal Mar 03 '21 at 16:56
  • It's hard to find anything to explain. On [math.se] they will happily--and quickly--tell you that switching $x$ and $y$ axes is a linear transformation. – whuber Mar 03 '21 at 16:59
  • Well yes, I know it's a linear transformation. Maybe I didn't present it clearly. My question was: provided we perform this linear transformation on the line obtained by regressing y on x, is the result then mathematically equivalent to the simple linear regression of x on y? – Bhoris Dhanjal Mar 03 '21 at 17:02
  • That's what the duplicate shows! (There are, of course, a few exceptions related to regressions with zero slopes.) – whuber Mar 03 '21 at 17:04
  • I have read the linked post once again. I know that the simple linear regressions calculated in the two cases are different. The accepted answer states that the slopes are identical for the standardized variables, and the comment on the accepted answer mentions that the slopes for the unstandardized variables are reciprocals of each other. Yes, this is similar to the question I have asked. But the reason I made this post was that I was looking for a more concrete proof of the statement in terms of linear transformations (for the unstandardized variables). – Bhoris Dhanjal Mar 03 '21 at 17:20

0 Answers