
What's the correct term for regression where you first regress on one input variable (feature), take the errors, regress on the next feature, etc.? In what specific cases is this useful? Are there any references for this technique?

More specifically, I am talking about regressing $y$ on $x_1$, then regressing $(y-b_1\times x_1)$ on $x_2$, etc. This is because each $x$ is a time series, and needs to be sampled differently, so I cannot take them all as inputs to the regression at the same time.
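The procedure described above can be sketched in a few lines of NumPy. This is a minimal illustration with simulated data (the variable names and the no-intercept simplification are my own, not from the question): note that if $x_1$ and $x_2$ are correlated, the stage-1 coefficient $b_1$ will differ from the joint OLS estimate.

```python
import numpy as np

# Simulated data (illustrative only): y = 2*x1 + 3*x2 + noise
rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 2.0 * x1 + 3.0 * x2 + rng.normal(scale=0.1, size=n)

# Stage 1: regress y on x1 alone (no intercept, for simplicity)
b1 = (x1 @ y) / (x1 @ x1)

# Stage 2: regress the stage-1 residuals (y - b1*x1) on x2
resid = y - b1 * x1
b2 = (x2 @ resid) / (x2 @ x2)
```

Because the simulated regressors here are independent, `b1` and `b2` land near the true coefficients; with correlated regressors the two-stage estimates would not match a joint fit.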

Alexis
Baron Yugovich
  • I'm not sure if there is such a thing exactly, but this sounds a bit like [partial least squares regression](https://en.wikipedia.org/wiki/Partial_least_squares_regression) & a bit like the way [added variable plots](http://stats.stackexchange.com/q/125561/7290) are constructed. – gung - Reinstate Monica Dec 05 '15 at 04:22

2 Answers


Taking some liberties, you are probably talking about instrumental variable (IV) techniques. They are used to help argue causality and to control for endogenous variables, and are a common part of econometrics.

Alternatively, you may be thinking of the Frisch–Waugh–Lovell (FWL) theorem.
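As a quick numerical illustration of FWL (my own sketch, with made-up data and no intercept): partialling $x_1$ out of *both* $y$ and $x_2$, then regressing residuals on residuals, recovers exactly the multiple-regression coefficient on $x_2$. Note this differs from the question's procedure, which residualizes only $y$.

```python
import numpy as np

# Simulated data with deliberately correlated regressors
rng = np.random.default_rng(1)
n = 300
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)
y = 1.0 * x1 + 2.0 * x2 + rng.normal(scale=0.1, size=n)
X = np.column_stack([x1, x2])

# Full multiple regression (no intercept for brevity)
beta = np.linalg.lstsq(X, y, rcond=None)[0]

# FWL: residualize BOTH y and x2 on x1, then run a simple regression
b_y = (x1 @ y) / (x1 @ x1)
b_x2 = (x1 @ x2) / (x1 @ x1)
ry = y - b_y * x1
rx2 = x2 - b_x2 * x1
b2_fwl = (rx2 @ ry) / (rx2 @ rx2)

# b2_fwl equals beta[1] up to floating-point error
```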

RegressForward
    No. I am talking about regressing y on x1, then regressing (y-b1*x1) on x2, etc. This is because each x is a time series, and needs to be sampled differently, so I cannot take them all as inputs to the regression at the same time. – Baron Yugovich Dec 07 '15 at 02:09

Are you talking about 'stepwise regression'? Note, however, that forward stepwise regression works differently from what you describe: we regress the response on one variable, then regress the response on the newly selected variable *together with* the previously selected one, i.e., on two variables jointly; next on three variables, and so on. The coefficients are refit at every step rather than carried over from earlier stages.
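To make the contrast concrete, here is a minimal forward-selection sketch (my own illustrative code, not a production implementation): at each step the response is refit on *all* selected variables, and the candidate that most reduces the residual sum of squares is added.

```python
import numpy as np

# Simulated data: only columns 0 and 2 actually enter the response
rng = np.random.default_rng(2)
n = 200
X = rng.normal(size=(n, 4))
y = 3.0 * X[:, 0] + 2.0 * X[:, 2] + rng.normal(scale=0.1, size=n)

selected = []
remaining = list(range(X.shape[1]))
for _ in range(2):  # add two variables
    best, best_rss = None, np.inf
    for j in remaining:
        cols = selected + [j]
        # Refit on ALL selected columns plus the candidate
        beta = np.linalg.lstsq(X[:, cols], y, rcond=None)[0]
        r = y - X[:, cols] @ beta
        rss = r @ r
        if rss < best_rss:
            best, best_rss = j, rss
    selected.append(best)
    remaining.remove(best)

# With this simulated data, forward selection picks columns 0 and 2
```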

user4704857