
I have a model that depends linearly on $v$ and $\alpha$, but non-linearly on two other parameters $T_0$ and $T_1$: $f(i; v, \alpha, T_0, T_1)$.

Using least squares, I can solve for $v$ and $\alpha$, getting a solution that depends on $T_0$ and $T_1$. Next, I am thinking of using the sum of squared residuals as the objective function in an optimisation problem over $T_0$ and $T_1$.

Because the residuals have different variances, I am wondering whether my objective function should be the sum of squares of the raw residuals, or the sum of squares of their studentised versions.
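To make the two candidate objectives concrete, here is a minimal sketch (my own naming; `sigma` stands for the assumed per-observation standard deviations, which would have to be known or estimated):

```python
import numpy as np

def ssr(residuals, sigma=None):
    """Sum of squared residuals; if per-point sigma is given,
    use the studentised (variance-weighted) residuals instead."""
    r = residuals if sigma is None else residuals / sigma
    return float(r @ r)
```

With `sigma=None` this is the plain objective; passing `sigma` turns it into the weighted one.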

What would be best?

Context: the problem I am facing is more complex; a simplified version of it is that I am trying to fit a line $\alpha \, t + v$ to certain data, but only within a time bracket $[T_0, T_1]$; the measurements are assumed to be stable outside that period. The unknown time bracket is the non-linear dependency. If the above approach is not reasonable, any comments would be appreciated.
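In case it clarifies what I mean by "solving the linear part first": here is a rough sketch of the simplified model, assuming the fit is linear inside $[T_0, T_1]$ and flat (continuous) outside, so that for a fixed bracket the model is linear in $(v, \alpha)$ and the bracket can be searched over afterwards. The function names and the crude grid search are just for illustration:

```python
import numpy as np

def profiled_ssr(T0, T1, t, y):
    """SSR after solving the linear part (v, alpha) exactly for a fixed bracket.

    Assumed simplified model: y ~ v + alpha * clip(t, T0, T1), i.e. a line
    inside [T0, T1] and constant (continuous) outside it.
    """
    X = np.column_stack([np.ones_like(t), np.clip(t, T0, T1)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ coef
    return float(r @ r), coef

def fit_bracket(t, y, grid):
    """Crude outer search over the non-linear parameters (T0, T1)."""
    pairs = [(a, b) for a in grid for b in grid if a < b]
    return min(pairs, key=lambda p: profiled_ssr(p[0], p[1], t, y)[0])
```

This is the usual "profiling out" (variable projection) idea: the inner least-squares problem is solved exactly for each candidate bracket, and only $(T_0, T_1)$ are optimised in the outer loop.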

Another way to put it: this is piecewise linear regression (!)

Update

I have found a number of posts here about piecewise regression with free knots, which is my case. For example this.

My situation is more complicated: the final purpose is to fit a piecewise linear function to the given data, where the number of knots is unknown. I am thinking of using a multi-scale regression algorithm, and I hope I can reduce the problem to that of splitting a segment in two (with an arbitrary knot) to obtain a better fit, thus turning it into a (local) model-selection problem. If anyone can give me pointers to literature, it would be much appreciated.
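The local split step I have in mind would look roughly like this (a sketch under my own assumptions: known noise variance, a BIC-style penalty, and a rough parameter count of 2 for one line versus 5 for two lines plus the knot; the exact count is debatable):

```python
import numpy as np

def line_ssr(t, y):
    """SSR of the best-fitting single line on (t, y)."""
    X = np.column_stack([np.ones_like(t), t])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ coef
    return float(r @ r)

def best_split(t, y, min_pts=3):
    """Best single interior knot: fit one line per side, minimise total SSR."""
    best_k, best_ssr = None, np.inf
    for k in range(min_pts, len(t) - min_pts):
        s = line_ssr(t[:k], y[:k]) + line_ssr(t[k:], y[k:])
        if s < best_ssr:
            best_k, best_ssr = k, s
    return best_k, best_ssr

def split_improves(t, y, sigma2=1.0):
    """Local model selection: accept the split only if it lowers a BIC-like score."""
    n = len(t)
    bic0 = line_ssr(t, y) / sigma2 + 2 * np.log(n)   # single line: 2 params
    k, ssr1 = best_split(t, y)
    bic1 = ssr1 / sigma2 + 5 * np.log(n)             # two lines + knot: ~5 params
    return bic1 < bic0, k
```

The idea would be to apply this test recursively to each accepted segment, stopping when no split improves the score.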

carlosayam