
I am trying to find an efficient algorithm that calculates the slope of a time series at each point over a sliding time window.

For example, the time series has 1 million points and the window length is 10,000.

I noticed that applying the least-squares method from scratch at each point is very inefficient, since every point requires recomputing SumXY, SumX, SumY, and SumX^2 over the whole window.

I tried to optimize it: instead of re-summing x, y, xy, and x^2 over 10,000 points, I subtract the outgoing point and add the incoming one at each step, but it is still really slow.

I would like to know whether an algorithm solving this problem has already been published, for example on https://www.jstor.org/. I did look, but there are thousands of papers on linear regression; as I got lazy, I thought it might be easier to ask the experts directly what the algorithm might be called.

  • There is an algorithm: it's called Loess (or Lowess). In your case, because you appear to weight all points in the window equally, a significant speedup can be obtained using standard updating methods, such as described at https://stats.stackexchange.com/questions/6920. I believe that algorithms for "geographically weighted regression," or GWR, are substantially the same as these. – whuber May 20 '20 at 12:52

0 Answers