I am trying to find an efficient algorithm to calculate the slope of a time series at each point over a sliding time window.
For example, the time series has 1 million points and the window length is 10,000.
I noticed that applying the full least-squares method at each point is very inefficient, since for every point it has to recompute SumXY, SumX, SumY, and SumX^2 over the whole window.
I tried to optimize this: instead of summing x, y, xy, and x^2 over 10,000 points at every step, I subtract the value leaving the window and add the value entering it, but it is still really slow.
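In case it is useful, here is a minimal sketch (in Python/NumPy, since no language was mentioned) of the rolling-sum idea I have in mind: if x is taken as the index 0..n-1 inside each window, SumX and SumX^2 become constants, and only SumY and SumXY need an O(1) update per step. The function name `rolling_slope` is just for illustration.

    import numpy as np

    def rolling_slope(y, window):
        """Least-squares slope of y over each sliding window, O(1) per step.

        x is taken as the sample index 0..window-1 inside each window, so
        Sx and Sxx are constants; only Sy and Sxy change as the window
        slides by one point.
        """
        y = np.asarray(y, dtype=float)
        n = window
        m = len(y) - n + 1              # number of windows
        if m <= 0:
            return np.empty(0)

        # Constant sums over x = 0, 1, ..., n-1
        sx = n * (n - 1) / 2.0
        sxx = (n - 1) * n * (2 * n - 1) / 6.0
        denom = n * sxx - sx * sx

        # Sums for the first window
        sy = y[:n].sum()
        sxy = float(np.dot(np.arange(n), y[:n]))

        slopes = np.empty(m)
        slopes[0] = (n * sxy - sx * sy) / denom

        for i in range(1, m):
            y_old = y[i - 1]            # point leaving the window
            y_new = y[i + n - 1]        # point entering the window
            sy += y_new - y_old
            # Sliding shifts every x by -1, hence the "- sy" term:
            # Sxy_new = Sxy_old + n*y_new - Sy_new
            sxy += n * y_new - sy
            slopes[i] = (n * sxy - sx * sy) / denom

        return slopes

Two caveats: over a million incremental updates, floating-point drift can accumulate, so the sums may need to be recomputed from scratch every so often; and if the per-step loop itself is the bottleneck in an interpreted language, the same window sums can instead be obtained all at once from cumulative sums (e.g. np.cumsum) and combined with the same slope formula.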
I would like to know whether an algorithm that solves this problem has already been published, for example on https://www.jstor.org/. I did look into it, but there are thousands of papers on linear regression, so rather than wade through them all I thought it would be easier to ask the experts directly what this algorithm might be called.