
I have a set of data, and for every new point $x$ I need to know how probable that point is given the past data.

I can assume the data follows a linear model.

I know I can fit a simple linear regression and use $ \sigma^2 = \frac{1}{N-2}\sum{e_i^2} $ to estimate the variance of the error term, then use that normal distribution to find the probability of my new point.
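A minimal sketch of that plug-in approach, assuming each data point is an $(x, y)$ pair and that "probability of my new point" means the density of the new $y$ at a given $x$, treating the fitted slope, intercept, and $\sigma^2$ as exact:

```python
import numpy as np
from scipy.stats import norm

def plugin_density(x_data, y_data, x_new, y_new):
    n = len(x_data)
    # Ordinary least-squares fit: y = b0 + b1 * x
    b1, b0 = np.polyfit(x_data, y_data, deg=1)
    residuals = y_data - (b0 + b1 * x_data)
    # sigma^2 = RSS / (N - 2), as in the formula above
    sigma2 = np.sum(residuals**2) / (n - 2)
    # Density of the new point under N(b0 + b1 * x_new, sigma^2)
    return norm.pdf(y_new, loc=b0 + b1 * x_new, scale=np.sqrt(sigma2))
```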

Now here comes the difficult part: what if I cannot treat the fitted model as exact and need to incorporate the uncertainty in the slope and intercept?
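One standard frequentist answer (the same result behind the prediction-interval question linked in the comment below): with Gaussian errors, the predictive distribution of $y$ at $x_{\text{new}}$ is a Student-$t$ with $N-2$ degrees of freedom and scale $\hat\sigma\sqrt{1 + \frac{1}{n} + \frac{(x_{\text{new}} - \bar x)^2}{S_{xx}}}$, which widens the plug-in normal to account for the estimated slope and intercept. A sketch under the same assumptions as above:

```python
import numpy as np
from scipy.stats import t

def predictive_density(x_data, y_data, x_new, y_new):
    n = len(x_data)
    b1, b0 = np.polyfit(x_data, y_data, deg=1)
    residuals = y_data - (b0 + b1 * x_data)
    sigma2 = np.sum(residuals**2) / (n - 2)
    x_bar = np.mean(x_data)
    s_xx = np.sum((x_data - x_bar)**2)
    # Predictive standard error inflates sigma^2 by the parameter
    # uncertainty: 1/n for the intercept, (x_new - x_bar)^2 / Sxx
    # for the slope
    se_pred = np.sqrt(sigma2 * (1 + 1/n + (x_new - x_bar)**2 / s_xx))
    # Student-t density with n-2 dof, centred on the fitted line
    return t.pdf(y_new, df=n - 2, loc=b0 + b1 * x_new, scale=se_pred)
```

Note that far from $\bar x$ the $t$ predictive density is noticeably wider than the plug-in normal, which is exactly the effect of the slope uncertainty.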

Hmm, thanks to the "related" questions I found this possible solution: http://stats.stackexchange.com/questions/33433/linear-regression-prediction-interval – tzeH Nov 23 '12 at 13:34
