I have a set of data, and for every new point $x$ I need to know how probable that point is, given the past data.
I can assume the data follows a linear model.
I know I can fit a simple linear regression and use $ \sigma^2 = \frac{1}{N-2}\sum{e_i^2} $ to estimate the variance of the error term, then use that normal distribution to evaluate the probability of my new value.
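To make that concrete, here is a minimal sketch of what I mean (the data and numbers are made up for illustration):

```python
import numpy as np

# Toy data assumed to follow a linear model: y = a + b*x + noise
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 + 0.5 * x + rng.normal(0, 1.0, size=x.size)

# Ordinary least-squares fit
b, a = np.polyfit(x, y, 1)          # slope, intercept
resid = y - (a + b * x)
N = x.size
sigma2 = resid @ resid / (N - 2)    # residual variance, sigma^2 = sum(e_i^2)/(N-2)

# Density of a new observation under N(a + b*x_new, sigma2),
# treating the fitted slope and intercept as if they were exact
x_new, y_new = 5.0, 4.6
mu = a + b * x_new
p = np.exp(-(y_new - mu) ** 2 / (2 * sigma2)) / np.sqrt(2 * np.pi * sigma2)
```

This treats $a$, $b$, and $\sigma^2$ as known, which is exactly the assumption I am worried about.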
Now here comes the difficult part: what if I cannot assume the fitted model is exact and need to incorporate the uncertainty in the slope and intercept?
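From what I have read, one classical route is the frequentist predictive distribution, which inflates the residual variance by the parameter-uncertainty terms and replaces the normal with a Student-$t$. A sketch of that, under the usual normal-error assumptions (data and numbers again made up):

```python
import numpy as np
from math import gamma, sqrt, pi

# Toy data, same linear setup as before
rng = np.random.default_rng(1)
x = np.linspace(0, 10, 50)
y = 2.0 + 0.5 * x + rng.normal(0, 1.0, size=x.size)

N = x.size
b, a = np.polyfit(x, y, 1)              # slope, intercept
resid = y - (a + b * x)
sigma2 = resid @ resid / (N - 2)        # residual variance estimate

x_new, y_new = 5.0, 4.6
xbar = x.mean()
Sxx = ((x - xbar) ** 2).sum()

# Predictive variance = noise variance inflated by parameter uncertainty:
# the 1/N term reflects intercept uncertainty, and the (x_new - xbar)^2/Sxx
# term grows as x_new moves away from the bulk of the data (slope uncertainty).
pred_var = sigma2 * (1 + 1 / N + (x_new - xbar) ** 2 / Sxx)

# The standardized prediction error is Student-t with N - 2 degrees of
# freedom, so evaluate a scaled-t density at the new point.
nu = N - 2
t = (y_new - (a + b * x_new)) / sqrt(pred_var)
dens = (gamma((nu + 1) / 2) / (gamma(nu / 2) * sqrt(nu * pi))
        * (1 + t * t / nu) ** (-(nu + 1) / 2) / sqrt(pred_var))
```

Is this the right way to think about it, or does a Bayesian posterior predictive (with priors on slope, intercept, and noise variance) handle this more cleanly?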