Let's say I have a Gaussian Process model $M$ fitted to some training data. Now I get a stream of sample data arriving in batches of a fixed size.
The GP does not model a time series; rather, it regresses the value at certain locations $x$ that will be visited multiple times.
I know that at some point there will be an abrupt change in the distribution the data batches are generated from (at least at certain locations $x$).
Now I'm looking for a statistically sound way of detecting this change; that is, I want to find the point at which the GP no longer models the data well enough.
I thought I would base this on detecting "big" changes in the likelihood function $L(\theta \mid x)$. However, I'm not sure how to interpret "big" here, as likelihood values are only meaningful in comparison to one another, not on their own.
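For concreteness, here is a minimal sketch of the kind of monitoring I have in mind (the kernel, data, and threshold are placeholders, and I'm using scikit-learn's `GaussianProcessRegressor` purely for illustration): each incoming batch is scored by its average log density under the GP's posterior predictive, and a change would presumably show up as a persistent drop in that score relative to pre-change batches.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Placeholder training data and kernel; in my setting these come from
# the original training phase of the model M.
rng = np.random.default_rng(0)
X_train = rng.uniform(0, 10, size=(50, 1))
y_train = np.sin(X_train).ravel() + 0.1 * rng.standard_normal(50)

kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01)
gp = GaussianProcessRegressor(kernel=kernel).fit(X_train, y_train)

def batch_log_density(gp, X_batch, y_batch):
    """Average log density of a batch under the GP posterior predictive."""
    mu, sigma = gp.predict(X_batch, return_std=True)
    return norm.logpdf(y_batch, loc=mu, scale=sigma).mean()

# Score a new batch against the level seen on the training data.
# How far below the baseline counts as "big" is exactly my question.
baseline = batch_log_density(gp, X_train, y_train)
X_batch = rng.uniform(0, 10, size=(20, 1))
y_batch = np.sin(X_batch).ravel() + 0.1 * rng.standard_normal(20)
score = batch_log_density(gp, X_batch, y_batch)
print(f"baseline: {baseline:.2f}, batch: {score:.2f}")
```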
Note: I'm asking this because some people said my original question was too abstract. I didn't want to change the whole question, though, as it already had some answers.