I have a random variable estimated over time by an online algorithm: at every step t I have the mean and variance of a Gaussian estimate. I expect the time series to have sudden shifts. What is the best way to test whether the estimated variable has shifted beyond a known threshold, using only the estimates at t and t + 1?
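For concreteness, the naive approach I have in mind is this: if the estimates at t and t + 1 are treated as independent Gaussians, their difference is Gaussian with mean mu_{t+1} - mu_t and variance var_t + var_{t+1}, and I can compute the probability that the true shift exceeds the threshold. A minimal sketch (the independence assumption is mine, and is probably optimistic since consecutive online estimates are usually correlated; the function name is just for illustration):

```python
import math

def shift_probability(mu_t, var_t, mu_t1, var_t1, threshold):
    """Probability that the true shift |X_{t+1} - X_t| exceeds `threshold`,
    assuming the two estimates are independent Gaussians."""
    mu_d = mu_t1 - mu_t                  # mean of the difference
    sd_d = math.sqrt(var_t + var_t1)     # std dev of the difference
    # Standard normal CDF via erf
    phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    # P(|D| > tau) = 1 - P(-tau <= D <= tau)
    return 1.0 - (phi((threshold - mu_d) / sd_d)
                  - phi((-threshold - mu_d) / sd_d))
```

Is thresholding this probability a reasonable test, or is there a more principled approach (e.g. a proper change-point method) given that I only look at two consecutive estimates?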
Thank you.