I have a large number of time series consisting of pricing data for consumer goods. As expected, the prices show trend and seasonality. However, my main problem is detecting large level changes in the price series. In these data sets the level changes have occurred mainly because the price was recorded for the wrong packaging size. For example, instead of the price of a single can of beer, the price of a six-pack was recorded.
For any given series, most prices are probably correct, but a run of consecutive observations may be wrong. The errors are most likely at the beginning of the series, though not necessarily.
This is easy to pick up when the time series is plotted and can be modeled with a dummy variable.
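To make this concrete, here is a small simulated example (made-up numbers, not my real data) of the kind of series I mean: a gently trending, seasonal monthly price where the first 20 observations were recorded for a six-pack instead of a single can.

```r
set.seed(1)

n      <- 120                                   # ten years of monthly prices
trend  <- 0.02 * seq_len(n)                     # slow upward drift
season <- 0.3 * sin(2 * pi * seq_len(n) / 12)   # annual seasonality
noise  <- rnorm(n, sd = 0.05)

y <- 2 + trend + season + noise                 # "true" single-can price
y[1:20] <- y[1:20] * 6                          # six-pack price recorded by mistake

plot(ts(y, frequency = 12), ylab = "recorded price")
```

Plotted, the level shift at observation 20 is obvious to the eye; that is the kind of jump I want to find programmatically.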
However, I would like to automatically detect the location and magnitude of the level change. How can I go about this? Unfortunately my stats education ended at the undergrad level, so I'm not sure where to begin.
The trend and seasonality in my time series are not my main concern. Do I need to worry about autocorrelation etc., or should I only be worried about the level changes?
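To show the sort of thing I'm imagining, here is a minimal sketch using the changepoint package in R, applied to the simulated series above. I only just came across this package, so the choice of cpt.mean, the PELT method, and the MBIC penalty are guesses on my part rather than a recommendation:

```r
library(changepoint)

# Look for an unknown number of shifts in the mean; PELT searches over
# segmentations efficiently and MBIC penalises spurious changepoints.
fit <- cpt.mean(y, method = "PELT", penalty = "MBIC")

cpts(fit)             # estimated changepoint locations
param.est(fit)$mean   # per-segment means, i.e. the size of each level
```

My worry is that a pure mean-shift test like this could be fooled by the trend and seasonality, which is partly why I'm asking whether I need to deal with those (or autocorrelation) before looking for level changes.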
Although I have access to R, ultimately the algorithm may have to be implemented in Java.
I'm not sure whether this is an appropriate place for the question, but I hope someone can help me!