I have 522 observations of a time series that shows autocorrelation out to about 150 lags. How do I test whether the mean is significantly different from zero?
My idea was to fit an intercept-only regression model and then use the Newey-West adjusted variance to calculate a t-statistic. However, I don't know what critical value to use for, say, 95% confidence; it doesn't seem right to use the standard normal cutoffs.
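Concretely, this is the kind of thing I had in mind (a minimal sketch using statsmodels; the data below is a placeholder and maxlags=150 is just my guess based on the observed autocorrelation horizon):

```python
import numpy as np
import statsmodels.api as sm

# Placeholder series; replace with the actual 522 rolling compound returns
rng = np.random.default_rng(0)
y = rng.normal(size=522)

# Intercept-only regression: the single coefficient is just the sample mean
X = np.ones_like(y)
model = sm.OLS(y, X)

# Newey-West (HAC) covariance, with maxlags set to roughly cover
# the observed autocorrelation (~150 lags in my case)
res = model.fit(cov_type="HAC", cov_kwds={"maxlags": 150})

print(res.params[0])    # sample mean
print(res.bse[0])       # Newey-West standard error
print(res.tvalues[0])   # t-statistic for H0: mean = 0
print(res.pvalues[0])   # p-value based on the normal approximation
```

The p-value printed at the end uses the standard normal distribution, which is exactly the part I'm unsure about.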
The series is a rolling compound return built from weekly financial returns. For instance, the first entry is the compound return over weeks 1 through 252, the second entry covers weeks 2 through 253, and so on, until there are 522 such entries. I am looking to test whether the mean of this series is zero or not.
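For concreteness, the construction is roughly the following (a sketch with placeholder weekly returns; the overlap of 251 weeks between consecutive windows is what produces the long-range autocorrelation):

```python
import numpy as np
import pandas as pd

# Placeholder weekly simple returns; 522 windows of length 252
# implies 522 + 251 = 773 underlying weeks
rng = np.random.default_rng(0)
weekly_returns = pd.Series(rng.normal(0.001, 0.02, size=773))

# Compound return over each 252-week window
rolling_compound = ((1 + weekly_returns)
                    .rolling(window=252)
                    .apply(np.prod, raw=True) - 1).dropna()

print(len(rolling_compound))  # 522
```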