Is a mean-zero time series simply obtained by subtracting the average value of the series?
Say I have the data $Y = 1, 1, 2, 2, 3, 3$. Would a mean-zero series simply be $$y=(1-2),(1-2),(2-2),(2-2),(3-2),(3-2)$$
since the mean of $Y$ is 2?
This operation is called de-meaning, or more informally, making the series/signal zero-mean. The de-meaned version of $Y$ is indeed $y=\{-1,-1,0,0,1,1\}$, as you wrote. But this operation shouldn't be confused with the definition of a zero-mean time series: a random process/signal is zero-mean when $E[X(t)]=0$ for all $t$.
You can also make the signal average to zero by subtracting different values from each entry. For example, $y=\{1,1,2,2,3,-9\}$ also has a sample mean of zero; I just subtracted the sum of your series from the last element.
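For concreteness, here is a minimal NumPy sketch of both constructions (the array names are just illustrative):

```python
import numpy as np

# Original series from the question.
Y = np.array([1, 1, 2, 2, 3, 3], dtype=float)

# De-meaning: subtract the sample mean from every observation.
y_demeaned = Y - Y.mean()
print(y_demeaned)         # [-1. -1.  0.  0.  1.  1.]
print(y_demeaned.mean())  # 0.0

# Alternative way to get a sample mean of zero: subtract the
# total sum of Y from the last element only, as described above.
y_alt = Y.copy()
y_alt[-1] -= Y.sum()
print(y_alt)              # [ 1.  1.  2.  2.  3. -9.]
print(y_alt.mean())       # 0.0
```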
This is not true in general. While the empirical mean of the series is 2, the mean (first moment) of the underlying process isn't necessarily equal to 2. The number 2 is only a reasonable estimate of the process mean if the process is stationary and its mean exists and is finite. By subtracting 2 from the observations you would not necessarily get a mean-zero series.
You would get a zero-mean series if and only if $$\mathbb{E}[X_t] = 2 \quad \text{for all } t.$$
Note that subtracting an estimate of the mean from the observations is called demeaning. But since the series might not be stationary, the most reasonable estimate of the mean might be time-varying (since $\mathbb{E}[X_t]=\mu_t$, where the subscript $t$ indicates time dependence). Finally, if the condition $\mathbb{E}[|X_t|] < \infty$ does not hold (e.g. for a Cauchy-distributed time series), subtracting any set of finite numbers from the process will not make it zero-mean.
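To illustrate the time-varying case, here is a rough sketch under assumed conditions: a hypothetical process whose mean follows a linear trend, where $\mu_t$ is estimated by an ordinary least-squares line rather than by a single sample mean. The simulated data and parameter values are made up purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical process with a time-varying mean mu_t = 0.5 * t,
# plus stationary zero-mean noise.
t = np.arange(100)
x = 0.5 * t + rng.normal(scale=1.0, size=t.size)

# Naive demeaning with a single sample mean leaves the trend in place.
x_naive = x - x.mean()

# Estimate the time-varying mean mu_t with a linear least-squares fit
# and subtract that estimate instead.
slope, intercept = np.polyfit(t, x, deg=1)
mu_hat = intercept + slope * t
x_demeaned = x - mu_hat

print(x_naive.mean(), x_demeaned.mean())  # both roughly 0
print(np.corrcoef(t, x_naive)[0, 1])      # still strongly correlated with time
print(np.corrcoef(t, x_demeaned)[0, 1])   # approximately zero
```

Both versions have a sample mean of about zero, but only the series demeaned with the time-varying estimate is free of the trend, which is the point of using $\mu_t$ instead of a single constant.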