Today I was introduced to the definition of stationarity: the marginal distribution of a process does not change over time, so in particular its mean and variance remain constant. I questioned whether a process consisting of a perfect sine wave with an unknown starting point and unknown period would be stationary, since it is purely deterministic, yet its mean seems to change with time.
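To make my confusion concrete, here is a quick simulation I tried (just a sketch; the period is fixed at 10 and the unknown starting point is modeled as a uniformly random phase, purely for illustration). It looks at the mean across many independent realizations at each fixed time point:

```python
import numpy as np

rng = np.random.default_rng(0)

n_paths = 100_000        # number of independent realizations
t = np.arange(50)        # time points
period = 10.0            # fixed period, chosen only for illustration

# Each realization is sin(2*pi*t/period + phi), phi ~ Uniform[0, 2*pi),
# i.e. the "unknown starting point" is a random phase.
phi = rng.uniform(0.0, 2.0 * np.pi, size=(n_paths, 1))
x = np.sin(2.0 * np.pi * t / period + phi)

# Ensemble mean at each time point: close to 0 for every t
print(np.abs(x.mean(axis=0)).max())
```

The ensemble mean comes out near zero at every time point, even though any single realization oscillates, which is exactly the distinction I can't square with the answers below.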
When I asked my professor after class, he said that such a process would indeed be stationary, despite the mean changing. However, in this answer I found on CV, the changing mean is cited as a reason why such a process is non-stationary. Now I'm a bit confused about the definition of a stationary process, and still unsure about the stationarity of a sine wave.
Am I missing some point here? Or are there differing definitions of a stationary process?