Why is time series analysis not considered a machine learning algorithm (unlike linear regression)?
Both regression and time series analysis are forecasting methods. So why is one of them considered a learning algorithm but not the other?
As dsaxton notes, "time series analysis" is neither an algorithm nor a forecasting method. It's a field of study. In addition, much of time series analysis is not even concerned with forecasting, but only with understanding the past dynamics of a time series (e.g., change point detection).
Specific time series analysis techniques suitable for forecasting, like ARIMA models or Exponential Smoothing, could certainly be called "learning algorithms" and be considered part of machine learning (ML), just as regression is. They simply rarely are.
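To see why "learning algorithm" is a fair label, note that these methods estimate their parameters from data. Here is a minimal, illustrative sketch (not any library's actual fitter) of Simple Exponential Smoothing that "learns" its smoothing parameter alpha by grid-searching for the lowest in-sample one-step-ahead squared error; the function name, series values, and grid are my own assumptions for the demo:

```python
# Illustrative sketch: Simple Exponential Smoothing "learns" alpha from
# data by minimizing one-step-ahead squared forecast error.
# The grid search below is a toy fitter, not a library implementation.

def ses_sse(series, alpha):
    """One-step-ahead sum of squared errors for smoothing parameter alpha."""
    level = series[0]                      # initialize level at the first value
    sse = 0.0
    for obs in series[1:]:
        sse += (obs - level) ** 2          # the forecast for obs is the current level
        level = alpha * obs + (1 - alpha) * level
    return sse

# Hypothetical toy series, invented for illustration.
series = [3.0, 5.0, 4.0, 6.0, 7.0, 6.5, 8.0, 7.5, 9.0, 8.5]

# Grid-search alpha in (0, 1) for the lowest in-sample one-step error.
best_alpha = min((a / 100 for a in range(1, 100)),
                 key=lambda a: ses_sse(series, a))
print(best_alpha)
```

Fitting a parameter by minimizing a loss on training data is exactly what an ML practitioner would call learning; the difference is mostly one of vocabulary and community.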
I'd say this reflects the fact that time series analysis was already very well established, and had developed its own terminology, by the time ML emerged, so few time series analysts think of what they are doing as machine learning (just as few statisticians think of regression as ML; it's the ML community that files established methods under ML nomenclature).
Conversely, the ML community has not done very much with time series per se, and "classical" ML algorithms like neural networks have not been notably successful in the sense of clearly outperforming the classical time series algorithms at forecasting. If you model the time dynamics in an ML algorithm, you are already pretty close to an ARIMA model; if you don't, you miss out on a lot of structure that would help with forecasting.