I'm wondering if it's appropriate to use factor analytic methods in the study of the following case:
I have time series data describing the incomes of 12 companies. Each company has three main sources of income, so 36 time series in total. I have about 30 data points for each of these series (quarterly data).
It's hypothesized that the incomes are affected by certain market indicators. Approximately 50 potential indicators have been proposed, mainly stock market indices, currency exchange rates, and the like. Data for each of them is freely available (at monthly frequency or better).
The objective is to forecast each of the 36 company income series. But 30 data points are not enough to regress on 50 potential variables, so the idea is to use factor analysis to extract the relevant underlying factors and narrow the set down to a manageable number. With only a few factors as predictors, 30 data points might be enough for a regression.
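To make the idea concrete, here is a rough sketch of what I have in mind, using scikit-learn's `FactorAnalysis` and synthetic stand-in data in place of the real series. This is plain static factor analysis, ignoring the time dimension for now (which is exactly what my second question below is about); the dimensions match my setup of 30 quarterly observations and 50 indicators:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic stand-in for the real data: 30 quarterly observations of
# 50 market indicators and one company income series, all driven by
# a small number of hidden market factors.
n_obs, n_indicators, n_factors = 30, 50, 5
latent = rng.normal(size=(n_obs, n_factors))            # hidden factors
loadings = rng.normal(size=(n_factors, n_indicators))   # factor loadings
indicators = latent @ loadings + 0.1 * rng.normal(size=(n_obs, n_indicators))
income = latent @ rng.normal(size=n_factors) + 0.1 * rng.normal(size=n_obs)

# Step 1: reduce the 50 indicators to a handful of factor scores.
fa = FactorAnalysis(n_components=n_factors, random_state=0)
scores = fa.fit_transform(indicators)   # shape (30, 5)

# Step 2: regress one income series on the factor scores.
reg = LinearRegression().fit(scores, income)
print(scores.shape, round(reg.score(scores, income), 3))
```

The same two-step procedure would be repeated for each of the 36 income series, reusing the one set of extracted factors.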
- Is this approach reasonable? Could it be improved somehow?
- From what I read, factor analysis methods aren't originally designed for time series, but they can be extended to handle them. I found two such approaches: dynamic factor analysis (DFA) and time series factor analysis (TSFA). Which one would be better in this case?