Is it at all possible to fit a probability density function to a time series in R (with existing functions)? I have a large set of time series describing frequencies of Google searches over 60 time points. What I want to do is fit several pdfs to each of them in a loop with the fitdist() function from the fitdistrplus package: Weibull, inverse Weibull, lognormal, and possibly others.
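For concreteness, the loop I have in mind looks roughly like this (a sketch only: the vector name `series` and the placeholder data are made up, and fitting `"invweibull"` would additionally require the actuar package, which supplies the d/p functions fitdist() needs):

```r
library(fitdistrplus)
# library(actuar)  # would be needed for "invweibull" (inverse Weibull)

# Placeholder for one series of search frequencies (60 time points)
series <- rweibull(60, shape = 2, scale = 5)

# Candidate distributions; "invweibull" could be added with actuar loaded
dists <- c("weibull", "lnorm")

# Fit each candidate distribution to the same data
fits <- lapply(dists, function(d) fitdist(series, d))
names(fits) <- dists

# Compare the fits, e.g. by AIC
sapply(fits, function(f) f$aic)
```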
This sequence data is ordered, however, and that ordering is lost when fitdist() fits a distribution to the empirical distribution of the search-frequency values (which then looks completely different from the distribution over time).
Is there a way to use fitdist() that retains the time index? What I have done up until now is simply map the search frequencies at each time point to occurrences of that time point in a new variable (so if there are 3 searches at time 2, the value 2 appears 3 times in this new variable).
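In code, the mapping described above amounts to repeating each time index by its search count and fitting the distribution to those repeated indices; the vector `freq` below is a made-up example:

```r
library(fitdistrplus)

# Hypothetical counts: freq[t] = number of searches at time point t
freq <- c(3, 7, 12, 9, 4, 1)

# Repeat each time index by its count: time 1 appears 3 times,
# time 2 appears 7 times, and so on
times <- rep(seq_along(freq), times = freq)

# Fit a distribution over the time axis rather than over the values
fit <- fitdist(times, "weibull")
```

Note that rep() requires integer counts, so averaged frequencies have to be rounded first (e.g. `rep(seq_along(freq), times = round(freq))`), which is exactly where the bias concern comes in.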
But the search frequency data is averaged, so I fear that I would introduce a great deal of bias by rounding to integer counts. Are there any other solutions for fitting a probability distribution to time series data?
P.S.: I know this question is similar to this one, but that one is already a year old with no answers, and I am more concerned with how to do this in R specifically.