
What would be some practical scenarios where we collect data at time-points $g(t_1), \ldots, g(t_n)$, where $g$ is an increasing function? For example, $g(t) = \exp(t)$ or $\ln t$.

To be clearer: in which real-life scenarios do we consider the dataset $\{X_{g(1)},\ldots, X_{g(n)}\}$ instead of $\{X_1, \ldots, X_n\}$?
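A toy sketch of the two sampling schemes, assuming NumPy; the Brownian-motion-like random walk is only an illustrative choice of process, not part of the question:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_at(g, n):
    """Return (times, values): a process X observed at t = g(1), ..., g(n)."""
    times = g(np.arange(1, n + 1).astype(float))
    # Toy process: a continuous-time random walk, so each increment's
    # standard deviation scales with the gap between observation times.
    gaps = np.diff(times, prepend=0.0)
    values = np.cumsum(rng.normal(scale=np.sqrt(gaps)))
    return times, values

log_times, _ = sample_at(np.log, 5)  # gaps shrink: observations speed up
exp_times, _ = sample_at(np.exp, 5)  # gaps grow: observations slow down
```

With $g(t) = \ln t$ the gaps $\ln(k+1) - \ln k$ shrink, so observations arrive ever faster; with $g(t) = \exp(t)$ the gaps grow, so observations thin out over time.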

kjetil b halvorsen
Joy

2 Answers


Any irregular time series fits your description. These are time series where observations are not taken at equally spaced intervals but at irregular ones; the observations are still in time order.
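A minimal sketch of such a series, assuming pandas; the dates and values are made up for illustration:

```python
import pandas as pd

# An irregular daily series: daily readings, then a multi-day gap.
dates = pd.to_datetime(
    ["2019-01-01", "2019-01-02", "2019-01-03",  # daily...
     "2019-01-07", "2019-01-08"]                # ...then a 4-day gap
)
weight = pd.Series([80.1, 80.3, 80.0, 79.6, 79.8], index=dates)

# Gaps between observations are uneven: 1 day, 1 day, 4 days, 1 day.
gaps = weight.index.to_series().diff()
```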

As an example, I offer the time series of my weight over time. Yes, I am the kind of person who weighs themselves every morning and then documents it - over four years. The "irregular" aspect comes in because I sometimes travel and don't get to my bathroom scale for a couple of days. Look closely and you will see it. (I removed the vertical axis annotation to forestall inappropriate comments.)

[Figure: irregular time series of daily weight measurements]

Stephan Kolassa
  • I think what I asked is different from an irregular time series. In an irregular time series, at a given time-point we cannot say exactly when the next observation will be observed. But in my scenario, if I am standing at the time point $\ln (2)$, I expect the next observation to arrive after $\ln(3) - \ln(2)$ time-units. I want observations at unevenly spaced time-points, but unevenly spaced in a deterministic way. – Joy Jan 30 '19 at 19:50

I can think of a scenario where you have an "IoT"/Edge/Fog device that gets updated with trained models to detect features. Initially, while the models are untrained, you may want to transmit a lot of live data. Once you have trained the models (say, for anomaly detection), the need for large amounts of data may drop.

You may still want to collect data over time, though, to catch a real-life process whose behaviour changes over time and drifts away from the training dataset, but you would gradually move to a "sanity" check. One way could be switching from a live full-rate data feed to bursts; another, decreasing the sample rate. Which one fits would depend on the application.

So, instead of measuring the temperature in an air-conditioned room every 10 seconds: after you have acquired these data for a year (for the sake of an example), and if your model was good enough after being trained on one full seasonal cycle (half a year would often suffice), you could decrease the sample rate, thus saving transmission cost or freeing bandwidth.
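One hedged sketch of such a schedule: let the interval between readings grow geometrically once the model is trained. The function name, growth factor, and cap below are illustrative assumptions, but geometric interval growth yields observation times that grow roughly like $\exp(ck)$ in the reading index $k$, which is exactly the question's $g(t) = \exp(t)$ case:

```python
def schedule(n, base=10.0, growth=1.05, cap=3600.0):
    """Hypothetical sampling schedule: start at `base` seconds between
    readings and let the interval grow geometrically, capped at `cap`
    seconds (here, at most one reading per hour)."""
    t, times = 0.0, []
    interval = base
    for _ in range(n):
        times.append(t)
        t += interval
        interval = min(interval * growth, cap)
    return times

# With growth > 1, the gaps between readings form a geometric sequence,
# so the k-th reading time grows exponentially in k until the cap binds.
```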

KlausGPaul