Apologies if "regularity" is the wrong term; I cannot find the actual term I am looking for.
I'm looking for a method of calculating how "regular" the rhythm of a sequence of musical notes is.
More generally, I'm looking for a method of calculating how "regular" the gaps between numerical values are.
I have data in the form of an array containing the millisecond occurrences of the notes, e.g.
[1000, 2000, 3000, 4000, 4500, 5000]
I have tried (unsuccessfully) methods such as taking the difference in time between each note and the next, then calculating the standard deviation of those differences.
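Roughly, the attempt looked like this (a minimal TypeScript sketch; the exact standard-deviation variant may not match whatever produced the numbers below):

```typescript
// Inter-onset intervals: the gap between each note and the next.
function deltas(onsets: number[]): number[] {
  const gaps: number[] = [];
  for (let i = 1; i < onsets.length; i++) {
    gaps.push(onsets[i] - onsets[i - 1]);
  }
  return gaps;
}

// Population standard deviation of a list of numbers.
function stdDev(values: number[]): number {
  const mean = values.reduce((sum, v) => sum + v, 0) / values.length;
  const variance = values.reduce((sum, v) => sum + (v - mean) ** 2, 0) / values.length;
  return Math.sqrt(variance);
}

// "Regularity" score: lower should mean a more regular rhythm.
function regularity(onsets: number[]): number {
  return stdDev(deltas(onsets));
}
```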
However, this produces nonsense answers, especially in cases where there are breaks. For example:
[1000, 10000, 11000, 12000] // 2178.013025842761
[1000, 1333, 1500, 1666, 2000] // 73.030815413769
So this is definitely not the way to go about it.
In the top example, all of the deltas ([9000, 1000, 1000]) are exact multiples of a common value (1000 ms). In the bottom example, the deltas ([333, 167, 166, 334]) only share a common value approximately, not exactly.
I'm asking because I'm wondering whether there is a statistical term for the measurement I'm looking for, and whether there are any known approaches to this problem (perhaps involving common multiples).
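To illustrate the kind of thing I mean by "involving common multiples" (just a toy sketch, not something I expect to be the right answer): divide each gap by the smallest gap and measure how far each ratio is from a whole number.

```typescript
// Toy illustration only: how close the gaps are to being whole-number
// multiples of the smallest gap. Picking the smallest gap as the base is
// an arbitrary choice, and this ignores tempo drift entirely.
function multipleError(onsets: number[]): number {
  const gaps: number[] = [];
  for (let i = 1; i < onsets.length; i++) {
    gaps.push(onsets[i] - onsets[i - 1]);
  }
  const base = Math.min(...gaps);
  const errors = gaps.map((g) => {
    const ratio = g / base;
    return Math.abs(ratio - Math.round(ratio)); // distance from the nearest integer
  });
  return errors.reduce((sum, e) => sum + e, 0) / errors.length; // mean error
}

// multipleError([1000, 10000, 11000, 12000]);    // gaps [9000, 1000, 1000] -> ratios 9, 1, 1 -> 0
// multipleError([1000, 1333, 1500, 1666, 2000]); // gaps [333, 167, 166, 334] -> small but non-zero
```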
I do not have a very advanced background in statistics, and what I'm asking for could be completely incoherent. Apologies, and thanks in advance.