What are some good ways of quantitatively characterizing the degree of periodicity of an approximately periodic signal $f(t)$, $t \in \mathbb{R}$?
I need this to tell whether the output of some system is periodic, and to decide which system's output (out of a given set) is "more" periodic, i.e. better approximates a perfectly periodic signal.
I'm interested both in purely mathematical definitions for $-\infty < t < \infty$ and in practical ways of calculating this periodicity measure for a finite, time-discretized dataset. (Note that I expect a useful measure might, for obvious reasons, give higher periodicity for longer samples of the same signal---but I'd like something that works for a series of any length.) The measure should (theoretically) be the same for infinitely long, perfectly periodic signals, regardless of signal shape.
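For concreteness, here is the kind of thing I have in mind for the discretized case: a sketch (in Python/NumPy) of one candidate measure, the height of the dominant non-zero-lag peak of the normalized autocorrelation. The function name `periodicity_score` and the local-minimum heuristic for skipping the zero-lag peak are my own ad hoc choices, not an established definition:

```python
import numpy as np

def periodicity_score(x):
    """Candidate measure: height of the dominant non-zero-lag peak of
    the normalized autocorrelation. Roughly 1 for a long perfectly
    periodic signal, near 0 for aperiodic noise."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()                       # remove DC offset
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]
    if acf[0] <= 0:                        # constant signal: no variation
        return 0.0
    acf /= acf[0]                          # normalize so lag 0 equals 1
    # Skip the zero-lag peak: find the first local minimum, then take
    # the largest value after it (the dominant-period peak).
    interior = acf[1:-1]
    minima = np.where((interior < acf[:-2]) & (interior < acf[2:]))[0]
    if minima.size == 0:                   # ACF never turns back up
        return 0.0
    first_min = minima[0] + 1
    return float(max(acf[first_min:].max(), 0.0))
```

For a long, perfectly periodic signal this approaches 1 regardless of waveform shape, and for noise it stays near 0, which is roughly the behavior I'm after---but I'd welcome better-founded alternatives.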
EDIT: Sample signals: http://ge.tt/91Zniy9
These signals are always between 0 and 1. I realized that the amplitude is also very relevant for my particular application (very low-amplitude signals should not be considered "periodic"), but any solution can easily be modified to take this into account. I mention it to point out that explicitly amplitude-dependent solutions are also useful.
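For example, under the assumption that amplitude should simply gate the score, one could multiply the autocorrelation-based sketch above by a saturating amplitude factor (the cutoff `amp_threshold` below is a made-up parameter):

```python
import numpy as np

def amplitude_weighted_score(x, amp_threshold=0.05):
    """Gate the periodicity score by amplitude so that near-constant
    signals score near zero. `amp_threshold` is an arbitrary cutoff;
    `periodicity_score` is the sketch from above."""
    amp = np.ptp(np.asarray(x, dtype=float))   # peak-to-peak, in [0, 1]
    weight = min(amp / amp_threshold, 1.0)     # saturates at 1
    return weight * periodicity_score(x)
```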