This depends on how you think the variation in your numbers should scale. If you think the absolute variation about the mean should scale with the mean (e.g. $10 \pm 1$ is about the same as $100 \pm 10$), then the CV is a fine choice because it controls for this. If, however, you think the amount of variation shouldn't depend on the mean ($10 \pm 1$ is about the same as $100 \pm 1$), then dividing by the mean is not desirable, and you should use something like the variance or the standard deviation ($\sigma$). For roughly normal data, about 95% of values fall within $\mu \pm 1.96\sigma$. You could even use the range (max $-$ min), though that is more susceptible to outliers.
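If it helps to see these quantities side by side, here is a minimal Python sketch computing the standard deviation, CV, range, and the $\mu \pm 1.96\sigma$ interval; the timing values are made up purely for illustration:

```python
import statistics

# Hypothetical elapsed-time measurements (seconds); substitute your own data.
times = [98.0, 102.5, 95.4, 101.1, 99.7, 103.2]

mean = statistics.mean(times)
sd = statistics.stdev(times)           # sample standard deviation, same units as the data
cv = sd / mean                         # coefficient of variation (unitless proportion)
value_range = max(times) - min(times)  # range; sensitive to outliers

# Interval covering roughly 95% of values if the data are approximately normal
lower, upper = mean - 1.96 * sd, mean + 1.96 * sd

print(f"mean = {mean:.2f}, sd = {sd:.2f}, cv = {cv:.1%}")
print(f"range = {value_range:.2f}")
print(f"~95% interval: [{lower:.2f}, {upper:.2f}]")
```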
As far as whether a particular statistic is "big" or "small", that depends entirely on what you're measuring and how you feel about it. The standard deviation is in the same units as your measurements; the CV is a proportion (a percentage if you multiply by 100). You say your data are elapsed times. A 20% CV could be fine for something where you expect wide variation, say how long a whale stays underwater, but it sounds very high for something that should be regular. Similarly, a standard deviation of 1 minute for how long it takes me to walk from my office to another building seems reasonable; if it is 1 hour, I'm much less happy about the situation.
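To make the units concrete, here is a tiny hypothetical calculation (the walk and flight durations are made up) showing how the same CV translates into very different absolute spreads depending on the mean:

```python
cv = 0.20  # a 20% coefficient of variation

# Same CV, very different absolute spread, because the means differ.
walk_mean = 10.0     # minutes for a short walk (hypothetical)
flight_mean = 600.0  # minutes for a long flight (hypothetical)

print(f"walk:   sd = {cv * walk_mean:.1f} minutes")    # 2.0 minutes
print(f"flight: sd = {cv * flight_mean:.1f} minutes")  # 120.0 minutes
```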