
Maybe there is already a question similar to mine, but there are so many involving the term "accuracy", and none of them — except perhaps How to evaluate instrumentation accuracy? — seemed very similar.

A simple question: the task is to measure a device, and f(x) has to be determined to within a given accuracy. For example, given a voltage measurement device, you have to check whether the output voltage is accurate to better than 0.1 V. How would one proceed? Is it possible using only the data points? I assume it isn't trivial, since several error sources probably have to be taken into account — EoV too, maybe?

Ben
    This is very vague. Are you asking about a confidence interval? – user2974951 Jul 15 '19 at 07:04
  • Maybe. I forgot to ask whether a confidence interval would represent an estimation accuracy. That is, I would need to satisfy a given confidence interval, rather than performing some methods and then seeing which confidence intervals they happen to have. Is that even possible? Or do I have to trial and error and hope that the confidence interval comes out as required? – Ben Jul 16 '19 at 06:18
  • A confidence interval would tell you the range of plausible values of your estimate. So, for example, if your estimated accuracy were 0.5 and your 95 % CI were [0.4, 0.6], that would give you a sort of range of likely values. – user2974951 Jul 16 '19 at 07:36
  • Thanks, but I won't know this until I've already done everything, will I? – Ben Jul 16 '19 at 08:07
  • You estimate the CI after the fact, just like any other estimate. Maybe you are looking for some sort of power analysis? – user2974951 Jul 16 '19 at 12:10
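To make the confidence-interval idea from the comments concrete, here is a minimal sketch. It assumes a hypothetical setup (not given in the question): repeated readings of a known 5.000 V reference, from which we estimate the mean measurement error and a 95 % interval for it, then check whether that interval lies inside the ±0.1 V spec. The normal approximation is used for simplicity; a t-interval would be more appropriate for small samples.

```python
import statistics

# Hypothetical data: repeated readings of a known 5.000 V reference
readings = [5.03, 4.98, 5.05, 4.97, 5.02, 5.01, 4.99, 5.04]
reference = 5.000

# Measurement errors relative to the reference
errors = [r - reference for r in readings]
mean_err = statistics.mean(errors)
se = statistics.stdev(errors) / len(errors) ** 0.5

# 95 % CI for the mean error (normal approximation; use a
# t-quantile instead for small n)
z = statistics.NormalDist().inv_cdf(0.975)
ci = (mean_err - z * se, mean_err + z * se)
print("95 % CI for mean error:", ci)

# One loose reading of "better than 0.1 V precise": the whole
# interval for the mean error lies within ±0.1 V of zero
meets_spec = -0.1 < ci[0] and ci[1] < 0.1
print("meets 0.1 V spec:", meets_spec)
```

Note this only bounds the *mean* error; a full uncertainty budget would also account for the spread of individual readings and any systematic errors of the reference itself.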

0 Answers