I want to find confidence intervals for my scenario, and I keep seeing a sentence like this in many research papers related to my work: "Confidence interval of less than 1% of the average value." What does it mean? Could you also explain it with the example given below?
E.g., say for x = 100, y = 0.01. This 0.01 is the average obtained by running the code 10 times at x = 100. I will use those 10 values to compute the confidence interval at this x = 100 point. I hope this is the right way to compute the interval.
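In case it helps, here is roughly what I have in mind for a single x point (a minimal sketch in Python, assuming a t-based 95% interval; the 10 values below are placeholders I made up, not my actual results):

```python
import numpy as np
from scipy import stats

# Placeholder values standing in for the 10 blocking values at x = 100
# (in practice these would come from my 10 simulation runs)
runs = np.array([0.009, 0.011, 0.010, 0.012, 0.008,
                 0.010, 0.011, 0.009, 0.010, 0.010])

mean = runs.mean()
sem = stats.sem(runs)        # standard error of the mean, s / sqrt(n)
n = len(runs)

# 95% confidence interval from the t distribution with n - 1 degrees of freedom
ci_low, ci_high = stats.t.interval(0.95, df=n - 1, loc=mean, scale=sem)
half_width = (ci_high - ci_low) / 2

print(f"mean = {mean:.4f}, 95% CI = [{ci_low:.4f}, {ci_high:.4f}], "
      f"half-width = {half_width:.5f}")
```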
Context as asked: I am plotting a blocking value (y-axis) against traffic load (x-axis). The x-axis values run from 100, 200, up to (say) 800, and the y-axis values are of the form 0.006, 0.01, 0.2, and so on. I run the code to generate the y value for a given x, repeat this 10 times, and take the average, so each blocking value plotted against a given x is an average of 10 runs. I want to plot a confidence interval for each x point, so I assume I have to use those 10 values as the sample values to calculate the interval. Is that correct? If so, I want to know the meaning of the statement I quoted at the top in this context.
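And this is how I would extend that to the whole plot (again just a sketch under the same assumption of a t-based interval; the per-load data here are random placeholders, not my simulation output):

```python
import numpy as np
from scipy import stats
import matplotlib.pyplot as plt

def ci_half_width(samples, confidence=0.95):
    """Half-width of a t-based confidence interval for the mean of `samples`."""
    samples = np.asarray(samples, dtype=float)
    n = len(samples)
    t_crit = stats.t.ppf((1 + confidence) / 2, df=n - 1)
    return t_crit * stats.sem(samples)

# Placeholder structure: 10 run results per traffic load.
# In practice each entry would hold the 10 blocking values from my code.
loads = np.arange(100, 900, 100)                      # 100, 200, ..., 800
results = {load: np.random.uniform(0.005, 0.02, 10)   # made-up data
           for load in loads}

means = np.array([results[load].mean() for load in loads])
halves = np.array([ci_half_width(results[load]) for load in loads])

plt.errorbar(loads, means, yerr=halves, fmt='o-', capsize=3)
plt.xlabel("Traffic load")
plt.ylabel("Blocking value")
plt.title("Mean of 10 runs with 95% confidence intervals")
plt.show()
```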