I'm trying to compare the performance of different configurations. In each test, one machine generates requests to a server for x minutes.
The output is: (1) the number of attempts, (2) the number of successful requests, and (3) the time taken by each request.
My problem is that (as an example) most of the requests take about a second or less, and then there are a few requests that take 120 seconds.
I need to produce clear and simple output graphs. So:
A. Is there a proper way (a formula) to "ignore" results that are larger than some cutoff x? I can simply omit those results before averaging, but I was wondering whether there is a more elegant way to build it into a formula.
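To make the question concrete, here is a small sketch (with made-up numbers in the shape of my data) of what I'm doing now, omitting results above a cutoff, next to a percentile-based cap, which I understand is one common alternative. The cutoff value and the sample are just illustrative assumptions:

```python
import numpy as np

# Hypothetical request times in seconds: mostly ~1 s, a few 120 s outliers
times = np.array([0.8, 0.9, 1.1, 1.0, 0.7, 120.0, 1.2, 0.9, 120.0, 1.0])

cutoff = 10.0  # the "x" in my question: anything slower is treated as an outlier

plain_mean = times.mean()                       # dominated by the 120 s requests
trimmed_mean = times[times <= cutoff].mean()    # what I do now: drop outliers entirely

# A percentile cap (winsorizing) keeps the sample size the same but
# clips extreme values down to, e.g., the 80th percentile
capped = np.minimum(times, np.percentile(times, 80))
capped_mean = capped.mean()
```

The trimmed mean drops the outliers completely, while the capped (winsorized) mean keeps every sample but limits the influence of the extreme ones; I'm asking whether something like the latter counts as the "more elegant formula" I'm after.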
EDIT: Deleted the second question.