I have a simple JMeter throughput test that spawns 20 threads, each executing a simple SQL query against a database. I have set a ramp-up time of 10 seconds and a total test time of 70 seconds.
When I execute the test in non-GUI mode I see the following summary output:
summary + 1 in 0.1s = 7.4/s Avg: 135 Min: 135 Max: 135 Err: 0 (0.00%) Active: 1 Started: 1 Finished: 0
summary + 137501 in 28.5s = 4831.0/s Avg: 3 Min: 1 Max: 614 Err: 0 (0.00%) Active: 20 Started: 20 Finished: 0
summary = 137502 in 29s = 4796.9/s Avg: 3 Min: 1 Max: 614 Err: 0 (0.00%)
summary + 171000 in 30s = 5703.8/s Avg: 3 Min: 1 Max: 519 Err: 0 (0.00%) Active: 20 Started: 20 Finished: 0
summary = 308502 in 59s = 5260.8/s Avg: 3 Min: 1 Max: 614 Err: 0 (0.00%)
summary + 61016 in 11.5s = 5309.0/s Avg: 3 Min: 1 Max: 518 Err: 0 (0.00%) Active: 0 Started: 20 Finished: 20
summary = 369518 in 70.1s = 5268.9/s Avg: 3 Min: 1 Max: 614 Err: 0 (0.00%)
As you can see, the throughput is low in the first 30 seconds but picks up later. I understand that this could be due to the threads still ramping up and other system components warming up.
Is there a way to exclude the first "X" seconds' worth of startup samples from the final calculation? In custom performance tests I have written before, I always excluded the first "X" seconds, until the system reached a steady state, before measuring the output.
Is there any way I can do that in JMeter?
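For context, this is the kind of post-filtering I have in mind, written as a minimal sketch against a CSV results (JTL) file. It assumes the default CSV output with timeStamp (epoch milliseconds) and elapsed (response time in milliseconds) columns; the file name and warm-up length are placeholders:

import csv

WARMUP_SECONDS = 30          # "X" seconds to exclude (placeholder)
JTL_PATH = "results.jtl"     # placeholder path to the JMeter results file

with open(JTL_PATH, newline="") as f:
    rows = list(csv.DictReader(f))

# Drop every sample that started within the warm-up window.
start_ms = min(int(r["timeStamp"]) for r in rows)
cutoff_ms = start_ms + WARMUP_SECONDS * 1000
steady = [r for r in rows if int(r["timeStamp"]) >= cutoff_ms]

# Recompute the headline numbers over the steady-state window only.
elapsed = [int(r["elapsed"]) for r in steady]
duration_s = (max(int(r["timeStamp"]) for r in steady) - cutoff_ms) / 1000.0

print(f"samples:     {len(steady)}")
print(f"throughput:  {len(steady) / duration_s:.1f}/s")
print(f"avg/min/max: {sum(elapsed) / len(elapsed):.1f} / {min(elapsed)} / {max(elapsed)} ms")

I would prefer something built into JMeter itself rather than post-processing the results file like this.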