It is important to know what to look for in the results. The primary goal is to establish whether the application has met its performance goals. If it has, other types of performance testing, such as stress and soak testing, may be carried out; if it has not, bottleneck analysis should follow.
Application performance requirements can be expressed in a number of ways, but they boil down to three main measures. The diagram to the left illustrates that these measures are related to each other and that each relates directly to application performance.
Response Time: This is the time it takes for the web application to respond to a user request and hence is the amount of time a user will wait while the application is under load.
Transaction Rate: The number of actions the application can manage while it is under load.
Concurrent Users: The number of users the application can service while under load.
All of these measures are related. For example, the transaction rate should rise as the number of concurrent users increases (more users demanding more activity), but it can also fall as users are added if the system becomes overloaded. Response time should remain roughly constant, but once the transaction rate and/or the number of concurrent users reaches the point where the system is overloaded, response time will start to degrade. Mapping response time against the number of concurrent users is therefore an effective way to evaluate whether the system is handling the load well or not.
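To make that mapping concrete, here is a minimal sketch in plain Python (not AgileLoad output; the samples and measurement window are hypothetical) that groups completed transactions by concurrent-user level and prints the transaction rate and average response time at each level. A flat response time as users rise indicates the system is coping; a sharp rise marks the overload point.

```python
from collections import defaultdict

# Hypothetical raw samples: (concurrent_users, response_time_seconds) for each
# completed transaction, collected over a fixed measurement window.
WINDOW_SECONDS = 60
samples = [
    (10, 0.41), (10, 0.39), (10, 0.44),
    (50, 0.52), (50, 0.61), (50, 0.58),
    (100, 1.90), (100, 2.40), (100, 2.10),
]

# Group response times by the concurrency level they were measured at.
by_users = defaultdict(list)
for users, rt in samples:
    by_users[users].append(rt)

print(f"{'Users':>6} {'Tx/sec':>8} {'Avg RT (s)':>11}")
for users in sorted(by_users):
    rts = by_users[users]
    tx_rate = len(rts) / WINDOW_SECONDS   # transactions completed per second in the window
    avg_rt = sum(rts) / len(rts)          # average response time at this concurrency level
    print(f"{users:>6} {tx_rate:>8.2f} {avg_rt:>11.2f}")
```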
The final report is vital in communicating the findings of performance testing. If an iterative approach has been followed, there should be no surprises in the final report; indeed, several draft reports may have been produced, one for each result set, before the final report is compiled.

Reporting in AgileLoad is straightforward: automatic reporting, underpinned by the report designer, makes it quick to report on the relevant information. The results of a test can be examined in several ways:
During execution: While the test is running, AgileLoad Center can display the results as they are generated. This is a quick way to understand how the test is going and to decide whether to continue, or to stop for tuning or adjustments before restarting.
Analysis: After the test is complete, the results are stored in AgileLoad Center by Job name and then by execution date. Everything that was available in the run-time reporting is available after the test, along with some further detail.
Report: Using a report template (AgileLoad ships with several templates 'out of the box', and analysts can create their own) is a good way to pull out the relevant information for presentation to the project team. Since generating a report only takes a few seconds, the reporting function can be used after each test if desired; this way stakeholders become used to reading the report format. For the final report, the analyst should add their own comments to the generated report, which might include an executive summary and some observations for each graph.
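For the analyst's commentary, percentile figures are often more telling than averages. The sketch below is purely illustrative (plain Python over a hypothetical list of response times, with an assumed 2-second goal, not AgileLoad's report designer) of the kind of numbers that might back up an executive summary.

```python
def percentile(values, pct):
    """Return the pct-th percentile using the nearest-rank method."""
    ordered = sorted(values)
    index = max(0, int(round(pct / 100 * len(ordered))) - 1)
    return ordered[index]

# Hypothetical response times (seconds) from one test run and an assumed goal.
response_times = [0.8, 1.1, 0.9, 1.4, 2.3, 1.0, 1.2, 3.1, 0.7, 1.5]
goal_seconds = 2.0

p90 = percentile(response_times, 90)
p95 = percentile(response_times, 95)
print(f"90th percentile: {p90:.2f}s, 95th percentile: {p95:.2f}s")
print("Goal met" if p95 <= goal_seconds else "Goal missed")
```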
AgileLoad can also monitor infrastructure such as CPUs, database performance and so on. If the test has been set up with monitoring, the monitors form an important part of the reporting and can be instrumental in identifying bottlenecks. Monitors are not covered in this section; please see the advanced section for details.
Next: Monitoring a Job with AgileLoad Center