I still remember the day the managers at my consulting firm decided to move operations from an expensive licensed tool to a less costly solution offering the same features. We, a team of three performance testers, were handed the responsibility of moving the performance testing activities from LoadRunner to AgileLoad.
We started with requirement gathering, trying to understand the critical project expectations from the tool. We studied the different applications we performance tested, the technologies behind them, their architectures, and the protocols their clients and servers used to communicate. We then began surveying and studying the open-source and other less costly solutions available in the market. We created a tabular matrix listing all the critical features we were seeking against the ones each tool provided (a minimal sketch of such a scoring matrix follows the feature list below).
Some critical features that we sought were –
- Availability of record and replay functionality in the tool
- Infrastructure requirements for the different tools
- Number of concurrent users the tool could support in a test
- Ease of scripting and test execution for the tool
- Ease of migrating scripts and other test artifacts built for LoadRunner to the new tool
- Popularity of the tool among the performance testing community
- Result accuracy
- Results and graphs the tool provided after a test
- Compatibility of the tool with server monitoring tools (we planned to invest in these as well)
- Variety of protocols the tool supported
- Availability of inbuilt functions and their usage in scripting
- Ease of data creation, manipulation and usage during the test
- Compatibility of the tools with Windows servers
- Team’s skill sets and ease of learning the tool
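To make the shortlisting repeatable, a decision matrix like ours can be expressed as a small script. This is only a minimal sketch: the tool names are real, but the criteria weights and the 1-5 scores below are hypothetical placeholders for illustration, not our actual ratings.

```python
# Minimal sketch of a weighted decision matrix for tool shortlisting.
# Weights and scores are hypothetical placeholders, not our real ratings.

# Relative importance of each criterion (higher = more important)
weights = {
    "record_and_replay": 3,
    "concurrent_users": 3,
    "ease_of_scripting": 2,
    "script_migration": 2,
    "protocol_support": 2,
    "learning_curve": 1,
}

# Score each candidate tool per criterion on a 1-5 scale (illustrative)
scores = {
    "Pylot":     {"record_and_replay": 2, "concurrent_users": 2, "ease_of_scripting": 2,
                  "script_migration": 1, "protocol_support": 1, "learning_curve": 2},
    "JMeter":    {"record_and_replay": 4, "concurrent_users": 3, "ease_of_scripting": 4,
                  "script_migration": 3, "protocol_support": 4, "learning_curve": 4},
    "AgileLoad": {"record_and_replay": 4, "concurrent_users": 5, "ease_of_scripting": 4,
                  "script_migration": 3, "protocol_support": 4, "learning_curve": 4},
}

# Weighted total per tool, printed best-first
totals = {
    tool: sum(weights[c] * s for c, s in crit.items())
    for tool, crit in scores.items()
}
for tool, total in sorted(totals.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{tool:10s} {total}")
```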
The table gave us a clear snapshot of each tool's functionality and helped us shortlist a few that could be a good fit for the projects' requirements: Pylot, AgileLoad and JMeter. The next part of the project was to use these tools on real applications and projects to test their effectiveness and usability and to understand their reliability. Removing Pylot from the list was easy: the team faced a lot of issues learning a new scripting language, and we noticed the tool wasn't efficient for HTTPS-based applications, while most of our internal applications were web-based with SSO authentication.
Choosing between AgileLoad and JMeter was far more difficult than we expected and consumed a lot of time. Both offered compelling reasons to be used in the project. With JMeter we succeeded in running a 500 concurrent user test, but with AgileLoad we were able to run a 1200 concurrent user test with no blockers or issues during the test. AgileLoad's result analysis, data manipulation, server monitoring and text check features also proved more capable. Furthermore, when we compared the results of tests executed with AgileLoad against the corresponding production scenarios, we found the results more reliable. Over time, AgileLoad gained popularity among the team members because of its simplicity and its broader range of functionality.
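For readers unfamiliar with what a "concurrent user test" means in practice, here is a toy illustration in Python. It is not how LoadRunner, JMeter or AgileLoad work internally, just a minimal sketch of many simulated users hitting a target at once; the URL and user count are placeholders.

```python
# Toy concurrent-user sketch: N threads each acting as one simulated user.
# Not representative of any tool's internals; URL and user count are placeholders.
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

TARGET_URL = "https://example.com/"  # placeholder target
USERS = 50                           # simulated concurrent users

def one_user(user_id: int) -> float:
    """Simulate a single user: send one request, return response time in seconds."""
    start = time.perf_counter()
    with urlopen(TARGET_URL, timeout=10) as resp:
        resp.read()
    return time.perf_counter() - start

# One worker thread per simulated user, all firing at roughly the same time
with ThreadPoolExecutor(max_workers=USERS) as pool:
    timings = list(pool.map(one_user, range(USERS)))

print(f"avg response time: {sum(timings) / len(timings):.3f}s over {USERS} users")
```

Real tools add pacing, think time, ramp-up, protocol recording and result correlation on top of this basic idea, which is where products like JMeter and AgileLoad differentiate themselves.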
This, in brief, was how we planned the move from LoadRunner to AgileLoad and helped management reduce the annual tooling cost by 80%.