What are the Challenges in Performance Testing?


Inadequate non-functional requirements for performance testing: Clearly defined requirements are a must for validating that the system is developed as per expectations. They not only give the software engineer the right direction for coding the functionality but also help the software tester validate quality. In simple terms, functional requirements define what activities the system should perform, what user inputs the system can or cannot accept, which algorithms to use to perform actions, and so on. Non-functional requirements, on the other hand, detail acceptable responsiveness, usability, scalability, stability, operability, capacity and many other non-functional aspects of the system. A common challenge in software development and testing is that the functional requirements are good enough to engineer a solution while the non-functional requirements are inadequate. In most cases, non-functional requirements are vague: neither quantifiable nor verifiable. For instance, the response-time expectation is stated as "just acceptable" instead of a specific figure like 5 seconds, or the scalability requirement says only 100 users instead of 100 concurrent users generating 10 transactions per second. In summary, not having enough detail around non-functional requirements poses a challenge to performance validation.

The suggested solution here is to spread awareness of the non-functional aspects of the system early in the software development life cycle by taking part in requirement gathering sessions. One approach is to explain the business or financial impact if non-functional requirements are not defined. Keep in mind that non-functional performance requirements are technical in nature, and the business may not be able to give a lot of technical detail. So business interactions can be driven by questions like: How many users should the system support? What is the business impact if the system goes down when more users than expected are accessing it at the same time? What is the responsiveness tolerance? What is the expected usage of the system five years down the line? The answers to these questions give very high-level information about the performance requirements. Once the answers are provided, they can be discussed with the technical team and refined further. The technical team can give more specific details about what the technology can offer, such as response times for view operations, response times for add operations, server capacity requirements and so on. By combining the inputs from the business and the technical team, the performance requirement challenge can be resolved.
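For illustration, with hypothetical numbers, a refined set of performance requirements coming out of such a discussion might read:

-   View operation (e.g., account summary): 90% of responses within 2 seconds at peak load
-   Add operation (e.g., new deposit): 90% of responses within 5 seconds at peak load
-   Load profile: 100 concurrent users generating 10 transactions per second, sustained for one hour
-   Capacity: application server CPU utilization not to exceed 70% at peak load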

Inability to customize the tool output to develop a load generation script: Once performance requirements are defined, the most critical task in performance testing is to design and develop the test. The key activity in this phase is to develop a load simulation technique by leveraging the record/replay options in load testing tools. This record and replay option depends on the application protocol involved in the client-server communication, and the tool generates the script based on it. For example, for HTTP communication the script will be generated with URLs carrying query strings or form data (a minimal sketch of such a readable step appears after the list below). In most cases, script recording is relatively simple as long as the recording engine is able to capture the client/server traffic and translate it into a readable ASCII/text script. The challenge is that if the recorded data is not directly convertible to ASCII text, the load testing tool will generate a script that cannot be customized and used for load generation. Some common instances are:

-   Applet client talking to the server over HTTP using serialized Java objects. Real world example: Documentum UCF clients talking to the content management server
-   Flash client talking to the server over HTTP using serialized Action Message Format (AMF). Real world example: Kronos Flash client talking to the Gateway server
-   JMS client connecting to the server over TCP/IP. Real world example: TIBCO JMS client talking to the TIBCO EMS server
-   Mobile application talking to the server over encrypted HTTP. Real world example: native mobile application talking to a Sybase SQL Anywhere / Sybase Unwired Platform (SUP) server
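For contrast with the cases above, here is roughly what a directly convertible recording looks like once replayed as code. This is a minimal Java sketch using java.net.http.HttpClient; the host, paths and parameters are hypothetical, and a real tool would generate its own scripting dialect. The point is that a plain query string or form body like this is easy to read and parameterize, whereas a serialized binary payload is not.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RecordedHttpStep {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // A GET with a query string, as a recording engine would capture it
        // (host and parameters are hypothetical).
        HttpRequest search = HttpRequest.newBuilder()
                .uri(URI.create("https://app.example.com/search?item=laptop&page=1"))
                .GET()
                .build();

        // A POST with URL-encoded form data, the other common readable shape.
        HttpRequest login = HttpRequest.newBuilder()
                .uri(URI.create("https://app.example.com/login"))
                .header("Content-Type", "application/x-www-form-urlencoded")
                .POST(HttpRequest.BodyPublishers.ofString("username=user1&password=secret"))
                .build();

        System.out.println(client.send(search, HttpResponse.BodyHandlers.ofString()).statusCode());
        System.out.println(client.send(login, HttpResponse.BodyHandlers.ofString()).statusCode());
    }
}
```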
 
There are two solutions to the above challenge. The first is to look for out-of-the-box support in the load testing tool that can translate the binary network data into a readable format when given the application-specific libraries. By selecting the right protocol and providing the right client application libraries, the tool will translate the recorded binary data into a readable script. Example: the "Java over Http" protocol in LoadRunner for serialized Java communication and the "AMF" protocol in LoadRunner for serialized AMF transfer. If the tool does not support translating the network data into a readable format, record/replay is not a solution; instead, identify the client-side API and code it as your script. This requires knowledge of the application logic and programming experience, hence additional effort. Example: identify the JMS APIs in Java and code them in LoadRunner or any other load testing tool (see the sketch below). Another example is the mobile application case, where the Sybase SQL Anywhere APIs can be coded to test the SQL Anywhere server. The same approach can be followed wherever out-of-the-box support is not available.
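As a concrete illustration of the second solution, below is a minimal sketch of coding the JMS API directly instead of relying on record/replay. The JNDI names, credentials and message payload are hypothetical and depend on the broker configuration (for example, TIBCO EMS); a load testing tool would wrap this logic in its own virtual-user script and handle the timing and reporting.

```java
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.Destination;
import javax.jms.MessageProducer;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.naming.InitialContext;

public class JmsLoadStep {
    public static void main(String[] args) throws Exception {
        // Look up the vendor's connection factory and queue via JNDI
        // (names are hypothetical; JNDI properties must point at the broker).
        InitialContext ctx = new InitialContext();
        ConnectionFactory factory = (ConnectionFactory) ctx.lookup("QueueConnectionFactory");
        Destination queue = (Destination) ctx.lookup("orders.queue");

        Connection connection = factory.createConnection("user", "password");
        try {
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageProducer producer = session.createProducer(queue);

            // One "transaction" of the load test: time this block in the tool.
            long start = System.nanoTime();
            TextMessage message = session.createTextMessage("<order id=\"1001\"/>");
            producer.send(message);
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;
            System.out.println("send took " + elapsedMs + " ms");
        } finally {
            connection.close();
        }
    }
}
```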

Scaled Down Performance Test Environment

Performance testing is typically performed well before the go-live date, and all the performance objectives are based on the capacity of the production environment. Ideally, performance testing needs to be carried out in a production-like environment to identify or predict performance bottlenecks and solve them. But maintaining a production-like environment is expensive, and most companies do not have the budget for it. The end result is a scaled-down performance testing environment. This is one of the most frequently experienced challenges in performance testing.

The best way to handle this issue is to scale down the performance testing objectives in proportion to the scaled-down capacity of the performance environment. This is not easy. As mentioned in Performance Factors, multiple components contribute to performance. For example, if production runs four application server nodes and the test environment has two, a first-order approximation is to target half the production load, but shared resources such as the database rarely scale linearly. If there are multiple variables in the performance environment compared to production, scaling down the performance objectives becomes really hard.

Test Data

Most testers do not realize the importance of test data, but it becomes a big challenge as the project approaches the test execution phase. Test data comes in two different forms: the data used by the load test scripts and the data already in the database. Test data used by the script should be similar to the data used by the end user.

For example, if the script is to simulate 10,000 deposit transactions to a banking application, unique account numbers need to be used in the script. It becomes even more challenging to repeat the same test, because each run consumes or modifies the account data that the previous run used.
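One common way to handle the uniqueness part in the script layer is to draw each account number from a shared, thread-safe counter, so every virtual user and iteration gets a distinct value. A minimal Java sketch, assuming a reserved test account range (the range and the submitDeposit step are hypothetical):

```java
import java.util.concurrent.atomic.AtomicLong;

public class AccountNumberFeeder {
    // Start from a known test range so generated accounts never collide with
    // real seed data (the range is hypothetical; align it with the test database).
    private static final AtomicLong NEXT = new AtomicLong(9_000_000_000L);

    /** Hands each virtual user / iteration a unique account number. */
    public static String nextAccountNumber() {
        return Long.toString(NEXT.getAndIncrement());
    }

    public static void main(String[] args) {
        // Simulate 10,000 deposit transactions, each needing a distinct account.
        for (int i = 0; i < 10_000; i++) {
            String account = nextAccountNumber();
            // submitDeposit(account, amount) would be the scripted request here.
        }
        System.out.println("last issued: " + NEXT.get());
    }
}
```

Most load testing tools offer an equivalent built-in mechanism, such as unique-value parameters or per-user data files, which achieves the same effect without custom code.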

 

Solution: To repeat the test with the same set of data, back up the database and restore it before each test. In this way, the test data can be reused and the test can be repeated without additional effort.
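A minimal sketch of automating that restore step, assuming a PostgreSQL database, a custom-format dump file and pg_restore available on the PATH (the database name and backup path are hypothetical):

```java
import java.io.IOException;

public class ResetTestData {
    /**
     * Restores the performance database from a known-good backup before a test run.
     * Assumes PostgreSQL with a custom-format dump and pg_restore on the PATH;
     * the database name and file path are hypothetical.
     */
    public static void restore() throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(
                "pg_restore", "--clean", "--dbname=perfdb", "/backups/perfdb.dump");
        pb.inheritIO(); // stream pg_restore output to the console
        int exitCode = pb.start().waitFor();
        if (exitCode != 0) {
            throw new IllegalStateException("restore failed with exit code " + exitCode);
        }
    }

    public static void main(String[] args) throws Exception {
        restore(); // run once before every load test so each run sees identical data
    }
}
```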


