
Managing Size of Automation Test Suite Using SQL and Visualization Tools


Hi, I’m Tapas. I work in a supervisory role in which my duties include tracking and monitoring test deliverables for an ERP product. As part of my current role, I work with data for various decision-making activities, so I decided to take it one step further and learn the concepts and techniques behind Data Science and Business Analytics.

It became crucial to control the size of the automation suite, i.e. the number of tests in our test repository. Approximately 300 tests were being added to the suite in each release, so over the years the suite had grown huge. A full automated run took more than three days to complete, and debugging the failures took the team almost a week. This blocked resources in test-management activity that could otherwise have been used to increase test coverage, so there was an ask to reduce the test suite to optimize run time and improve resource utilization.

Automation execution results, pass percentages, and failure rates for each script were extracted as a dump from our in-house test scheduling and orchestration database. This data was broken down to a granular level: tests were split into functional categories and sub-categories, and data was collected on how much customers used each category and sub-category. The tests were then organized by the importance of their functional categories and by customer usage, and were also classified by year of creation, the number of failures they had reported over the years, and so on. Finally, the less significant tests were identified and suggested for removal from the main automation suite.
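The classification step above can be sketched in a few lines of Python. The field names (`category`, `created_year`, `failures`) and the age bands are assumptions for illustration, not the actual schema of our dump:

```python
# Minimal sketch of grouping tests by functional category and age band,
# assuming hypothetical fields in the extracted execution-results dump.
from collections import defaultdict

# Hypothetical rows from the dump.
tests = [
    {"id": "T1", "category": "Payroll", "created_year": 2016, "failures": 0},
    {"id": "T2", "category": "Payroll", "created_year": 2019, "failures": 4},
    {"id": "T3", "category": "Inventory", "created_year": 2015, "failures": 1},
]

# Bucket tests by (category, age band) so counts per bucket can later be
# compared against customer usage of each category.
buckets = defaultdict(list)
for t in tests:
    age_band = "old" if t["created_year"] < 2018 else "recent"
    buckets[(t["category"], age_band)].append(t["id"])

for key, ids in sorted(buckets.items()):
    print(key, ids)
```

In practice each bucket would also carry the failure counts, so that both age and historical stability feed into the significance ranking.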

We used SQL to extract the data, and Excel along with some in-house BI and visualization tools for grouping and segregation. The tools were selected based on the size of the data, compatibility with the requirements, and organizational software-usage policies. Data collection was the biggest challenge, as the source data resided in multiple repositories. Once collected, the data was joined and concatenated to obtain the final dump used for analysis. Look-up tables were created to add extra information to the data, e.g. customer usage by functional category. This was time-consuming because detailed analysis was involved. EDA was performed to understand customer usage patterns across functional categories, the tests available under each category, and trends in the issues reported by tests over time. The main challenge was deciding on criteria for eliminating tests, and here EDA played a vital role by providing insights and statistics based on past execution results.
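The join-plus-lookup step can be illustrated with a runnable sketch. Here sqlite3 stands in for our in-house database, and the table and column names (`results`, `usage_lookup`, `pass_pct`, `customer_usage`) are assumptions, not the actual schema:

```python
# Sketch of joining the execution-results dump with a look-up table that
# adds customer-usage information per functional category.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE results (test_id TEXT, category TEXT, pass_pct REAL);
CREATE TABLE usage_lookup (category TEXT, customer_usage TEXT);
INSERT INTO results VALUES ('T1', 'Payroll', 98.5), ('T2', 'Inventory', 91.0);
INSERT INTO usage_lookup VALUES ('Payroll', 'high'), ('Inventory', 'low');
""")

# LEFT JOIN keeps every test even when its category has no usage entry yet,
# which matters when the look-up table is still being filled in.
rows = con.execute("""
    SELECT r.test_id, r.category, r.pass_pct, u.customer_usage
    FROM results r
    LEFT JOIN usage_lookup u ON r.category = u.category
    ORDER BY r.test_id
""").fetchall()
print(rows)
```

A LEFT JOIN (rather than an INNER JOIN) is the safer choice here: tests in categories missing from the look-up table surface with a NULL usage value instead of silently disappearing from the analysis.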

Several insights were derived. About 10% of the tests in the repository were near-duplicates of other tests, differing only in a few additional steps; these were marked as duplicates and the extra steps were accommodated in the existing scripts. Another 10% of the tests belonged to very stable areas and had never reported an issue over the last three years, so their number was reduced. 20% of the tests belonged to lesser-used functional categories and were moved out of the main repository into a separate suite reserved for on-demand execution only. Going forward, the team agreed to spend more time on analysis before automating tests, combining scenarios wherever possible to reduce duplication; to automate less in the stable areas; and to cover the low-usage functional areas with only a few critical end-to-end scenarios. Once this solution was implemented, it reduced the size of the test repository by 40%, and 30% of the team’s capacity was freed up on a monthly basis.
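The elimination criteria described above (stable for three or more years, or belonging to a low-usage category) can be expressed as a simple filter. The field names, the cutoff year, and the helper name are all hypothetical:

```python
# Hedged sketch of flagging tests for the on-demand suite, assuming
# hypothetical fields (usage, last_failure_year) per test.
CURRENT_YEAR = 2023  # assumption for illustration only

tests = [
    {"id": "T1", "usage": "low",  "last_failure_year": 2019},
    {"id": "T2", "usage": "high", "last_failure_year": 2022},
    {"id": "T3", "usage": "high", "last_failure_year": 2018},
]

def flag_for_on_demand(t):
    # Stable: no failure reported in the last 3 years.
    stable = CURRENT_YEAR - t["last_failure_year"] >= 3
    return t["usage"] == "low" or stable

on_demand = [t["id"] for t in tests if flag_for_on_demand(t)]
print(on_demand)  # T1 (low usage) and T3 (stable for 3+ years)
```

Keeping the criteria in code like this also makes the exercise repeatable: the same filter can be rerun after every release instead of repeating the one-off analysis.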

The exercise helped us go through the repository thoroughly and get a good grip on the existing test coverage. Technically, it was a good hands-on opportunity to work with SQL joins, Excel, and data visualization to draw meaningful results.

Great Learning
