
Yielding Test Architecture Quality for a US-Based Technology Firm to Complement Business Growth

About the Client

The client is a privately held, US-based technology company that maximizes device life and creates economic incentives for all participants in its ecosystem, providing web-based solutions for trading and managing used hardware peripherals. The client plays a pivotal role in the development of the global economy and presents incremental growth opportunities for carriers and businesses alike.

Product Overview

The developed product covered the following functionalities:

  • The web-based channels cover all three basic trading categories: B2B, B2C, and in-store.
  • Along with trading, the product provides an automated way to move traded devices from the customer's hands to the processing center and handles all of the necessary processing.

Background

Initially, the client engaged us under a multi-vendor model, intending that competition among vendors would improve the quality of the project. During our trial phase, we analyzed the existing test architecture and test suites and found various challenges. As a process-oriented organization, we showcased our best process implementations, which lead to bug-free delivery of the product, even ahead of schedule. While discussing the challenges we found in the previous vendor's test infrastructure and processes, we learned that the client had been facing these pains for a very long time, which is what led to the multi-vendor setup.

Product Challenges

  • Business needs were not fulfilled thoroughly, resulting in significant P1 defects that directly impacted the client's business. Test cases created from a business perspective contained various flaws, such as:
    • Test cases did not match the business use cases.
    • The complete business flow was not covered.
    • Cases related to OS and browser compatibility were not catered for.
  • Unviable test suites:
    • Test suites were not structured.
    • Duplicate sets of test cases existed.
    • Test cases remained in the suites for functionality that was already obsolete in the system.
    • Test cases were not shared with the Dev team for review.
    • Test data was not integrated or up to the mark.
  • Performance and load tests had not been performed earlier. The system was never validated in every corner or certified under user load, which resulted in software crashes in production.
  • Bugs slipped through, and eventually business stakeholders started reporting functional flaws in the product after every release.
  • There was no explicit process giving crystal-clear visibility into execution, and a lack of transparency in data (planning, execution, and resources).
  • The automation growth path was blurry, which led to more manual effort and hence opened the scope for errors:
    • Lack of traceability/mapping of test scripts to test cases.
    • No coverage matrices were published, so the client had no clear visibility into progress.
    • No automation execution artifacts were published.
    • No integration between the test management tools and the automation scripts.
    • Very little of the application was covered through automation.

Our Offerings for This Case: Retail Software Testing and Web Testing Services


Solutions Provided

  • Identifying and creating extensive test cases:
    • Covered all possible use cases from the end-user perspective during test case creation.
    • Covered the complete business flow in the test cases, maintaining a requirement traceability matrix to ensure it.
    • Wrote and executed test cases for OS and browser compatibility.
    • Increased test coverage for error-prone areas and executed the test cases of those modules first.
    • Performed out-of-the-box testing to cover all impacted areas, strengthening P1 coverage.
  • Structuring and organizing Test Suites
    • Removed duplicate test cases by structuring them as core-based (common features used in all programs) and feature-based at the program level, avoiding unnecessary copy-pasting of test cases.
    • Integrated test data with the test cases.
    • Created test cases in a timely manner and delivered them for review.
    • Test cases are reviewed and updated daily by the Lead and the team, and cases for functionality that is now obsolete in the system are moved to an archived folder within the test suites.
    • Introduced test case creation for changed features and automated those test cases in parallel.
  • We introduced a process under which no deployment is done without performance testing sign-off. For performance testing, we provided an infrastructure of load test scripts, and every release document now carries detailed threshold stats for these areas.
  • During our trial phase, we observed a lack of domain knowledge among the previous vendor's resources, due to which bugs were slipping past QA. We organized a complete training structure and introduced training via video tutorials.
  • Transparent Execution Process
    • Full transparency in every aspect (planning, execution, resources, etc.).
    • Clear status on each ticket/sprint, with any impediments escalated to management in a timely manner.
    • Initialized and maintained release metrics, collecting and analyzing quality data to predict and control the quality of the software product being developed. These metrics are gathered over time to reveal trends.
  • A clear automation growth path:
    • Mapped each test case to its corresponding automated scenario.
    • Coverage matrices are published with each release, giving the client clear visibility into progress.
    • Automation execution artifacts are published after each regression run.
    • Integrated the automation scripts with the test case management tool via its API for automated result marking.
    • Most of the application area is already automated, and test case automation continues in parallel with development.
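The API-based result marking mentioned above can be sketched in Java. This is a minimal, illustrative example assuming TestRail's public API v2 (`add_result_for_case`, where `status_id` 1 = Passed and 5 = Failed); the class, method names, and sample values are our own, not taken from the actual project:

```java
// Illustrative sketch: mapping an automated run's outcome to a TestRail
// result payload. Endpoint path and status codes follow TestRail's API v2;
// everything else here is hypothetical.
public class TestRailReporter {

    // TestRail status codes: 1 = Passed, 5 = Failed.
    static int statusId(boolean passed) {
        return passed ? 1 : 5;
    }

    // Relative API path for marking one case inside one test run.
    static String endpoint(int runId, int caseId) {
        return "index.php?/api/v2/add_result_for_case/" + runId + "/" + caseId;
    }

    // JSON body posted to TestRail; the comment carries automation context.
    static String payload(boolean passed, String comment) {
        return "{\"status_id\": " + statusId(passed)
                + ", \"comment\": \"" + comment + "\"}";
    }

    public static void main(String[] args) {
        System.out.println(endpoint(42, 1337));
        System.out.println(payload(true, "Regression run, build #58"));
    }
}
```

In practice, this payload would be POSTed with a client such as Rest Assured (already in the stack) after each scripted scenario finishes, so that the test management tool reflects automation results without manual marking.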

Tools & Technologies We Used

  • Automation IDE: IntelliJ
  • Version Control: GitHub
  • Performance: JMeter, Flood IO
  • Serenity BDD, Rest Assured
  • TestRail
  • Jira
  • Postman
  • Spring Modules
  • Java, Java 8
  • PostgreSQL
  • Hibernate (used for DB connections)
  • JSP
  • RESTful services
  • Microservices-based architecture
  • AMQP (RabbitMQ)
  • Caching (Redis)
  • OS Workflow
  • Decision Engine
  • CSS framework: Twitter Bootstrap
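The performance sign-off gate described under Solutions Provided (no deployment without meeting the release thresholds) can be illustrated with a small Java sketch; the endpoint names and threshold values here are hypothetical, not the client's real figures:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch of a "no deployment without performance sign-off" gate:
// measured latency percentiles are compared against agreed per-endpoint
// ceilings. Endpoints and numbers are illustrative only.
public class PerfSignOff {

    // Agreed p95 latency ceilings in milliseconds (illustrative values).
    static Map<String, Long> thresholds() {
        Map<String, Long> t = new LinkedHashMap<>();
        t.put("/trade-in/quote", 800L);
        t.put("/device/search", 500L);
        return t;
    }

    // A release passes sign-off only if every endpoint meets its ceiling;
    // missing data for any endpoint also blocks the deployment.
    static boolean signOff(Map<String, Long> measuredP95) {
        for (Map.Entry<String, Long> e : thresholds().entrySet()) {
            Long measured = measuredP95.get(e.getKey());
            if (measured == null || measured > e.getValue()) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        Map<String, Long> run = new LinkedHashMap<>();
        run.put("/trade-in/quote", 640L);
        run.put("/device/search", 470L);
        System.out.println(signOff(run)); // true: all ceilings met
    }
}
```

In a real pipeline, the measured percentiles would come from a JMeter or Flood IO results file, and the pass/fail outcome would feed the threshold stats published in each release document.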

Client Benefits

  • No show-stoppers in production after our takeover.
  • Because we believe in automation, execution time was reduced and QA quality increased, which ultimately benefits the client in terms of both quality and money.
  • Team BugRaptors worked as a product partner with the client to ensure the quality of the product and raise the client's quality expectations.
  • Testing performed on various platforms assured the client of the application's robust performance.
  • With the automated training pattern, manual effort was saved, and anyone could recall the domain knowledge at any time.
  • Crystal-clear visibility was achieved for the client into the planning and execution of every aspect of the application.
