DeviQA case study: Betting

Betting application

Identifying performance bottlenecks in a betting application through comprehensive UI performance testing.

>1k

Test cases developed

>5k

Threads used for performance testing

>200

Machines used

>50

Critical/major issues identified

About project

As we undertook the project, our primary objective was to ensure the high performance of a web-based betting application. With a multitude of functions and features, the application serves as a diverse and immersive platform for online betting activities. From intuitive interfaces to intricate backend capabilities, it provides a seamless and enjoyable experience for users exploring the world of online betting.

Before DeviQA

  • There were numerous challenges with the real-time update of bets

  • The app suffered from multiple performance issues

  • The user experience was poor

  • The maximum number of concurrent online users was not defined

With DeviQA

  • A test suite was created that could generate test data based on the number of browsers in the Kubernetes cluster

  • >350 scripts were developed

  • >30 UI/UX issues were reported

  • An exact maximum number of concurrent online users was identified

  • >5,000 threads were used for performance testing

  • >20 critical performance issues were identified

  • A testing farm of 200+ machines was developed to execute tests in parallel

  • Major bottlenecks were found on the frontend, backend, and GraphQL server

  • A solution was developed to avoid a login bottleneck when verifying that more than 10,000 concurrent users could work with the application (see the sketch after this list)

  • A scalable automated test suite was developed to meet the requirements for the number of concurrent online users
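
The case study does not spell out how the login bottleneck was avoided. One common pattern for this kind of ramp-up, shown below as a minimal Java/Selenide sketch, is to issue session tokens ahead of time and inject them as cookies so that thousands of browsers do not hit the login form at the same moment. The cookie name, token source, and URLs are hypothetical, not the project's actual mechanism.

```java
import com.codeborne.selenide.WebDriverRunner;
import org.openqa.selenium.Cookie;

import static com.codeborne.selenide.Selenide.open;

public class SessionBootstrap {

    // Inject a pre-issued session token as a cookie instead of driving the login form,
    // so thousands of browsers starting at once do not queue up on the login endpoint.
    // The cookie name ("SESSION") and the way tokens are obtained are hypothetical.
    public static void openAsLoggedInUser(String baseUrl, String sessionToken) {
        open(baseUrl);                                   // the domain must be open before a cookie can be set
        WebDriverRunner.getWebDriver().manage()
                .addCookie(new Cookie("SESSION", sessionToken));
        open(baseUrl + "/live");                         // reload as an authenticated user
    }
}
```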

Our contribution

Team

2 Senior Automation QA Engineers

Project length

Since 2018

Technologies and tools

Kubernetes

Zalenium

Selenide

Java

JUnit

Allure

AWS

EKS

Docker

TestNG

Gatling

Grafana

Sauce Labs

Jenkins

TestRail

Our engagement

Our automation QA engineers joined the project to help an in-house team execute performance testing.

Traditional performance tools like JMeter and NeoLoad were deemed insufficient in this case because we needed to test the performance of the UI rather than the backend. As a result, our choice fell on Selenium. We also used Zalenium as a Selenium grid in Kubernetes, which enabled the orchestration and rapid scaling of Selenium nodes.
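
As a minimal sketch of this setup, Selenide can be pointed at a remote grid such as Zalenium by setting Configuration.remote. The hub address, browser settings, and application URL below are illustrative assumptions rather than the project's actual values.

```java
import com.codeborne.selenide.Configuration;

import static com.codeborne.selenide.Selenide.open;

public class GridSetup {

    // Route all Selenide sessions through the Zalenium hub running inside the Kubernetes cluster.
    public static void configure() {
        Configuration.remote = "http://zalenium.default.svc.cluster.local:4444/wd/hub"; // assumed service address
        Configuration.browser = "chrome";
        Configuration.browserSize = "1920x1080";
        Configuration.timeout = 10_000; // ms; generous waits help when the grid is under heavy load
    }

    public static void main(String[] args) {
        configure();
        open("https://betting-app.example/"); // placeholder URL
    }
}
```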

After studying the project requirements and needs, we focused on emulating 10,000 ‘real users’. To accommodate this substantial load, we leveraged AWS and implemented a Kubernetes cluster with 200 c5d.18xlarge nodes, each offering 72 vCPUs and 144 GB of RAM. This robust infrastructure enabled parallel test execution on a grand scale.

We developed a test suite that dynamically generated test data based on the number of browsers in the Kubernetes cluster, providing a flexible and scalable testing framework. A dedicated test farm orchestrated the parallel test execution across 200 machines.
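
The generator itself is not shown in the case study; as a hedged illustration, a TestNG data provider can scale the number of virtual users to the number of browser pods available in the grid. The environment variable and account naming below are hypothetical.

```java
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

public class BetPlacementLoadTest {

    // Number of browsers available in the Kubernetes grid; read from the environment so the
    // same suite scales with the cluster. The variable name is hypothetical.
    private static final int BROWSER_COUNT =
            Integer.parseInt(System.getenv().getOrDefault("GRID_BROWSER_COUNT", "10"));

    // One synthetic user record per available browser, executed in parallel.
    @DataProvider(name = "virtualUsers", parallel = true)
    public Object[][] virtualUsers() {
        Object[][] data = new Object[BROWSER_COUNT][1];
        for (int i = 0; i < BROWSER_COUNT; i++) {
            data[i][0] = "load-user-" + i; // illustrative test account name
        }
        return data;
    }

    @Test(dataProvider = "virtualUsers")
    public void placesBetUnderLoad(String username) {
        // The actual UI steps (open the app, log in, place a bet) are omitted here.
    }
}
```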

During testing, we encountered challenges related to the real-time update of bets. To overcome these hurdles, we introduced intricate conditional logic. We also incorporated additional timeouts to avoid triggering the DDoS protection, since running tests in the production environment was one of the client’s requirements.
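
The exact waiting logic is not published; the sketch below shows the general shape of such conditions in Selenide, waiting for a pushed odds update instead of asserting immediately, and pausing for a randomized interval between actions so the scripted users do not trip the DDoS protection. Selectors and timings are illustrative.

```java
import com.codeborne.selenide.SelenideElement;

import java.time.Duration;
import java.util.concurrent.ThreadLocalRandom;

import static com.codeborne.selenide.Condition.text;
import static com.codeborne.selenide.Selenide.$;
import static com.codeborne.selenide.Selenide.sleep;

public class LiveOddsSteps {

    // Placeholder selector; the real application's markup is not part of the case study.
    private final SelenideElement oddsCell = $("[data-test-id='live-odds']");

    // Bets refresh in real time, so the expected value may arrive with a delay under load.
    public void waitForOddsUpdate(String expectedOdds) {
        oddsCell.shouldHave(text(expectedOdds), Duration.ofSeconds(30));
    }

    // Randomized pause so thousands of scripted users do not fire requests in lockstep.
    public void humanLikePause() {
        sleep(ThreadLocalRandom.current().nextLong(1_000, 4_000));
    }
}
```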

In the course of performance testing, we identified critical bottlenecks across the frontend, backend, and GraphQL server. Moreover, although the client expected the platform to support a load of 5,000+ users, our initial test run revealed that it could sustain a maximum of only 717 concurrent users without downtime.
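
Gatling appears in the tool list; although the UI load itself was generated by real browsers, an open ramp-up profile like the sketch below is a typical way to pinpoint such a concurrency ceiling on the backend and GraphQL side. The endpoints, user counts, and assertion here are illustrative assumptions, not the project's actual simulation.

```java
import java.time.Duration;

import io.gatling.javaapi.core.ScenarioBuilder;
import io.gatling.javaapi.core.Simulation;
import io.gatling.javaapi.http.HttpProtocolBuilder;

import static io.gatling.javaapi.core.CoreDsl.global;
import static io.gatling.javaapi.core.CoreDsl.rampUsers;
import static io.gatling.javaapi.core.CoreDsl.scenario;
import static io.gatling.javaapi.http.HttpDsl.http;

public class ConcurrencyCeilingSimulation extends Simulation {

    // Placeholder base URL, not the client's real domain.
    HttpProtocolBuilder protocol = http.baseUrl("https://betting-app.example");

    // A deliberately simple journey: open the live page, then fetch the odds feed.
    ScenarioBuilder journey = scenario("live betting journey")
            .exec(http("open live page").get("/live"))
            .pause(2)
            .exec(http("fetch odds").get("/api/odds"));

    {
        // Ramp users gradually so the monitoring dashboards show exactly where
        // errors or latency start to climb, i.e. the concurrency ceiling.
        setUp(journey.injectOpen(rampUsers(5_000).during(Duration.ofMinutes(15))))
                .protocols(protocol)
                .assertions(global().successfulRequests().percent().gt(99.0));
    }
}
```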

Utilizing advanced tools and technologies within a Kubernetes infrastructure, we provided valuable insights into the performance and resilience of the high-stakes betting application, enabling the client to implement corresponding improvements and ensure smooth business growth.

Services provided

Performance testing

While testing this betting application for performance, we focused on simulating real-world scenarios with 10,000 ‘real users’. Leveraging the combination of Selenide, Zalenium, and Kubernetes, we orchestrated a robust testing environment to scrutinize critical user actions. The results of our performance testing revealed challenges related to real-time bet updates and system scalability, prompting targeted fixes. By harnessing Kubernetes, AWS, and advanced testing frameworks, we thoroughly examined the application’s resilience and scalability, ensuring it could handle the anticipated user load and deliver an optimal user experience under diverse conditions.

Dedicated QA team

Our two Senior Automation QA Engineers quickly onboarded onto the project to take charge of performance testing. Throughout the collaboration, they worked closely with the in-house development and QA teams, sharing knowledge and providing quick feedback. Insights from performance testing empowered the development team to swiftly and efficiently address performance bottlenecks, while the QA team gained new knowledge and skills in the realm of performance testing.

