
Helping An Online Gaming Company Migrate From Postgres To ClickHouse Cloud

Building a real-time data warehouse for a sports betting and gaming platform processing ~300 million events per day, scaling to trillions of rows.

The Challenge

Our client is a provider of software technology to the sports betting and gaming industries. They supply a white-labelled platform to a number of global partners, who in turn offer entertainment products to end consumers.

Historically, the platform used Postgres for analytics. However, as is often the case, the application began to slow down as data volumes grew and the analytics run over that data became more sophisticated.

Our client has a big data workload of ~300 million events per day, which must be analysed in real time. They decided to use ClickHouse for its high performance when ingesting and analysing large data sets, and chose ClickHouse Cloud for fast time to market and low operational overhead.

Customer Pain Points

The customer was experiencing the following challenges prior to the engagement:

  • Postgres was slowing down under the load of growing data volumes and increasingly complex queries.
  • Their application's user experience was degrading as data volumes grew.
  • They wanted to provide their clients and partners with real-time analytics over time, as opposed to end-of-day analytics.
  • They wanted to keep analytical load off their transactional database to protect its performance.
  • They were lacking skills and experience with ClickHouse and ClickHouse Cloud.

Our Technical Approach

We took the following approach to this project:

  • Worked with the client to design and build a solution based on ClickHouse Cloud.
  • Used ClickPipes for Postgres CDC to replicate large data volumes from their Postgres database into ClickHouse.
  • Once the data lands in ClickHouse, it is fed through materialized views that pre-calculate the results of common queries. This was challenging at the scale of many billions of rows captured as players interact with the games.
  • The system includes financial transaction metrics which must be accurate and retained for regulatory and compliance purposes.
  • Introduced a single-tenant model in which data from a multi-tenant source dataset is sharded into separate, partner-specific databases.
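The materialized-view pattern above can be sketched in ClickHouse SQL. This is a minimal, illustrative example only: the table, column, and view names are assumptions for the sake of the sketch, not the client's actual schema.

```sql
-- Illustrative only: table and column names are assumptions, not the client's schema.
CREATE TABLE events
(
    event_time DateTime,
    partner_id LowCardinality(String),
    game_id    UInt32,
    stake      Decimal(18, 2)
)
ENGINE = MergeTree
ORDER BY (partner_id, event_time);

-- Pre-calculate per-partner daily totals at insert time, so dashboards
-- query a small aggregate table instead of scanning billions of raw events.
CREATE MATERIALIZED VIEW daily_activity_mv
ENGINE = SummingMergeTree
ORDER BY (partner_id, day)
AS SELECT
    partner_id,
    toDate(event_time) AS day,
    count()            AS bets,
    sum(stake)         AS total_staked
FROM events
GROUP BY partner_id, day;
```

With a SummingMergeTree target, the partial aggregates written by each insert are summed together during background merges, which is what makes this pattern cheap to maintain at hundreds of millions of events per day.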

Outcomes

Key outcomes of the project included:

  • Implemented a real-time data warehouse based on ClickHouse Cloud and ClickPipes.
  • Delivered a solution that ingests ~300 million events per day and is designed to scale to trillions of rows.
  • Implemented a solution that gives partners secure access to their own data from a multi-tenant Postgres source.
  • Delivered an end-to-end solution, including dashboards hosted in Apache Superset.
  • Upskilled customer engineers on ClickHouse operations, schema design and usage.
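One way to realise the partner-specific sharding described above is to route rows from the shared, multi-tenant source into per-partner databases via materialized views, then grant each partner access only to its own database. Again, this is a hedged sketch: the database, table, user, and filter value names are illustrative assumptions.

```sql
-- Illustrative only: database, table, and user names are assumptions.
CREATE DATABASE partner_a;

-- Clone the structure of the shared multi-tenant table.
CREATE TABLE partner_a.events AS shared.events;

-- Continuously route partner A's rows from the shared stream into its own database.
CREATE MATERIALIZED VIEW shared.route_partner_a_mv
TO partner_a.events
AS SELECT *
FROM shared.events
WHERE partner_id = 'partner_a';

-- Partner A's user (assumed to exist) only sees its own database.
GRANT SELECT ON partner_a.* TO partner_a_user;
```

Because the grant is scoped to the partner database rather than the shared source, each partner can query freely without any risk of seeing another tenant's rows.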