
Case Study · 2021

GoLive Asia

Sustaining 30,000 Requests Per Second for Live Event Ticketing

A mobile-first event discovery and ticket booking platform built for extreme traffic conditions — featuring multi-layer CDN caching, QueueIT virtual waiting room, distributed cache with stampede protection, and auto-scaling cloud infrastructure across GCP and AWS.

Mobile App · Event Ticketing · High Traffic · Cloud Infrastructure

30K+

Requests Per Second

1,000+

Events Listed

2

Mobile Platforms

90%

CDN Cache Hit Rate

GoLive Ticketing Platform

The Challenge

High-stakes ticketing under extreme traffic conditions

GoLive Asia needed to transition from third-party ticketing platforms to their own purpose-built solution. The existing approach meant per-ticket variable costs with no customer data retention, making it impossible to build a loyal user base or control the booking experience. However, building an in-house platform introduced a far greater technical challenge: handling the extreme traffic spikes that occur when popular event tickets go on sale — where tens of thousands of users attempt to purchase simultaneously within a 2-minute window, often overwhelming even well-provisioned infrastructure.

1

No Own Ticketing Platform

Reliance on third-party ticketing platforms meant per-ticket variable costs, no customer data ownership, and no control over the booking experience or brand presence.

2

Flash Traffic Spikes

Popular event tickets go on sale at a fixed time — generating tens of thousands of concurrent requests as users attempt to purchase simultaneously within a 2-minute window.

3

Limited Infrastructure Budget

The platform needed to sustain massive traffic spikes without over-provisioning — maximising existing cloud resources while keeping infrastructure costs viable for a growing business.

The Solution

Multi-layer infrastructure built for 30K requests per second

Advisory Apps engineered a multi-layer defence architecture spanning two cloud providers: AWS CloudFront for edge caching, WAF protection, and DDoS mitigation at the perimeter, with GCP App Engine providing auto-scaling application instances backed by Redis cache and MySQL with dynamic read replicas. During high-demand events, QueueIT virtual waiting room controls user flow at the entry point — releasing users in managed batches to prevent backend overload. A custom distributed lock mechanism at the Redis layer eliminates cache stampede, ensuring only one request queries the database while thousands of concurrent requests wait for the cached result.

Multi-Layer CDN & WAF Shield

AWS CloudFront CDN with WAF rules providing DDoS protection, bot blocking, OWASP filtering, and aggressive edge caching — serving over 90% of traffic without reaching the origin server.
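A key enabler of that 90% edge hit rate is choosing the right cache directives per endpoint class. The sketch below illustrates the idea in Python; the paths and TTL values are illustrative assumptions, not GoLive's actual configuration.

```python
# Illustrative sketch: Cache-Control headers that let a CDN such as CloudFront
# serve most reads from edge cache. Paths and TTLs are assumptions.

def cache_headers(path: str) -> dict:
    """Return response headers for a given request path."""
    if path.startswith("/api/events"):
        # Event listings change rarely: cache at the edge for 60s and allow
        # briefly-stale content to be served while the edge revalidates.
        return {"Cache-Control": "public, s-maxage=60, stale-while-revalidate=30"}
    if path.startswith("/static/"):
        # Fingerprinted assets never change: cache for a year.
        return {"Cache-Control": "public, max-age=31536000, immutable"}
    # Transactional endpoints (booking, payment) must never be cached.
    return {"Cache-Control": "no-store"}
```

With headers along these lines, repeated reads for the same event page are absorbed at the edge and never reach the origin.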

Virtual Queue & Controlled Access

QueueIT virtual waiting room deployed during high-demand events to regulate user flow into the transactional system — preventing server overload by controlling concurrency at the entry point.

Distributed Cache with Stampede Protection

Redis cache layer with distributed lock mechanism preventing cache stampede — only one request queries the database while concurrent requests wait for the cached result.

Auto-Scaling Event Infrastructure

GCP App Engine scaling from 1 to 600 instances with MySQL read replicas expanding from 2 to 80 — pre-scaled using historical traffic data before each major event sale.
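Pre-scaling from historical data is essentially back-of-envelope capacity arithmetic. The sketch below shows the shape of that calculation; the per-instance and per-replica throughput figures are invented for illustration — only the 30K req/s peak and ~90% CDN hit rate come from this case study.

```python
# Back-of-envelope pre-scaling sketch. rps_per_instance and reads_per_replica
# are hypothetical throughput figures, not measured GoLive values.
import math

def prescale_plan(peak_rps: int, cdn_hit_rate: float,
                  rps_per_instance: int, reads_per_replica: int,
                  read_fraction: float = 0.9) -> dict:
    """Estimate instances/replicas to provision before a sale opens."""
    origin_rps = peak_rps * (1 - cdn_hit_rate)   # traffic that misses the CDN
    instances = math.ceil(origin_rps / rps_per_instance)
    read_rps = origin_rps * read_fraction        # reads not absorbed by Redis
    replicas = math.ceil(read_rps / reads_per_replica)
    return {"origin_rps": origin_rps, "instances": instances, "replicas": replicas}

plan = prescale_plan(peak_rps=30_000, cdn_hit_rate=0.90,
                     rps_per_instance=50, reads_per_replica=500)
```

The point is the method, not the numbers: estimate origin load after CDN offload, then size App Engine instances and read replicas ahead of the sale rather than waiting for reactive auto-scaling to catch up.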

System Architecture

The platform spans two cloud providers — AWS for edge security and caching, GCP for application logic and data. During high-demand events, QueueIT regulates user flow before requests reach the origin infrastructure.

Web App (Mobile-Responsive) · Mobile App (iOS & Android)

↓ HTTPS

AWS CloudFront + WAF — DDoS Protection • Bot Blocking • OWASP Rules • Rate Limiting
90%+ of traffic served here via edge caching · ~10% passes to origin

↓ Controlled release (QueueIT active during high-demand events)

QueueIT Virtual Waiting Room — Controlled user release • Concurrency management • Fair queuing

↓

GCP App Engine (Auto-Scaling) — Main API (1–600) • Shop API (1–5) • Admin (1–2) • Web App (1–40)

↓ Read / Write

Redis Cache (1–11GB • Distributed Locks) · Cloud SQL MySQL (Master + 2–80 Read Replicas)

Supporting services: 2C2P Payment (FPX • Cards • Banking) · Firebase (Notifications • Analytics) · Cloud Storage (Media • Assets)

Traffic Pattern During Ticket Sales

When tickets go on sale at a fixed time, traffic spikes dramatically within a 2-minute window. Users begin arriving minutes before the sale opens, creating a sharp surge that peaks at the exact sale time and tapers off shortly after.

[Chart: requests per second from 9:55 to 10:05 — a sharp surge peaking at 30K req/s at the 10:00 sale time]

Typical traffic distribution during a high-demand ticket sale event

Cache Stampede Protection

When cache expires during peak traffic, hundreds of simultaneous requests can flood the database — causing timeouts and cascading failures. The distributed lock mechanism ensures only one request rebuilds the cache while all others wait.

Before — Cache Stampede

1,000+ concurrent requests hit an expired cache entry and all query the database simultaneously — DB timeouts, 2.4s queries, and metadata locks.

After — Distributed Lock

1,000+ concurrent requests reach the Redis lock gateway: one request acquires the lock and issues a single database query while the others wait. The result is cached and served to everyone — sub-100ms responses and zero database overload.
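The lock pattern above can be sketched in a few lines of Python. This is a minimal stand-in: an in-memory dict plays the role of Redis, and the lock-acquire step mimics Redis `SET key token NX`. Names, timings, and the simulated query are illustrative, not GoLive's production code.

```python
# Sketch of cache-stampede protection with a distributed lock. A real
# deployment would use Redis SET NX with a TTL; here an in-memory dict
# guarded by a mutex stands in for Redis.
import threading
import time
import uuid

cache = {}            # stand-in for the Redis cache
lock_owner = {}       # key -> token; stand-in for Redis SET key token NX
_guard = threading.Lock()
db_queries = 0        # counts how often the "database" is actually hit

def slow_db_query(key):
    global db_queries
    with _guard:
        db_queries += 1
    time.sleep(0.05)  # simulate an expensive query
    return f"rows-for-{key}"

def try_acquire(key, token):
    with _guard:
        if key not in lock_owner:
            lock_owner[key] = token
            return True
        return False

def release(key, token):
    with _guard:
        if lock_owner.get(key) == token:
            del lock_owner[key]

def get_with_stampede_protection(key):
    token = uuid.uuid4().hex
    while True:
        if key in cache:
            return cache[key]             # served from cache
        if try_acquire(key, token):
            try:
                if key not in cache:      # re-check after winning the lock
                    cache[key] = slow_db_query(key)
                return cache[key]
            finally:
                release(key, token)
        time.sleep(0.005)                 # lock held elsewhere: wait briefly

# 1,000 concurrent readers — but the database is queried only once.
threads = [threading.Thread(target=get_with_stampede_protection,
                            args=("event:42",)) for _ in range(1000)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

The essential moves are the same at any scale: atomically claim the rebuild right, re-check the cache after winning the lock, and have everyone else poll the cache instead of the database.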

Request Lifecycle During Event Sales

How a ticket purchase request flows through the multi-layer infrastructure — from the user's device through CDN edge cache, virtual queue, WAF protection, application cache, and finally the database.

Step 1

User Opens App

Browses events or opens ticket sale page

Step 2

CDN Edge Cache

90%+ of read requests served from CloudFront edge

Step 3

QueueIT

Users queued and released in controlled batches

Step 4

Redis Cache

Check Redis first; if miss, acquire distributed lock

Step 5

Database Query

Only 1 locked request queries MySQL; result cached for all

Traffic Distribution

CDN edge cache: 90% · Origin: 10%

Over 90% of all traffic is served from CloudFront edge cache

Implementation Timeline

Phase 1

Platform Foundation

3 months

Phase 2

Payment & Ticketing

3 months

Phase 3

Traffic Optimisation

4 months

Phase 4

Scale & Hardening

3 months

Methodology

Performance-first engineering with iterative hardening

The project followed an iterative cycle of optimise, load test, analyse, and optimise again — working closely with the client's team and infrastructure partners to progressively harden the platform for increasingly demanding traffic scenarios. Each phase delivered a working increment while building toward the full-scale capacity required for major event launches.

01

Discovery & Architecture

Traffic pattern analysis, cloud provider evaluation, CDN vs origin cost modelling, and infrastructure capacity planning for peak event scenarios.

02

Core Platform Build

Event management CMS, ticketing engine, QR validation crew app, and payment gateway integration with 2C2P across multiple payment methods.

03

Performance Engineering

Multi-layer caching implementation, cache stampede fix with distributed Redis locks, QueueIT integration, WAF rule tuning, and feature flag system.

04

Load Testing & Launch

Simulated peak traffic spikes, auto-scaling verification across App Engine and database replicas, pre-scaling runbook creation, and dedicated monitoring setup.

Key Features Delivered

Event Discovery & Booking

Browse and search events by location, date, and category with real-time seat availability, instant booking confirmation, and digital ticket delivery.

QR Code Ticket Validation

Dedicated crew app for scanning QR tickets at venue entry points — instant validation against the booking database with offline fallback capability.

2C2P Payment Integration

Multi-method payment gateway supporting bank transfer (FPX), credit cards, and online banking with automated reconciliation and refund processing.

CloudFront CDN with WAF

AWS CloudFront serving over 90% of all traffic from edge cache with WAF rules blocking bots, DDoS attacks, and rate-limiting abusive IP ranges.

QueueIT Virtual Waiting Room

Virtual queue system activated during high-demand sales to control user concurrency — releasing users in managed batches to protect backend infrastructure.

Feature Flag Event Mode

Admin-controlled toggle to reduce non-essential operations during peak ticket sales — preserving origin capacity for critical transactional workflows.
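A load-shedding flag like this can be very small. The sketch below is a hypothetical illustration of the pattern — the flag store and feature names are invented, not taken from the GoLive codebase.

```python
# Minimal sketch of an admin-controlled "event mode" feature flag that sheds
# non-essential work during peak sales. Flag and feature names are hypothetical.

FLAGS = {"event_mode": True}   # stand-in for the admin-controlled flag store

# Non-essential features that can be switched off to preserve origin capacity.
SHEDDABLE = {"recommendations", "activity_feed", "image_resizing"}

def is_enabled(feature: str) -> bool:
    """Critical paths (booking, payment) stay on; sheddable ones pause."""
    if FLAGS.get("event_mode") and feature in SHEDDABLE:
        return False
    return True
```

In production the flag would live in a shared store (e.g. Redis or a config service) so a single admin toggle takes effect across every instance at once.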

The Results

Measurable performance under extreme load

30K+

Sustained Requests Per Second

The platform sustains 30,000 requests per second during peak ticket sales without degradation.

1.8M

Hits Per Minute Capacity

Infrastructure capacity tested and sustained at 1.8 million hits per minute during live event sales.

90%

Traffic Served from CDN

Aggressive CDN caching ensures only 10% of traffic reaches the origin server — reducing infrastructure costs and latency.

0

Downtime During Optimised Events

Zero downtime achieved during event sales after implementing multi-layer caching, QueueIT, and auto-scaling.

Conclusion

High-performance ticketing, delivered at scale

The GoLive Asia platform demonstrates what's achievable when infrastructure engineering is treated as a first-class concern from day one. By combining AWS CloudFront's edge caching and WAF protection with GCP's auto-scaling App Engine, Redis distributed locking, and QueueIT virtual queuing, we built a ticketing platform that sustains 30,000 requests per second during peak sales — with zero downtime and over 90% of traffic served from CDN edge cache.

The iterative approach of optimise, load test, and harden proved essential — each major event provided real-world data that informed the next round of infrastructure improvements. Working collaboratively with the client's team and infrastructure partners, the platform evolved from handling modest traffic to sustaining 1.8 million hits per minute under extreme flash-sale conditions.

Future Outlook

  • Dynamic pricing and demand-based ticket allocation for event organisers
  • Expanded analytics dashboard providing real-time sales insights and audience demographics
  • Multi-region CDN expansion for broader Southeast Asian market coverage

Want similar results for your business?

Let's discuss how we can build a custom solution tailored to your needs.

Get a Free Consultation
