Rahul Upadhyay

Open to opportunities

Senior Backend Engineer | Distributed Systems & Event-Driven Architecture

Berlin, Germany

15+ years building backend systems that scale. Currently at Gymondo (7NXT Group), architecting event-driven microservices on AWS — from real-time ETL pipelines processing ~80k events/hour to serverless API gateways serving Web, iOS, and Android.

About Me

What I build

Event-driven microservices and serverless backends in Go on AWS. I specialize in high-throughput data pipelines, API architecture, and platform migrations at scale — currently at Gymondo (7NXT Group), Berlin.

At what scale

~80k events/hour sustained pipeline throughput. 2.8M+ customer records migrated without downtime. 100K+ users onboarded from an acquired platform. 12 client API calls unified into a single gateway for Web, iOS, and Android.

Currently exploring

AI agents with GenAI and the Go ADK, autonomous agents, content recommendation using RAG and LLMs, distributed tracing with OpenTelemetry, and multi-region consistency patterns in event-driven systems.

Tech Skills

Golang

AWS

Lambda

DynamoDB

Redis

Elasticsearch

PostgreSQL

MySQL

MongoDB

Docker

K8s

Terraform

New Relic

GitHub Actions

AWS Personalize

TypeScript

Projects

Short summaries of work showing problem and impact.

Video Infrastructure Migration: Mainstreaming to Brightcove

I architected and built the end-to-end migration pipeline for international video accounts, transitioning from Mainstreaming to Brightcove. Designed automated pipelines to transfer legacy video entities and synchronize metadata across the CDN boundary. Upgraded core backend services and data pipelines to natively support Brightcove as the primary CDN provider — zero service downtime during transition.

Go · AWS · SNS · SQS · Data Migration · CDN · Brightcove · Backend Engineering

Wearable Data Integration & Gamification Engine

I designed and built the platform's first wearable health data integration, ingesting real-time step counts and metrics from Spike & Thryve into the AWS-based event pipeline. Independently developed the Steps-as-a-Goal feature from API contract to production — physical activity milestones directly trigger rewards in the contest engine. Built to handle burst ingest from irregular wearable sync schedules without backpressure on downstream services.

Go · AWS Lambda · DynamoDB · SNS · SQS · Event-Driven · Data Pipelines · Gamification

Advanced CDN Transition & Metadata Enrichment

I led the global video migration from Akamai to Brightcove, extending the platform's event-driven architecture to propagate Brightcove CDN state changes (publish, update, delete) in real time across the service mesh via AWS SNS & SQS. Designed a scalable annotation capability to attach structured metadata to video resources, enabling interactive overlay features in client-side video players.

Go · AWS · SNS · SQS · Data Migration · CDN · Brightcove · Event-Driven · Architecture

Real-time ETL Pipeline & CRM Migration

I built a high-throughput ETL pipeline in Go processing ~80k events/hour, migrating 2.8M+ customer records from a legacy CRM stack (Emarsys) to Braze. Replaced fragile nightly cron jobs with a real-time event-driven pipeline, cutting data latency from 24 hours to real time. Led the full migration including users, subscriptions, and historical data — zero downtime.

Go · ETL · Event-Driven · Braze · AWS Lambda · DynamoDB · MySQL · MongoDB

Event-driven Data Ingestion & Search System

I designed the event-sourced architecture to capture high-throughput data changes with strong consistency. Built scalable REST APIs backed by Elasticsearch for low-latency full-text search, directly improving user experience for content discovery.

Event Sourcing · Elasticsearch · Kotlin · Java · REST API · Microservices · Architecture

Content Recommendation Service

I built the platform's personalized recommendation engine using AWS Personalize, including custom filter APIs and a Redis caching layer. Optimized response times through data restructuring and caching strategies, improving content discovery and user engagement metrics.

Go · AWS Personalize · Redis · REST API · Microservices · Caching · Serverless

Platform Data Migration & Multi-Locale Support

I led the migration of 100K+ users, content items, subscriptions, and historical records from an acquired platform to the current system — zero data loss, minimal downtime. Implemented multi-locale backend support to ensure a seamless experience for global users across markets.

Go · Data Migration · Multi-locale · MySQL · Data Engineering · Architecture

AI-powered Healthy Meal Scanner

I designed and built the platform's first AI-powered food analysis feature — a fully serverless Go pipeline on AWS Lambda, DynamoDB, and S3 that identifies food from images and returns personalized nutritional guidance. Applied prompt engineering and context design to improve prediction accuracy for diverse meal types.

Go · AWS Lambda · DynamoDB · S3 · AI · Serverless · Prompt Engineering

Unified API Gateway

I consolidated 12 downstream service calls into a single client endpoint using AWS Lambda and Go goroutines for concurrent request aggregation. Eliminated multi-version API maintenance, reduced client-side complexity, and improved performance for Web, iOS, and Android by standardizing DTO contracts across services.

Go · AWS Lambda · Serverless · API Gateway · Goroutines · System Design

DTO Consistency Framework

I established standardized DTO contracts across 5+ services, enforcing modular, row-based API responses. Decoupled client DTOs from backend domain models, eliminating the need for multi-version API maintenance (v1/v2/v3) and reducing client integration time across teams.

Go · API Contracts · Architecture · Microservices · System Design

Case studies

Descriptions of work showing problem, approach, and impact.

    The legacy system that connected customer data to the CRM (Emarsys) relied on multiple brittle handlers and nightly cron jobs. This caused scalability issues, delayed data availability, and hacky workarounds for real-time use cases.

    I conducted a deep audit of CRM use cases, designed a strategy to migrate operations to Braze (a modern engagement platform), and architected a real-time event-driven ETL pipeline in Go. This system supported sustained throughput of ~80k events/hour, migrated over 2.8M customer records, and replaced the entire legacy stack. The migration not only improved reliability and data freshness but also freed the CRM team from operational overhead.

    Key lessons: thoughtful event modeling, resilient consumer design, and investing in operational tooling are as critical as writing robust code.

    Impact:

    Reduced data latency from 24 hours to real time by replacing fragile nightly crons with a high-throughput event-driven pipeline. Sunset multiple legacy CRM handlers, streamlined CRM workflows, and enabled faster, more personalized customer engagement—directly improving win-back and repurchase metrics.

    Real-time ETL Pipeline & Legacy CRM Migration

    ETL · Data Engineering · Golang · Event-Driven · CRM · Braze · Migration · Scaling · Architecture · System Design · MySQL · MongoDB · PostgreSQL · DynamoDB
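The fan-out shape behind a pipeline like this can be sketched in Go as a fixed worker pool reading from a channel. Everything below (the Event shape, the transform step, the worker count) is illustrative, not the production Emarsys/Braze schema:

```go
package main

import (
	"fmt"
	"sync"
)

// Event is a simplified stand-in for a CRM change event; the real
// pipeline's payload fields are not shown in this sketch.
type Event struct {
	UserID string
	Kind   string
}

// process fans events out to a fixed worker pool and collects the
// transformed results, the basic shape of a high-throughput consumer.
func process(events []Event, workers int) []string {
	in := make(chan Event)
	out := make(chan string)

	var wg sync.WaitGroup
	for i := 0; i < workers; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for ev := range in {
				// In production this step would map the legacy
				// payload onto the target CRM's API schema.
				out <- ev.UserID + ":" + ev.Kind
			}
		}()
	}

	// Feed the pool, then close the input channel.
	go func() {
		for _, ev := range events {
			in <- ev
		}
		close(in)
	}()

	// Close the output channel once every worker has drained.
	go func() {
		wg.Wait()
		close(out)
	}()

	var results []string
	for r := range out {
		results = append(results, r)
	}
	return results
}

func main() {
	events := []Event{{"u1", "purchase"}, {"u2", "signup"}, {"u3", "churn"}}
	fmt.Println(len(process(events, 2))) // prints 3
}
```

A production version would read from SQS rather than a slice and add retry and dead-letter handling around the transform step.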

    Previously, client applications (Web, Android, iOS) made direct calls to multiple backend services, each with inconsistent DTO structures. This caused bloated client logic, fragile features, difficult debugging, and slow development velocity. Backend services were equally burdened—changes often required maintaining multiple API versions (v1, v2, v3), creating a maintenance nightmare.

    I partnered with product and client guild engineers to define a unified DTO contract, ensuring modular, row-based responses across services. I introduced a gateway layer using AWS Lambda to handle all client data requests, aggregating responses from multiple backend services into standardized DTOs. By leveraging goroutines for concurrent processing and aggregation, the system delivered lower latency and reduced client-side overhead. This approach decoupled backend services from client DTOs, allowing services to focus solely on domain responsibilities while the gateway handled orchestration and data shaping. The result: simplified client integration, reusable gateway packages for page/screen-specific needs, and a significant improvement in scalability, reliability, and developer productivity.

    Impact:

    Standardized client-server contracts and unified multiple client API calls into a single gateway layer. Reduced client complexity, eliminated inconsistent implementations across platforms, and improved backend maintainability—accelerating feature delivery across Web, iOS, and Android.

    Unified API Gateway & DTO Consistency Framework

    Golang · Architecture · Consistency · Contracts · Microservices · AWS Lambda · API Gateway · Serverless
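The concurrent aggregation step a gateway like this performs can be sketched as a goroutine fan-out with a mutex-guarded result map. Service names and payloads here are placeholders, not the platform's real endpoints:

```go
package main

import (
	"fmt"
	"sync"
)

// fetch stands in for a downstream service call; a real gateway would
// make an HTTP request with a timeout here.
func fetch(service string) string {
	return "data-from-" + service
}

// aggregate calls every downstream service concurrently and merges the
// responses into one map, the core pattern behind the gateway layer.
func aggregate(services []string) map[string]string {
	var (
		mu  sync.Mutex
		wg  sync.WaitGroup
		res = make(map[string]string, len(services))
	)
	for _, svc := range services {
		wg.Add(1)
		go func(svc string) {
			defer wg.Done()
			body := fetch(svc)
			mu.Lock()
			res[svc] = body // guard concurrent map writes
			mu.Unlock()
		}(svc)
	}
	wg.Wait()
	return res
}

func main() {
	out := aggregate([]string{"users", "subscriptions", "content"})
	fmt.Println(len(out)) // prints 3
}
```

In practice each fetch needs per-service error handling and cancellation; `errgroup` with a shared context is the usual production shape for this fan-out.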

    The platform had no mechanism to consume real-time health data from wearable devices, limiting gamification to in-app activity only. I designed and built a high-throughput data pipeline to ingest step-count and health metrics from the Spike and Thryve APIs, normalize the payloads, and propagate events through the existing service mesh via AWS SNS & SQS. I used Go and AWS Lambda for scalable, stateless processing and MongoDB for high-speed writes. I then designed the contest domain model for the first Steps-as-a-Goal feature — where physical activity milestones triggered reward events in the rewards engine.

    The key design challenge was handling variable ingest rates: wearables sync on irregular schedules and can batch-deliver hours of data at once. I built the pipeline to handle burst ingestion without backpressure on downstream services, using idempotent processing to safely handle duplicate events. The gamification logic was cleanly decoupled from the data ingestion layer so future health metrics (calories, heart rate, sleep) could be integrated without modifying business logic. Shipped with zero platform downtime and fully integrated into the existing CI/CD pipeline.

    Impact:

    Brought real-time health data into the platform for the first time, unlocking a new class of gamification features. The Steps-as-a-Goal contest feature directly tied physical activity to rewards, driving measurable increases in user engagement and daily active usage.

    Go · AWS Lambda · MongoDB · SNS · SQS · Event-Driven · Data Pipelines · Gamification · API Integration · Serverless
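The idempotent processing mentioned above can be sketched as an at-most-once handler keyed on the event ID. The in-memory seen-set stands in for what would be a conditional datastore write in production, and the StepEvent fields are assumptions, not the actual vendor schema:

```go
package main

import (
	"fmt"
	"sync"
)

// StepEvent mirrors the idea of a wearable sync payload; the field
// names are illustrative, not the real Spike/Thryve schema.
type StepEvent struct {
	ID    string // unique event ID used as the idempotency key
	Steps int
}

// Processor applies each event at most once. In production the seen-set
// would live in a datastore (e.g. a conditional write keyed on ID),
// not in process memory.
type Processor struct {
	mu    sync.Mutex
	seen  map[string]bool
	Total int
}

func NewProcessor() *Processor {
	return &Processor{seen: make(map[string]bool)}
}

// Handle ignores duplicates, so batch re-delivery from an irregular
// wearable sync cannot double-count steps.
func (p *Processor) Handle(ev StepEvent) bool {
	p.mu.Lock()
	defer p.mu.Unlock()
	if p.seen[ev.ID] {
		return false // duplicate: already applied
	}
	p.seen[ev.ID] = true
	p.Total += ev.Steps
	return true
}

func main() {
	p := NewProcessor()
	p.Handle(StepEvent{"evt-1", 5000})
	p.Handle(StepEvent{"evt-1", 5000}) // duplicate delivery is a no-op
	fmt.Println(p.Total)               // prints 5000
}
```

Keying on a stable event ID (rather than timestamp) is what makes SQS's at-least-once delivery safe for counters like step totals.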

    The platform's video delivery relied on two legacy CDN providers across different international accounts, causing fragmented tooling, divergent metadata schemas, and no path to advanced video features. I was responsible for both phases of the migration.

    In Phase 1, I architected an automated migration pipeline to transfer legacy video entities from Mainstreaming to Brightcove, synchronize metadata, and upgrade core backend services and data pipelines to treat Brightcove as the primary CDN. In Phase 2, I migrated the global Akamai-served video library using the same pipeline pattern, adapted for Akamai's entity model.

    I also extended the platform's event-driven architecture to react to Brightcove lifecycle events (publish, update, delete) in real time, propagating CDN state changes across the service mesh via AWS SNS & SQS. This replaced a polling-based approach with event-sourced accuracy. Alongside the migration, I designed a scalable annotation system to attach structured metadata to individual video resources — enabling client applications to render interactive overlays and chapter markers based on server-side metadata. All migrations ran in production with idempotent pipelines and reconciliation jobs to detect and resolve drift.

    Impact:

    Migrated the full international video library across two CDN transitions (Mainstreaming → Brightcove, Akamai → Brightcove) with zero service downtime. Unified video delivery infrastructure, eliminated legacy operational overhead, and unlocked richer interactive features for client-side video players via a new annotation capability.

    Go · AWS · Data Migration · CDN · Brightcove · Event-Driven · Architecture · Microservices
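A reconciliation pass of the kind described can be sketched as a set comparison between the source and target catalogs, reporting records that are missing or diverged on the target. VideoMeta is a pared-down placeholder for the real CDN entity:

```go
package main

import "fmt"

// VideoMeta is a minimal stand-in for a CDN video record; real
// Brightcove entities carry many more fields.
type VideoMeta struct {
	ID    string
	Title string
}

// reconcile compares the source and target catalogs and returns the IDs
// that are missing or diverged on the target, i.e. the drift a periodic
// reconciliation job would flag for re-sync.
func reconcile(source, target map[string]VideoMeta) (missing, diverged []string) {
	for id, src := range source {
		tgt, ok := target[id]
		if !ok {
			missing = append(missing, id)
			continue
		}
		if tgt != src {
			diverged = append(diverged, id)
		}
	}
	return missing, diverged
}

func main() {
	source := map[string]VideoMeta{
		"v1": {"v1", "Warm-up"},
		"v2": {"v2", "HIIT Session"},
		"v3": {"v3", "Cool-down"},
	}
	target := map[string]VideoMeta{
		"v1": {"v1", "Warm-up"},
		"v2": {"v2", "HIIT"}, // title drifted on the target
	}
	missing, diverged := reconcile(source, target)
	fmt.Println(len(missing), len(diverged)) // prints 1 1
}
```

For millions of entities a real job would page through both APIs and compare checksums rather than hold full records in memory.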

Certifications

Architecting Solutions on AWS

Issued By: Amazon Web Services

Coursera

Programming with Google Go Specialization

Issued By: University of California, Irvine

Coursera

Concurrency in Go

Issued By: University of California, Irvine

Coursera

Open Source Projects

Simple Ava HTML Reporter

Simple Ava HTML Reporter is a reporting module for AVA that parses its JSON output into a readable HTML report.

npm · ava · html · reporter · tdd

Protractor simple cucumber html reporter plugin

This plugin connects Protractor, CucumberJS, and protractor-cucumber-framework to generate unique JSON files per feature with only a few lines of code.

npm · protractor · html · plugin · cucumber

Simple Cucumber HTML Reporter

Simple Cucumber HTML Reporter is a reporting module for Cucumber that parses its JSON output into a readable HTML report.

npm · html · cucumber · reporter · bdd

Resume & Experience

Download printable PDF

  • Architected and built a real-time ETL pipeline in Go processing ~80k events/hour, migrating 2.8M+ customer records from a legacy CRM (Emarsys) to Braze — cutting data latency from 24 hours to real time.
  • Designed a serverless API gateway using AWS Lambda and Go goroutines, consolidating 12 downstream service calls into a single endpoint for Web, iOS, and Android clients.
  • Built the platform's first wearable health data integration (Spike & Thryve), designed the Steps-as-a-Goal gamification feature, and wired physical activity milestones to the rewards engine.
  • Led two CDN migrations (Mainstreaming → Brightcove, Akamai → Brightcove) with automated migration pipelines and event-driven metadata sync — zero service downtime across both transitions.
  • Designed and shipped the platform's first AI-powered meal scanner — a serverless Go service on AWS Lambda, DynamoDB, and S3, using prompt engineering to identify food and return personalized nutritional guidance.
  • Standardized DTO contracts and API response patterns across 5+ services, eliminating multi-version API maintenance overhead and reducing client integration time.
  • Built and maintained CI/CD pipelines with GitHub Actions; established structured logging and alerting to reduce mean time to detect production incidents.
Go · AWS Lambda · DynamoDB · MongoDB · PostgreSQL · MySQL · SNS · SQS · Braze · GitHub Actions · Redis · Elasticsearch
  • Led a team of 11 engineers building mobile platform infrastructure and backend API performance benchmarking for Deutsche Telekom's OneApp.
  • Designed backend API performance benchmarking frameworks, establishing performance baselines and identifying scalability bottlenecks across backend microservices.
  • Built iOS, Android, and backend API automation frameworks integrated into CI/CD pipelines, enabling continuous quality validation across the full platform stack.
  • Coordinated engineering delivery across Android, iOS, and backend microservices teams within an enterprise Agile SDLC, ensuring predictable release milestones.
  • Integrated regression and performance test suites into CI/CD pipelines, enabling continuous validation and faster developer feedback loops.
Python · JavaScript · REST APIs · iOS · Android · CI/CD · Performance Engineering
  • Defined and executed automation strategy for 30+ microservices, 7 Angular portals, and a native Android app, improving release velocity and reliability.
  • Developed scalable client (web + Android) and backend API automation suites, reducing manual effort and ensuring faster feedback cycles.
  • Designed and implemented performance benchmarking frameworks for backend APIs, driving scalability and reducing downtime.
  • Streamlined delivery by integrating automation and performance tests into CI/CD pipelines, accelerating deployment and feedback loops.
  • Led and mentored a team of 15 engineers, fostering innovation, collaboration, and continuous skill growth.
JavaScript · Android · REST APIs · CI/CD · Selenium · Appium
  • Re-architected 2,200+ UI test cases into API-level automation, cutting execution time by 15% and eliminating a class of flaky, environment-dependent failures.
  • Owned the engineering infrastructure for Data Integration and Platform products; integrated automation directly into CI/CD pipelines to deliver same-PR feedback to development teams.
JavaScript · REST APIs · CI/CD
  • Designed and built automation frameworks for 5 enterprise-grade integration connectors, owning end-to-end release engineering across 30+ production releases.
  • Embedded connector quality gates directly into CI/CD pipelines, enabling continuous validation and cutting post-release defect rates.
  • Defined and owned the engineering roadmap for connector reliability, establishing structured delivery processes and predictable release timelines.
JavaScript · REST APIs · CI/CD
  • Designed automation frameworks for Web & Mobile platforms and implemented performance benchmarking scripts.
  • Integrated automated regression tests into CI/CD pipelines, accelerating delivery and enabling continuous quality checks.
Selenium · Appium · CI/CD
  • Infosys (2009–2014): Authored and executed test plans for mainframe applications; validated backend databases (IBM DB2, MS SQL) for functional accuracy and data integrity.
  • Bhavna Corp (2014): Developed an automation suite for SRS EHR; built a VBA reporting tool to replace manual biometric data processing.
IBM DB2 · MS SQL · HP Quality Center · VBA · SQL

Get in touch

I'm available for full-time roles. Email: rahul841986@gmail.com

© 2026 Rahul Upadhyay — Built with Next.js & Tailwind CSS, deployed on Cloudflare Pages