Bharath Surampudi

Data Engineer | AWS Solutions | Java & Python Expert
📍 Sydney, Australia | bharathsurampudi@gmail.com | 📞 0410 638 861
LinkedIn | GitHub

Professional Summary

Software Development Engineer II (Data) at Mastercard with specialized expertise in building high-throughput, regulatory-compliant data platforms. Combines software engineering rigor (Java, CI/CD, contract testing) with modern data architecture (AWS, Spark, Airflow) to deliver reliable financial data systems. Proven track record of migrating legacy ETL to cloud-native serverless architectures and engineering observability tooling that reduces MTTR in production payment flows. AWS Certified Data Engineer (DEA-C01).

Technical Skills

Cloud & Platforms: AWS (S3, Glue, Redshift, EMR, Lambda), Terraform
Data Engineering: Apache NiFi, Airflow, dbt, Spark, Kafka
Programming: Python (Pandas, PySpark), Java (Spring Boot), SQL
DevOps & Tools: Docker, Git, Jenkins, Splunk, CI/CD

Professional Experience

Software Engineer II (Data Engineering & Payments)
Mastercard, Sydney | Nov 2021 – Present
  • High-Volume Data Orchestration: Engineered enterprise-grade ingestion pipelines using Apache NiFi and Java to route sensitive payment data across global regions. Optimized flow configurations to ensure 99.9% availability under strict SLA constraints.
  • Data Observability & Incident Reduction: Developed a custom observability suite in Splunk that parses granular NiFi provenance logs, cutting MTTR for data transfer bottlenecks by 40% and enabling proactive detection of silent failures.
  • Schema Enforcement & Reliability: Implemented Consumer-Driven Contract Testing (Spring Cloud Contract) for 5+ microservices. Prevented breaking schema changes in production data streams, a critical requirement for financial reporting reliability.
  • API Data Serving Layer: Built robust Spring Boot APIs to expose aggregated transaction data to downstream analytics platforms, replacing fragile file-based transfers with secure, dedicated REST endpoints.
  • Legacy Modernization: Partnered with Principal Architects to decouple monolithic ETL jobs, migrating logic to cloud-native patterns and improving data throughput for regulatory reporting.
Software Engineer
Neau Collective, Sydney | Mar 2021 – Nov 2021
  • Automated Data Ingestion: Developed Python automation scripts to extract and consolidate data from Shopify and marketing APIs, reducing manual reporting effort by 70% (pattern sketched after this list).
  • Data Warehousing Support: Integrated disparate data sources (Sales, Marketing, Accounting) into unified datasets for business intelligence.
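A minimal sketch of this extraction-and-consolidation pattern, assuming a hypothetical paginated REST endpoint, token, and field names (the actual Shopify and marketing API integrations differ):

```python
import requests
import pandas as pd

API_URL = "https://example-store.example.com/api/orders"  # hypothetical endpoint
API_TOKEN = "replace-me"                                   # hypothetical credential


def fetch_all(url, token, page_size=250):
    """Page through a REST endpoint and return every record as a list of dicts."""
    records, page = [], 1
    while True:
        resp = requests.get(
            url,
            headers={"Authorization": f"Bearer {token}"},
            params={"limit": page_size, "page": page},
            timeout=30,
        )
        resp.raise_for_status()
        batch = resp.json().get("orders", [])
        if not batch:
            break
        records.extend(batch)
        page += 1
    return records


orders = pd.DataFrame(fetch_all(API_URL, API_TOKEN))
# One consolidated summary per day instead of manual spreadsheet exports.
daily = orders.groupby("created_date", as_index=False)["total_price"].sum()
daily.to_csv("daily_sales_summary.csv", index=False)
```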

Cloud & Data Engineering Projects

FinTech-Grade Data Lakehouse (ACID Compliant)
Stack: AWS S3, Glue (PySpark), Redshift, dbt, Terraform
  • Architected a serverless Lakehouse pattern decoupling storage (S3) from compute (Redshift/Glue).
  • Implemented dbt for transformation logic, including data quality tests (schema validation, null checks) and documentation generation (see the sketch after this project).
  • Enforced Infrastructure as Code (IaC) using Terraform to provision VPCs, Glue Crawlers, and Redshift clusters, ensuring reproducible environments.
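The dbt layer owns the quality checks in the project itself; purely as an illustration of the curation step, here is a minimal PySpark sketch in the style of the Glue job, with hypothetical bucket and column names:

```python
from pyspark.sql import SparkSession, functions as F

RAW_PATH = "s3://example-lakehouse-raw/transactions/"          # hypothetical bucket
CURATED_PATH = "s3://example-lakehouse-curated/transactions/"  # hypothetical bucket

spark = SparkSession.builder.appName("curate_transactions").getOrCreate()
raw = spark.read.json(RAW_PATH)

# Fail fast on schema drift before any data lands in the curated zone.
required = ["transaction_id", "amount", "currency", "event_date"]
missing = [c for c in required if c not in raw.columns]
if missing:
    raise ValueError(f"Schema drift detected, missing columns: {missing}")

# Null checks and de-duplication mirror the dbt tests described above.
clean = (raw.dropDuplicates(["transaction_id"])
            .filter(F.col("transaction_id").isNotNull() & F.col("amount").isNotNull()))

# Partitioned Parquet keeps the storage layer engine-agnostic for Redshift/Glue consumers.
(clean.write.mode("overwrite")
      .partitionBy("event_date")
      .parquet(CURATED_PATH))
```

Writing partitioned Parquet to S3 is what keeps storage decoupled from whichever compute engine (Redshift or Glue) reads it.
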
Distributed Data Workflow Orchestrator
Stack: Apache Airflow, Docker, PostgreSQL, Python
  • Deployed a containerized Airflow 2.0 architecture to manage complex DAG dependencies.
  • Wrote custom Python operators for API ingestion, implementing idempotency and backfill strategies to handle upstream data failures without duplication.
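A minimal sketch of such an operator, assuming a hypothetical endpoint, target table, and payload shape (the real operators differ):

```python
import requests
from airflow.models.baseoperator import BaseOperator
from airflow.providers.postgres.hooks.postgres import PostgresHook


class ApiToPostgresOperator(BaseOperator):
    """Loads one logical day of API data idempotently: the partition for the
    run's execution date is deleted before insert, so retries and backfills
    never duplicate rows."""

    template_fields = ("endpoint", "partition_date")

    def __init__(self, endpoint, table, postgres_conn_id="postgres_default",
                 partition_date="{{ ds }}", **kwargs):
        super().__init__(**kwargs)
        self.endpoint = endpoint          # hypothetical REST endpoint
        self.table = table                # hypothetical target table
        self.postgres_conn_id = postgres_conn_id
        self.partition_date = partition_date

    def execute(self, context):
        # Pull only the records for this run's date so backfills stay bounded.
        resp = requests.get(self.endpoint,
                            params={"date": self.partition_date}, timeout=30)
        resp.raise_for_status()
        rows = resp.json()  # assumed payload: a list of {"id": ..., "value": ...} dicts

        hook = PostgresHook(postgres_conn_id=self.postgres_conn_id)
        # Delete-then-insert keyed on the execution date keeps the load idempotent.
        hook.run(f"DELETE FROM {self.table} WHERE load_date = %s",
                 parameters=(self.partition_date,))
        hook.insert_rows(
            table=self.table,
            rows=[(self.partition_date, r["id"], r["value"]) for r in rows],
            target_fields=["load_date", "record_id", "value"],
        )
```

Keying the delete-then-insert on the run's execution date is what makes retries and backfills safe to re-run without duplication.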

Certifications & Credentials

AWS Certified Data Engineer – Associate (DEA-C01)

Education

Master of Information Technology | UNSW, Sydney (2019 – 2021)
Specialization: Artificial Intelligence & Database Systems
Bachelor of Technology in Computer Science | Vellore Institute of Technology (2014 – 2018)