A recruitment & talent solutions firm partnering with enterprise clients in Cloud Data Engineering, analytics, and SaaS platforms is hiring for an on-site GCP-focused data engineering role in India. This engagement supports large-scale analytics, real-time streaming, and cloud-native data platform builds for enterprise customers.
Primary title (standardized): Senior GCP Data Engineer
Role & Responsibilities
- Design and implement scalable batch and streaming ETL/ELT pipelines on GCP using Dataflow and Apache Beam.
- Build, optimise and maintain data models and analytics pipelines in BigQuery to support BI and ML workloads.
- Develop and operate event-driven ingestion using Pub/Sub and Cloud Storage with robust error-handling and replay strategies.
- Author and maintain Airflow workflows on Cloud Composer for orchestration, scheduling, and automated retries.
- Implement CI/CD, observability, cost controls and performance tuning for data services; collaborate with platform teams for infra-as-code rollouts.
- Work closely with Data Scientists and Analytics teams to productionise models, enable feature stores, and ensure data quality and lineage.
Skills & Qualifications
Must-Have
- Google Cloud Platform
- BigQuery
- Dataflow
- Pub/Sub
- Apache Beam
- Airflow
- SQL
- Python
Preferred
- Terraform
- Kubernetes (GKE)
- Dataproc
Benefits & Culture Highlights
- On-site collaboration with cross-functional engineering and data teams, with strong exposure to enterprise-scale cloud implementations.
- Opportunities for GCP certification support and technical career growth across analytics and ML platforms.
- Fast-paced delivery environment with emphasis on engineering ownership, automation, and measurable impact.
Location: India (On-site). This role is tailored for hands-on GCP data engineers who enjoy building reliable, high-performance data platforms. Apply if you want to own end-to-end cloud data solutions and accelerate analytics at scale.
Skills: python, gcp, sql, airflow, google cloud platform