Job Title: GCP Data Engineer
Location: Chennai
Experience: 5-12 Years
Job Summary
We are looking for a skilled GCP Data Engineer / Data Platform Specialist to design, build, and maintain scalable data pipelines and cloud-native data platforms on Google Cloud. The ideal candidate will have strong experience in data engineering, ETL pipeline development, and cloud infrastructure automation, with hands-on expertise in GCP services and modern DevOps practices.
Key Responsibilities
- Design and develop scalable data pipelines using GCP services such as BigQuery, Dataflow, and Dataproc
- Build and maintain batch and real-time data processing systems using PySpark and Python
- Develop and orchestrate workflows using Apache Airflow
- Implement and manage data integration pipelines using Data Fusion
- Work with Cloud SQL and PostgreSQL for data storage and transformation
- Develop and consume REST APIs for data ingestion and integration
- Automate infrastructure provisioning using Terraform
- Implement CI/CD pipelines using Tekton or similar tools
- Ensure data quality, performance optimization, and reliability of pipelines
- Collaborate with cross-functional teams including Data Scientists, Analysts, and DevOps engineers
- Monitor, troubleshoot, and optimize data workflows in production environments
Required Skills
- Strong hands-on experience with Google Cloud Platform (GCP), including:
  - BigQuery
  - Dataflow
  - Dataproc
  - Data Fusion
  - Cloud SQL
- Expertise in Python and PySpark for data processing
- Experience with Apache Airflow for workflow orchestration
- Strong knowledge of PostgreSQL and database concepts
- Experience in Infrastructure as Code (Terraform)
- Exposure to CI/CD tools such as Tekton
- Experience working with APIs (REST-based integrations)
- Solid understanding of data engineering concepts, ETL/ELT, and data modeling