Job Specs:
Position: GCP Data Engineer
Relevant Experience: 4 to 6 years
Location Options: Pune / Gurgaon / Bangalore (Manyata Tech Park, Nagavara)
Skills: GCP ecosystem (Dataflow, BigQuery), data engineering (Spark, Dataproc), Python, SQL
JOB DESCRIPTION:
Technical Skills:
Hands-on expertise with GCP services, especially BigQuery and Dataproc.
Strong proficiency in SQL and Spark (PySpark/Scala).
Experience with workflow orchestration tools and pipeline automation.
Knowledge of CI/CD, cloud monitoring, logging, and alerting.
Understanding of data security, access controls, and cost optimization on GCP.
Preferred:
Experience with Composer, Dataflow, Pub/Sub, or similar services.
Exposure to enterprise data platforms and large datasets.
GCP Professional Data Engineer certification is a plus.
Key Responsibilities:
Design, build, and optimize scalable data pipelines on GCP.
Develop and manage batch and large-scale data processing solutions using Dataproc (Spark).
Design and optimize BigQuery datasets, schemas, and queries for performance and cost.
Implement data quality checks, monitoring, and performance tuning.
Collaborate with business and analytics stakeholders to translate requirements into data solutions.
Contribute to best practices and documentation, and mentor junior engineers.
Job ID: 145315127