Job Summary
We are seeking a highly skilled Senior Data Engineer with strong hands-on experience in Google Cloud Platform (GCP) and deep expertise in BigQuery. The ideal candidate will design, develop, and optimize scalable data pipelines and modern data warehouse solutions to support large-scale analytics and business intelligence initiatives.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines using GCP services
- Build, manage, and optimize BigQuery data warehouses for high-performance analytics
- Develop ETL/ELT pipelines using tools such as Dataflow, Dataproc, and Cloud Composer (Airflow)
- Perform advanced query optimization, partitioning, and clustering in BigQuery
- Integrate data from multiple sources including APIs, streaming platforms, and batch systems
- Implement data quality checks, governance frameworks, and security best practices
- Collaborate with business stakeholders to translate requirements into technical solutions
- Monitor, troubleshoot, and ensure high availability of data pipelines
- Optimize cost and performance of GCP resources
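For illustration only (not part of the role's requirements), the partitioning and clustering work described above typically comes down to table DDL like the following; the project, dataset, table, and column names here are hypothetical:

```python
# Illustrative sketch: compose a BigQuery DDL statement that creates a
# date-partitioned, clustered table. All identifiers are hypothetical.
def events_table_ddl(project: str, dataset: str, table: str) -> str:
    return (
        f"CREATE TABLE `{project}.{dataset}.{table}` (\n"
        "  event_ts TIMESTAMP,\n"
        "  user_id STRING,\n"
        "  event_type STRING\n"
        ")\n"
        "PARTITION BY DATE(event_ts)  -- prune scans to the relevant days\n"
        "CLUSTER BY user_id, event_type  -- co-locate rows for common filters\n"
    )

print(events_table_ddl("my-project", "analytics", "events"))
```

Partitioning limits how much data a query scans (and therefore its cost), while clustering sorts data within each partition so filters on the clustered columns read fewer blocks.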
Required Skills
- 5+ years of experience in Google Cloud Platform (GCP)
- 6+ years of hands-on experience with BigQuery (mandatory)
- Strong expertise in:
  - BigQuery (advanced SQL, performance tuning, partitioning, clustering)
  - Cloud Storage and Pub/Sub
  - Dataflow / Dataproc
  - Cloud Composer (Airflow)
- Strong SQL and data modeling skills
- Proficiency in Python / PySpark / Java / Scala
- Hands-on experience building ETL/ELT pipelines
- Solid understanding of data warehousing concepts
Preferred Skills
- Experience with real-time/streaming pipelines (Pub/Sub, Dataflow)
- Exposure to BI tools such as Looker or Data Studio
- Knowledge of CI/CD, Terraform, or Infrastructure as Code (IaC)
- Experience working in multi-cloud environments (AWS/Azure)
Skills: GCP, BigQuery, Python