The GCP Data Engineer will be responsible for designing, implementing, and managing data solutions on Google Cloud Platform. The role requires extensive experience with data platforms, cloud-native data engineering tools, and programming, with a focus on enabling business intelligence (BI) and data analytics initiatives.
Key Responsibilities
- Design, develop, and optimize data pipelines using GCP services such as BigQuery, Dataflow, Cloud Composer, and Pub/Sub.
- Build and maintain scalable data architectures to support analytics, reporting, and BI initiatives.
- Develop and maintain ETL processes, ensuring data quality, consistency, and performance.
- Implement infrastructure as code using Terraform and manage source control with Git.
- Write robust Python and SQL code for data processing, transformation, and analysis.
- Collaborate with business teams to document data models, pipelines, and processes for BI enablement.
- Monitor and troubleshoot data workflows and optimize performance across cloud environments.
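As an illustration of the data-quality checks called out in the ETL responsibilities above, a minimal Python sketch might validate records before loading them into a warehouse. All names here are hypothetical and standard-library only; they are not tied to any specific pipeline or GCP service.

```python
# Illustrative row-level data-quality check of the kind an ETL step
# might run before loading into BigQuery. Hypothetical field names.

def validate_row(row, required_fields=("id", "event_ts")):
    """Return a list of quality issues found in a single record."""
    issues = []
    for field in required_fields:
        if row.get(field) in (None, ""):
            issues.append(f"missing required field: {field}")
    return issues

def partition_rows(rows):
    """Split rows into (clean, rejected) based on validation results."""
    clean, rejected = [], []
    for row in rows:
        issues = validate_row(row)
        if issues:
            rejected.append((row, issues))
        else:
            clean.append(row)
    return clean, rejected
```

In a real pipeline this kind of check would typically run inside a Dataflow transform or a Composer task, with rejected rows routed to a dead-letter table for review.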
Experience Requirements
- 10+ years of experience in data platform engineering.
- 5+ years of hands-on experience with GCP data services.
- Proven expertise in BigQuery, Dataflow, Cloud Composer, and Pub/Sub.
- Strong programming experience in Python and SQL.
- Hands-on experience with Terraform and Git for infrastructure and code management.
- Familiarity with business intelligence processes and documentation.