Roles and Responsibilities:
- Design, develop, test, deploy, and maintain large-scale data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, Dataproc, and Cloud Storage.
- Collaborate with cross-functional teams to gather business requirements and design appropriate data solutions.
- Develop complex SQL queries to extract insights from large datasets in Google Cloud SQL databases.
- Troubleshoot and resolve issues in data processing workflows.
Desired Candidate Profile:
- 5-9 years of experience in Data Engineering, with expertise in GCP and BigQuery.
- Strong understanding of GCP platform administration, including Compute Engine, Dataproc, Google Kubernetes Engine (GKE), Cloud Storage, and Cloud SQL.
- Experience with big data analytics projects involving ETL processes, using orchestration tools such as Airflow or similar.