We are seeking a hands-on Principal GCP Data Architect to lead the design and implementation of large-scale data ecosystems.
The ideal candidate is an architect who has led massive cloud migrations, built self-serve data platforms from the ground up, and maintains a rigorous focus on data governance and security.
Technical Requirements:
Cloud Expertise: Expert-level mastery of the GCP Data Stack:
Storage & Warehouse: BigQuery (including BigLake and Omni), Google Cloud Storage.
Messaging: Pub/Sub and Confluent/Kafka integration.
Analytics & AI: Looker, Vertex AI, and BigQuery ML.
Certifications: Must hold an active GCP Professional Data Engineer certification.
DRP ID Performance: A Data Readiness Placement (DRP) ID with a verified score of 50+ is strongly preferred, as it demonstrates a high level of technical proficiency and architectural maturity.
Modern Data Stack: Deep experience with dbt, Airflow, and containerization (GKE/Kubernetes).
Experience & Qualifications:
Years of Experience: 18–20 years in Data Engineering, Data Warehousing, and Business Intelligence, including at least six years focused specifically on GCP.
Migration Track Record: Proven experience leading at least two enterprise-scale, petabyte-scale migrations to the cloud.
Leadership: Demonstrated experience leading large, cross-functional engineering teams in an Agile/DevOps environment.
Education: Bachelor's or Master's degree in Computer Science, Information Systems, or a related technical field.