GCP Data Architect (Databricks)

10-12 Years
Posted 24 days ago

Job Description

Job Title: GCP Data Architect (Hybrid)

Location: Noida, India

Experience: 10+ Years

About The Role

We are looking for a highly experienced Data Architect to lead the design, architecture, and optimization of large-scale, enterprise-level data platforms. The ideal candidate will have a strong background in Databricks, PySpark, Google Cloud Platform (GCP), Unity Catalog, ETL processes, and Terraform. You will play a key role in delivering scalable, secure, and high-performance data engineering solutions across the organization.

Key Responsibilities

  • Architect and design modern, scalable, and secure data engineering platforms on Google Cloud Platform (GCP), ensuring high availability and performance.
  • Lead the implementation and optimization of ETL/ELT pipelines leveraging Databricks, PySpark, and Unity Catalog for data governance and cataloging.
  • Define and enforce data architecture standards, frameworks, and best practices across projects to ensure consistency, scalability, and security.
  • Lead the end-to-end delivery of enterprise data solutions covering data ingestion, transformation, storage, and consumption layers.
  • Collaborate with stakeholders across the organization to translate business requirements into scalable data solutions.
  • Establish and maintain data governance frameworks, ensuring compliance, data quality, and security across the data lifecycle.
  • Drive automation and infrastructure deployment using Terraform and other DevOps tools to streamline development and operational processes.
  • Provide technical leadership and mentorship to engineering teams, fostering an environment of continuous learning and improvement.
  • Review and approve designs and implementations, ensuring alignment with architectural vision and best practices.
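The pipeline responsibilities above centre on medallion-style ETL/ELT (the bronze/silver/gold layering also named in the skills section). As a rough illustration of that layering, here is a minimal sketch in plain Python rather than Databricks/PySpark; the record fields and layer functions are illustrative assumptions, not part of this role's actual stack.

```python
# Schematic sketch of the medallion bronze -> silver -> gold flow.
# Plain Python, no Spark dependency; all field names are assumptions.

from collections import defaultdict

def bronze_ingest(raw_rows):
    """Bronze: land raw records as-is, tagged with their layer."""
    return [dict(row, _layer="bronze") for row in raw_rows]

def silver_clean(bronze_rows):
    """Silver: drop malformed records and normalise types."""
    cleaned = []
    for row in bronze_rows:
        if row.get("amount") is None:  # reject incomplete records
            continue
        cleaned.append({"region": row["region"].strip().lower(),
                        "amount": float(row["amount"])})
    return cleaned

def gold_aggregate(silver_rows):
    """Gold: consumption-ready summary, totals per region."""
    totals = defaultdict(float)
    for row in silver_rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)

raw = [{"region": " EMEA ", "amount": "100.5"},
       {"region": "apac", "amount": None},  # rejected in silver
       {"region": "emea", "amount": "49.5"}]
summary = gold_aggregate(silver_clean(bronze_ingest(raw)))
# summary -> {"emea": 150.0}
```

In a Databricks setting, each layer would typically be a PySpark DataFrame transformation written to a Unity Catalog-governed table rather than an in-memory list.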

Required Skills & Expertise

  • 10+ years of experience in data engineering, with at least 3 years in an architectural role.
  • Hands-on expertise in Databricks, PySpark, and ETL/ELT processes.
  • Proven experience working with Google Cloud Platform (GCP) services such as BigQuery, Dataflow, and Pub/Sub.
  • Strong working knowledge of Unity Catalog for data governance and cataloging.
  • In-depth understanding of data architecture patterns such as Data Lake, Medallion Architecture, and Data Vault 2.0.
  • Experience with Terraform (Infrastructure as Code) and cloud automation.
  • Familiarity with DevOps/CI/CD pipelines for automating data engineering workflows.
  • Ability to define and execute architecture roadmaps, ensuring alignment with both business and technical goals.
  • Strong communication and leadership skills, with the ability to influence stakeholders and lead cross-functional teams.

Preferred Qualifications

  • GCP Professional Data Engineer or Architect Certification.
  • Experience implementing enterprise data governance frameworks.
  • Knowledge of multi-cloud environments (AWS, Azure) is a plus.

More Info


Job ID: 132344341