
  • Posted 2 days ago

Job Description

Job Title: Data Architect GCP

Location: Pune / Bangalore / Chennai / Hyderabad

Experience: 10+ Years

About The Role

We are looking for a highly experienced Data Architect with deep expertise in Google Cloud Platform (GCP) to design scalable cloud data solutions, lead modernization initiatives, and provide architectural oversight across enterprise data systems. The ideal candidate has strong experience in the retail domain, cloud migrations, and advanced data modeling for large-scale analytics.

Key Responsibilities

  • Design and architect scalable, secure, and high-performing data platforms on GCP.
  • Lead end-to-end cloud migration initiatives from on-premises platforms to GCP.
  • Define architecture standards, governance frameworks, security policies, and best practices.
  • Develop conceptual, logical, and physical data models aligned to business requirements.
  • Collaborate with engineering, analytics, BI, and product teams to ensure seamless data flow.
  • Optimize ETL/ELT pipelines, data storage, compute resources, and overall cloud performance.
  • Evaluate and implement GCP services such as BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, Looker, and Cloud Composer.
  • Provide technical leadership in solution blueprinting, architectural decisions, and documentation.
  • Ensure data quality, lineage, metadata management, and compliance across platforms.
  • Work closely with business and analytics teams on retail domain use cases such as product catalog, supply chain, customer 360, demand forecasting, and sales analytics.

Required Skills & Experience

  • 10+ years of experience in Data Architecture, Data Engineering, or similar fields.
  • Strong hands-on experience in GCP Cloud Architecture, data modernization, and migration projects.
  • Expertise in BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Composer, and other GCP native solutions.
  • Demonstrated experience in data modeling (Kimball, Inmon, Data Vault).
  • Strong background in ETL/ELT tools such as dbt, Informatica, Talend, Data Fusion, or similar.
  • Proficiency in Python, SQL, and distributed processing frameworks like Apache Spark.
  • Experience working with retail domain datasets, analytics models, and reporting frameworks.
  • Solid understanding of data governance, security, lineage, metadata, and compliance standards.

More Info

Job ID: 135643753
