
Quarks Technosoft

Principal Data Architect

  • Posted 11 hours ago
  • Be among the first 10 applicants

Job Description

Location: Noida, Pune, Jaipur, Bangalore

Experience: 12+ years

Notice Period: Immediate to 30 days

Qualification: B.Tech / M.Tech / BCA / MCA or any relevant technical field (candidates from premium institutes preferred)

We are looking for a Principal Data Architect to lead the design and implementation of scalable, cloud-native data platforms for enterprise analytics and data-driven decision-making. This is a core data role with a strong focus on architecture, engineering, and governance.

Key Responsibilities:

  • Define end-to-end enterprise data architecture across ingestion, storage, transformation, governance, and analytics
  • Own the full lifecycle of a live, customer-facing data platform, ensuring scalability, reliability, and continuous delivery
  • Handle production failures, drive root-cause analysis, and implement resilient architectures
  • Design scalable, serverless-first data platforms on GCP (multi-cloud exposure is a plus)
  • Drive cost governance and optimization (FinOps), ensuring efficient resource utilization without compromising performance
  • Architect batch and real-time data pipelines using Kafka, Pub/Sub, and CDC frameworks
  • Lead data engineering practices using Dataflow, dbt, and Airflow with Lakehouse architecture (Medallion pattern)
  • Design and optimize enterprise data warehouse solutions on BigQuery
  • Guide data modeling approaches including Kimball, Data Vault, and Inmon methodologies
  • Ensure data quality, governance, security, and compliance (including PII and regulatory standards)
  • Provide foundational support for downstream advanced analytics (exposure to AI/ML is good to have)
  • Mentor engineering teams and drive architectural best practices across the organization

Key Skills:

  • GCP stack: BigQuery, Dataflow, Pub/Sub, GCS
  • Streaming: Apache Kafka, CDC tools (Debezium/Datastream)
  • Data Engineering: dbt, Spark, Airflow
  • Databases: Bigtable, MongoDB, Redis, PostgreSQL, Spanner
  • Strong SQL and Python skills (Java/Scala is a plus)
  • Experience with data governance, lineage, and observability tools

This role is ideal for candidates with deep expertise in data architecture and engineering who are looking to drive enterprise-scale data strategy.


Job ID: 146636241
