
Sagility

Senior Data Engineer - Tech Lead

This job is no longer accepting applications

  • Posted 5 days ago

Job Description

As a Data Architect within our Enterprise Analytics COE, you will lead the strategic design and governance of our high-performance Cloud Lakehouse. We are seeking a visionary technical leader to architect scalable, Medallion-based data structures on Azure Databricks that bridge the gap between complex raw data and AI-ready insights. Whether your expertise lies in Databricks, Snowflake, or BigQuery, you will be responsible for defining the technical roadmap, implementing robust governance via Unity Catalog, and mentoring engineering teams to build resilient, automated pipelines. This is a high-impact role centered on driving architectural excellence and data reliability across a diverse portfolio of client engagements.


Role Overview

We are looking for a Data Architect to lead the design and governance of our Enterprise Data Lakehouse. While our primary ecosystem is Azure Databricks, we value architectural expertise across equivalent cloud platforms (Snowflake, BigQuery). You will be responsible for defining how data is structured, secured, and scaled to support BI and AI initiatives.

Key Responsibilities

  • Architectural Design: Design and implement Cloud Lakehouse architectures (Medallion pattern: Bronze/Silver/Gold).
  • Governance & Security: Lead the implementation of Unity Catalog (or equivalent governed catalogs) to manage metadata, lineage, and fine-grained access control.
  • Data Modeling: Create scalable physical and logical data models, ensuring high performance for both batch and real-time streaming (Structured Streaming).
  • Strategic Roadmapping: Evaluate and integrate cloud-native services (Azure Data Factory, Key Vault, etc.) to build a cohesive ecosystem.
  • Mentorship: Act as the technical North Star for data engineers, ensuring code quality and architectural consistency.
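To illustrate the Medallion pattern named above, here is a minimal Databricks SQL sketch of Bronze/Silver/Gold layering (catalog, schema, and table names such as `main.bronze.claims_raw` are hypothetical examples, not part of this role's actual environment):

```sql
-- Bronze: raw ingested payloads, kept as-is for replay and audit
CREATE TABLE IF NOT EXISTS main.bronze.claims_raw (
  payload STRING,
  ingested_at TIMESTAMP
) USING DELTA;

-- Silver: cleaned, deduplicated, typed records
CREATE TABLE IF NOT EXISTS main.silver.claims AS
SELECT DISTINCT
  get_json_object(payload, '$.claim_id') AS claim_id,
  CAST(get_json_object(payload, '$.amount') AS DECIMAL(12, 2)) AS amount,
  ingested_at
FROM main.bronze.claims_raw
WHERE payload IS NOT NULL;

-- Gold: business-level aggregates served to BI and AI workloads
CREATE TABLE IF NOT EXISTS main.gold.claims_daily AS
SELECT DATE(ingested_at) AS claim_date,
       COUNT(*)          AS claim_count,
       SUM(amount)       AS total_amount
FROM main.silver.claims
GROUP BY DATE(ingested_at);
```

Because each layer builds only on the one below it, quality issues can be traced to a layer and reprocessed there without re-ingesting the raw data.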

Required Skills & Experience

  • Experience: 6–12 years of senior data engineering experience.
  • Cloud Platforms: Expert-level knowledge of Azure (preferred), AWS, or GCP.
  • The Core Engine: Deep experience with Databricks/Spark is ideal. However, we highly value candidates with equivalent expertise in Snowflake or Google BigQuery who understand decoupled storage/compute and cloud-native scaling.
  • Languages: Proficient in SQL and Python (PySpark).
  • Modern Standards: Proven experience with Delta Lake, Parquet, or Iceberg formats.
  • Data Governance: Familiarity with modern discovery and security tools (Unity Catalog, Microsoft Purview, or Collibra).
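For a sense of the fine-grained access control mentioned above, Unity Catalog governance is typically expressed as SQL grants and row filters; a minimal sketch (group, function, and table names are illustrative assumptions):

```sql
-- Grant read access on a governed table to an analyst group
GRANT SELECT ON TABLE main.silver.claims TO `analysts`;

-- Row filter: admins see everything, everyone else sees US rows only
CREATE OR REPLACE FUNCTION main.governance.us_only(region STRING)
RETURN is_account_group_member('admins') OR region = 'US';

ALTER TABLE main.silver.claims
  SET ROW FILTER main.governance.us_only ON (region);
```

Grants, row filters, and column masks all live in the catalog rather than in application code, which is what makes lineage and auditing centrally enforceable.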

Preferred Qualifications

  • Experience migrating legacy data warehouses to a Cloud Lakehouse.
  • Certifications: Databricks Certified Data Engineer Professional or Azure Solutions Architect (AZ-305).
  • Knowledge of dbt (data build tool) for modular SQL modeling.
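As a sketch of the modular SQL modeling style dbt encourages (the source and model names below are hypothetical):

```sql
-- models/staging/stg_claims.sql
-- A staging model: a one-to-one cleanup of a raw source table.
-- Downstream mart models would build on it via {{ ref('stg_claims') }}.
SELECT
  claim_id,
  CAST(amount AS DECIMAL(12, 2)) AS amount,
  ingested_at
FROM {{ source('lakehouse', 'claims_raw') }}
```

Each model is a single SELECT; dbt resolves `ref` and `source` into table names and builds the dependency graph, so transformations stay small, testable, and composable.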

Regards,

Chetan Gurudev

[Confidential Information]


Job ID: 146794051
