

We are scaling a strategic Digital Operations capability and seeking a pragmatic, delivery-focused Data Engineer to design, build and operate end-to-end data platforms that drive product-performance analytics.
Key responsibilities
Engineer robust ETL/ELT pipelines and data models to ingest, transform and surface large volumes of structured & unstructured product and customer data.
Own data lake migrations and OneLake availability; operationalize data for cross-functional stakeholders.
Implement CI/CD for data deployments and maintain pipeline observability and data quality.
Deliver dashboards and actionable visual insights to Product, Marketing and Engineering teams.
Collaborate across Digital Ops, BI and Regional Engineering; mentor junior engineers and codify best practices.
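The pipeline responsibilities above can be illustrated with a minimal sketch. This is a pure-Python stand-in for a PySpark transform-and-load step, showing the shape of an ELT stage with a simple data-quality gate; all field names, record shapes, and thresholds are hypothetical, not taken from the posting.

```python
# Illustrative ELT stage: normalize raw product events, then enforce
# a data-quality gate before load. All names here are assumptions.

def transform(records):
    """Normalize raw product events into a flat, typed shape."""
    out = []
    for r in records:
        if r.get("product_id") is None:
            continue  # drop rows that fail the not-null constraint
        out.append({
            "product_id": str(r["product_id"]),
            "metric": r.get("metric", "unknown").lower(),
            "value": float(r.get("value", 0.0)),
        })
    return out

def quality_gate(rows, min_rows=1):
    """Fail fast if the batch is empty or a value is out of range."""
    assert len(rows) >= min_rows, "batch below minimum row count"
    assert all(row["value"] >= 0 for row in rows), "negative metric value"
    return rows

raw = [
    {"product_id": 101, "metric": "CLICKS", "value": 42},
    {"product_id": None, "metric": "views", "value": 7},  # dropped
]
clean = quality_gate(transform(raw))
```

In a production Fabric/Databricks setting the same two stages would typically be PySpark DataFrame operations, with the quality gate expressed as pipeline expectations rather than bare assertions.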
Must-have skills & experience
3–5 years in data engineering.
Strong PySpark and SQL proficiency.
Hands-on with Microsoft Fabric and Azure services (Data Lake, Synapse, Data Factory, Databricks).
Experience with CI/CD for data workflows (Azure DevOps / GitHub Actions).
Solid data modelling, warehousing and data governance fundamentals.
Effective stakeholder communication and a drive for measurable outcomes.
Nice-to-have
Relevant Azure/data certifications (e.g., DP-700).
Exposure to product performance analytics or industrial data domains.
Notice Period: Immediate joiners preferred
Job ID: 131879509