

Job Description

Job Title: Data Engineer Lead

Reports To: Data Platform Manager

Job Summary

We are looking for a Data Warehouse Engineering Lead to develop and deploy data products through our Azure Analytics Platform. You will work closely with the DWH team, Data Architect and senior stakeholders across the business. It's an exciting time to join us as we're delivering our Data Strategy to support the transformation of our business through innovation & technology.

Main Duties & Responsibilities

The role's core duties include, but are not limited to:

  • Provide expertise and guidance to your direct reports, ensuring that the team can deliver a vital Data Service to the business.
  • Ensure our Data Governance principles are implemented and adhered to, and act as an advocate of best practice in Data Product delivery.
  • Build and maintain optimal data pipeline architecture.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimising data delivery, re-designing infrastructure for greater scalability, etc.
  • Work with key stakeholders including Senior management, Developers, Data, and Project teams to assist with data-related technical issues and support their data infrastructure needs.
  • Develop and maintain a group-wide Data Estate incorporating a Data Lake and Data Warehouse using best practice warehouse methodologies.
  • Define data integration and ingestion strategies to ensure smooth and efficient data flow from various sources into the data lake and warehouse.
  • Develop data modelling and schema design to support efficient data storage, retrieval, and analysis.

Key Skills

  • Experience of leading a technical team in a changing, fast-paced environment.
  • Experience with cloud platforms, particularly Azure, and a solid understanding of their data-related services and tools.
  • Proficiency in SQL and one or more programming languages commonly used for data processing and analysis (e.g., Python, R, Scala).
  • Familiarity with data integration techniques, ETL/ELT processes, and data pipeline frameworks.
  • Knowledge of data governance, data security, and compliance practices.
  • Strong analytical and problem-solving skills, with the ability to translate business requirements into scalable and efficient data solutions.
  • Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams and stakeholders.
  • Ability to adapt to a fast-paced, dynamic work environment and manage multiple priorities.
  • Strong analytical skills related to working with unstructured datasets.
  • Strong project management and organizational skills.

Working relationships

  • Collaborate closely with data architects and analysts to understand and fulfil their data requirements.
  • Regular interaction with various departments to gather and provide data insights.

Communication:

  • Communicate with leadership and colleagues in relation to all business activities.
  • Highly articulate and able to explain complex concepts in bite-sized chunks.
  • Strong ability to provide clear written reporting and analysis.

Personal Qualities

  • Ability to work to deadlines.
  • Strong problem-solving skills with an emphasis on product development.
  • Excellent written and verbal communication skills.
  • Adept at queries, report writing and presenting findings.

Knowledge / Key Skills (Essential & Desirable):

  • Cloud & Data Engineering Platforms: Azure Data Factory, Azure Databricks (including Unity Catalog & Notebooks), Azure Functions, Azure Apps, Azure DevOps Pipelines
  • Data Transformation & Modelling: dbt (Data Build Tool) for scalable modelling and transformation in cloud data platforms
  • Programming & Query Languages: PySpark, PySQL, T-SQL, MS SQL, Stored Procedures
  • ETL/ELT & Data Integration: MS SSIS, WebHooks, RESTful APIs, automated pipelines for batch and streaming data
  • Data Architecture & Quality: Data Modelling, Data Analysis, Data Quality frameworks, dimensional modelling
  • Leadership & Communication: Proven experience in team management, mentoring, cross-training initiatives, and coordinating with diverse business stakeholders
  • Governance & Compliance: Data Privacy (GDPR, CCPA), metadata management, data discovery, data security (RBAC, CLS, RLS)
  • Infrastructure as Code & Automation: Terraform, CI/CD via Azure DevOps
  • Real-Time Data Processing: Experience with Kafka, Spark Streaming, or Flink for streaming analytics
  • Identity & Access Management: Entra ID (Azure AD) for role-based access control, group assignments, PIM management, and user provisioning
  • Cloud Infrastructure & Operations: Deployment and monitoring of resources across Azure environments, with hands-on experience in cost optimization, resource governance, subscription management, and environment setup
  • Certifications: Azure Administrator [AZ-104], Fabric Data Engineer [DP-700], Fabric Analytics Engineer [DP-600], Azure DevOps Engineer [AZ-400]

Thanks,

Mukesh Kumar

More Info

Open to candidates from: Indian

Job ID: 131806231
