Job Description
We are seeking an experienced Senior Data Engineer to design, architect, and optimize scalable data platforms using Microsoft Fabric. The role focuses on building enterprise-grade Lakehouse and Data Warehouse solutions following the Medallion architecture, developing robust data pipelines from SQL Server (on-prem) sources, and enabling analytics and reporting at scale. This role requires a blend of technical expertise, problem-solving skills, and the ability to work collaboratively with cross-functional teams. You'll mentor engineers, drive best practices, and partner with product/BI leaders to evolve the analytics platform.
Key Responsibilities:
- Architect and implement end-to-end, enterprise-grade Lakehouse and Data Warehouse solutions in Microsoft Fabric, following the Medallion Architecture
- Establish and enforce data engineering standards for data modelling, pipelines, naming conventions, security, and lifecycle management
- Lead delivery of complex, high-impact data initiatives, including large-scale ingestion, near-real-time pipelines, and performance-optimized analytics workloads
- Ensure the platform meets enterprise requirements for scalability, reliability, cost efficiency, and performance
- Champion data governance and security (Purview classifications, RBAC/ABAC, PII handling, row-level/object-level security)
- Design and enforce CI/CD strategies across environments; orchestrate releases and incident/problem management
- Act as a trusted advisor to Product, BI, and Analytics leaders on data strategy, SLAs, and roadmap planning
Technical Skills:
- Bachelor's or Master's degree in Computer Science, IT, or equivalent experience
- 8-10+ years of experience in data engineering and modern data platforms
- Proven hands-on expertise with Microsoft Fabric, including Lakehouse, Warehouse, Pipelines, Notebooks, Dataflows, and Semantic Models
- Strong background in SQL Server and T-SQL, with experience integrating on-premises and cloud data sources
- Deep understanding of modern data architecture patterns, including Medallion Architecture, CDC, and SCD modelling
- Demonstrated experience delivering enterprise-scale, high-volume data platforms
- Strong track record in data governance, security, DevOps, and cost optimization
- Ability to communicate complex technical concepts to executive and non-technical stakeholders
- Azure experience preferred; exposure to other cloud platforms (AWS, GCP) is a plus
- Certifications such as DP-600, DP-203, or AZ-305 are nice to have
All offers and/or employment contracts are contingent upon the successful completion of the Firm's pre-employment screening process. This process may include verifying the candidate's identity, confirming legal authorization to work in the offered position's location, and conducting a comprehensive background check, where permitted by local regulations. We use limited AI-assisted tools for administrative screening purposes only, never for decision-making. All hiring decisions are made by people. Applicants may have rights to information and explanations regarding the use of such tools, or to request human review, as required by applicable regional laws.