Company Description
Our client is a full-service technology and transformation partner, helping organizations unlock the full potential of their digital investments.
Their key service areas include:
- Consulting & Advisory: Aligning technology initiatives with business strategy to drive measurable outcomes.
- Managed Services & Support: Keeping client systems reliable, optimized, and secure.
- Data, Analytics & AI: Turning data into actionable insights and building intelligent, automated workflows.
- Business Applications: Delivering robust ERP, CRM, and people management solutions that power core business operations.
- Integration & Automation: Seamlessly connecting systems and streamlining processes to improve efficiency.
- Cloud, Infrastructure & Security: Enabling scalability, resilience, and secure cloud transformations for modern enterprises.
Roles & Responsibilities — Data Engineer (SQL, DBT, Azure, ADF & Microsoft Fabric)
1. Data Engineering & Pipeline Development
- Design and build scalable ETL/ELT pipelines using Azure Data Factory (ADF) and Fabric Data Factory.
- Implement batch, incremental, and CDC-based ingestion from diverse cloud and on‑prem data sources.
- Maintain robust orchestration with scheduling, dependency management, retries, and monitoring.
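As an illustration of the orchestration features mentioned above, retries and timeouts in ADF are configured per activity through its policy object; a minimal sketch of such a fragment (activity name and values are hypothetical) might look like:

```json
{
  "name": "CopySourceToLakehouse",
  "type": "Copy",
  "policy": {
    "timeout": "0.02:00:00",
    "retry": 3,
    "retryIntervalInSeconds": 60,
    "secureOutput": false,
    "secureInput": false
  }
}
```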
2. SQL Development & Optimization
- Develop complex SQL transformations, stored procedures, views, and analytical datasets.
- Optimize SQL workloads using indexing, partitioning, statistics management, and execution plan tuning.
- Implement data validation, reconciliation checks, and quality rules using SQL-driven logic.
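The SQL-driven reconciliation checks described above can be sketched as a row-count and amount-total comparison between a staging table and its loaded target. The table and column names below are hypothetical, and SQLite stands in for the warehouse engine:

```python
import sqlite3

# Hypothetical example: reconcile a staging table against its loaded target.
# In production this would run against the warehouse (e.g. Azure SQL or a
# Fabric Warehouse SQL endpoint) rather than an in-memory SQLite database.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
    CREATE TABLE fct_orders (order_id INTEGER, amount REAL);
    INSERT INTO stg_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
    INSERT INTO fct_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
""")

def reconcile(cur, source, target):
    """Compare row counts and amount totals between two tables.

    Returns a dict mapping metric name -> (source_value, target_value, matched).
    """
    checks = {}
    for metric, expr in [("row_count", "COUNT(*)"), ("amount_sum", "SUM(amount)")]:
        src = cur.execute(f"SELECT {expr} FROM {source}").fetchone()[0]
        tgt = cur.execute(f"SELECT {expr} FROM {target}").fetchone()[0]
        checks[metric] = (src, tgt, src == tgt)
    return checks

results = reconcile(cur, "stg_orders", "fct_orders")
all_passed = all(ok for _, _, ok in results.values())
```

In practice each failed check would feed an alert or block the downstream pipeline run.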
3. DBT Transformation Engineering
- Build modular, version-controlled data models using DBT (staging, intermediate, marts).
- Implement DBT tests, documentation, macros, and incremental models for scalable ELT processing.
- Maintain lineage, quality checks, and deployment workflows using DBT Cloud or DBT Core + CI/CD.
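As one illustration of the incremental-model pattern mentioned above, a dbt model combines a config block with an `is_incremental()` guard so that only new or changed rows are processed on subsequent runs. The model and column names in this sketch are hypothetical:

```sql
-- models/marts/fct_orders.sql (hypothetical dbt incremental model)
{{
    config(
        materialized='incremental',
        unique_key='order_id'
    )
}}

select
    order_id,
    customer_id,
    order_amount,
    updated_at
from {{ ref('stg_orders') }}

{% if is_incremental() %}
  -- on incremental runs, only pick up rows newer than the target's high-water mark
  where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```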
4. Azure & Microsoft Fabric Engineering
- Develop and manage datasets and curated layers using Microsoft Fabric Lakehouse/Warehouse.
- Implement Delta/Parquet-based data structures, schema evolution, and medallion architecture patterns.
- Configure and optimize Fabric pipelines, Lakehouse tables, Warehouses, and SQL endpoints.
5. Performance, Quality & Monitoring
- Monitor performance across SQL queries, DBT jobs, ADF pipelines, and Fabric workloads.
- Implement automated data quality checks, SLA monitoring, and alerting using Azure Monitor/Log Analytics.
- Troubleshoot bottlenecks and ensure pipeline reliability through optimization and proactive maintenance.
6. Security, Governance & CI/CD
- Apply Azure and Fabric security standards: RBAC, secrets management, RLS/OLS, and workspace governance.
- Maintain Git-based version control and deploy pipelines, DBT models, and Fabric assets via Azure DevOps/GitHub Actions.
- Ensure adherence to governance standards for metadata, lineage, and KPI definitions.
7. Collaboration & Delivery
- Work with BI, analytics, and product teams to translate business requirements into scalable data solutions.
- Provide clean, well-modelled datasets for reporting, analytics, and downstream consumption.
- Document pipelines, models, dependencies, and troubleshooting guidelines for ongoing support.
CTC: 10-18 Lakhs
Notice period: Immediate joiner or 1-month notice