
Job Description:
Job Title: Senior Azure Data Engineer
Shift Timings: 1 PM - 10 PM
Location: Remote, India, Work from Home
Experience Required: 8 - 12 Years
Job Summary:
We are seeking a highly skilled and experienced Senior Azure Data Engineer to join our team. The ideal candidate will have deep expertise in Microsoft Azure data services, cloud-based data engineering solutions, and modern reporting tools. This role involves leading technical projects, designing scalable data architectures, and delivering actionable insights through BI tools built on modern data lakehouse/warehouse architecture. The ideal candidate should have strong experience with data modelling, ETL/ELT pipelines, and performance tuning, and will work with Data Architects, Business Analysts, and Data Scientists to deliver high-quality, reliable, and secure data products that enable advanced analytics.
Key Responsibilities:
Lead end-to-end data engineering projects using Microsoft Fabric, Synapse, Azure Data Factory, Azure Databricks, and ADLS Gen2.
Build and optimize lakehouse and warehouse solutions following best practices.
Design and implement scalable data pipelines and ETL/ELT processes.
Define and enforce data quality, data governance, and metadata-driven frameworks.
Develop and optimize data models (star schema, facts, dimensions) in Microsoft cloud.
Manage incremental loads, CDC, and real-time streaming pipelines.
Ensure data solutions comply with security, privacy, and compliance standards (GDPR, HIPAA, etc.).
Create dashboards and reports using Power BI and Azure Analysis Services.
Work with structured, semi-structured, and unstructured data and derive insights from it.
Implement CI/CD pipelines using Azure DevOps and GitHub, and support project management activities.
Collaborate with cross-functional teams including business stakeholders, developers, and analysts.
Drive digital transformation initiatives and business process automation using Azure cloud and Power Platform.
Leverage the latest AI models to train LLM/SLM models and use NLP for Q&A over data.
Mentor junior team members and manage client communications.
Technical Skills Required:
Candidate prescreening should cover: data sources (on-prem, Azure SQL) and API integration in Synapse and Fabric (PySpark); strong SQL experience; data understanding on large-scale projects; and good communication and client-handling skills.
Job ID: 147492813
Skills:
Azure Synapse, Azure Data Factory, PySpark, Spark, Kafka, Databricks, Data Modeling, Python, SQL, ADLS
We don’t charge any money for job offers