Key Responsibilities:
ETL & Data Pipeline Development
- Design, build, and maintain ETL/ELT pipelines using Azure Data Factory (ADF), SSIS, and Microsoft Fabric
- Extract data from diverse sources, transform it, and load it into data warehouses and lakehouses
- Build and maintain enterprise-level integrations using Snowflake, Azure Synapse, Azure SQL, and SQL Server
Data Engineering & Cloud Technologies
- Strong experience with Azure Data Lake Storage Gen2, Azure Databricks, Apache Spark (PySpark), and Logic Apps
- Exposure to real-time data processing and streaming with Apache Kafka and Azure Event Hubs
- Hands-on experience with data modeling, normalization/denormalization, and relational database design
Programming & Scripting
- Proficient in Python, Scala, and PowerShell, with working knowledge of JSON and XML, for automation, scripting, and data manipulation
- Implement CI/CD pipelines using DevOps practices and Infrastructure as Code
Analytics & Reporting
- Develop and maintain analytics and reporting solutions using Microsoft Power BI
- Ensure data quality, reliability, and performance tuning of SQL queries and pipelines
Advanced & Emerging Skills (Preferred)
- Experience with ML/AI and GenAI frameworks such as LangChain, LangGraph, the OpenAI SDK, and the Microsoft Agent Framework
- Knowledge of GDPR compliance, data security, and enterprise-level automation