Position Details
- Job Title: Senior Azure Data Engineer
- Type: Contract (Remote)
- Work Timings: 9:30 AM to 5:30 PM IST
Experience
- Total Experience: 8+ Years
- Relevant Experience: Minimum 6–8 years in Azure Data Engineering
Job Summary
We are seeking a highly skilled Senior Azure Data Engineer to design and develop scalable, real-time data pipelines on Microsoft Azure. The ideal candidate will have deep expertise in Azure data services, strong programming skills in Python and SQL, and hands-on experience with real-time streaming architectures. Experience in financial domains, especially CFD/FX trading platforms (MT4/MT5), will be an added advantage.
Key Responsibilities
- Design and implement real-time data ingestion and transformation pipelines using Azure Event Hubs, Azure Databricks, and Azure Stream Analytics.
- Develop and maintain ETL/ELT pipelines using Azure Data Factory, Logic Apps, Databricks, and Synapse Analytics (Dedicated SQL Pools).
- Write, optimize, and maintain complex SQL queries and Python-based data processing scripts.
- Architect and manage dimensional data models and enterprise-grade data warehouse solutions.
- Ensure data quality, monitoring, observability, and validation across both real-time and batch pipelines.
- Collaborate with cross-functional teams (product, analytics, engineering) to gather requirements and deliver data solutions.
- Implement data governance, security, and compliance aligned with organizational policies.
- Optimize performance and scalability for high-volume financial and trading datasets.
- (Nice to Have) Integrate data pipelines with financial platforms such as MT4/MT5.
Required Skills
- Strong experience with Microsoft Azure services:
  - Azure Event Hubs
  - Azure Data Factory
  - Azure Logic Apps
  - Azure Synapse Analytics (Dedicated SQL Pools)
  - Azure Databricks
- Proficiency in Python and SQL for data engineering and transformation logic.
- Hands-on experience with real-time streaming pipelines and event-driven architectures.
- Strong understanding of data warehousing concepts and dimensional modeling.
- Experience working with large-scale structured and semi-structured datasets.
- Knowledge of performance tuning and optimization techniques.
- Familiarity with DevOps practices, CI/CD pipelines, and version control tools like Git.
Nice to Have
- Experience in the financial domain, especially trading or brokerage systems.
- Exposure to MetaTrader platforms (MT4/MT5).
- Knowledge of Kafka, Delta Lake, Parquet, and modern data lake architectures.
- Azure Data Engineering certification (DP-203) or equivalent.