Data Engineer
Responsibilities
- Design, develop, and maintain scalable ETL pipelines using Azure Data Factory (ADF).
- Architect and implement data integration solutions using Azure services such as Storage Accounts, Functions, Logic Apps, VMs, and Key Vault.
- Develop robust and optimized SQL queries and stored procedures for data transformation and reporting.
- Build and maintain Python-based data processing scripts for advanced ETL logic and automation.
- Implement CI/CD pipelines using Azure DevOps (ADO) and/or Git for automated deployment and version control.
- Design and develop shell scripts for automation and orchestration tasks.
- Handle large-scale data files and implement change data capture (CDC) and delta processing strategies.
- Schedule and monitor jobs using Azure-native or third-party scheduling tools.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Be a strong team player and actively promote collaboration and team activities.
- Participate actively and constructively in all initiatives and tasks.
- Communicate project status, updates, risks, gaps, and feedback to the team and leadership.
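The CDC/delta processing responsibility above can be sketched as a minimal watermark-based incremental extract. This is an illustrative example only, not part of any specific pipeline; the `modified_at` field, `extract_delta` function, and row shape are all hypothetical assumptions:

```python
from datetime import datetime

def extract_delta(rows, last_watermark):
    """Return rows modified after last_watermark, plus the advanced watermark.

    Assumes each row is a dict carrying a 'modified_at' timestamp
    (a hypothetical schema chosen for illustration).
    """
    delta = [r for r in rows if r["modified_at"] > last_watermark]
    # Advance the watermark to the latest change seen; keep the old one
    # if no new rows arrived, so the next run re-scans from the same point.
    new_watermark = max((r["modified_at"] for r in delta), default=last_watermark)
    return delta, new_watermark

# Example run: only the row changed after the stored watermark is picked up.
rows = [
    {"id": 1, "modified_at": datetime(2024, 1, 1)},
    {"id": 2, "modified_at": datetime(2024, 1, 5)},
]
delta, watermark = extract_delta(rows, datetime(2024, 1, 2))
```

In a real ADF pipeline the watermark would typically be persisted (e.g. in a control table) between runs and passed to the copy activity or script as a parameter.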
Mandatory Skills
Azure Data, SQL and ETL technologies
Building and optimizing ETL pipelines
Azure Data Factory (ADF)
Strong Python development skills
Strong SQL query writing
CI/CD, Azure DevOps and Git-based deployment
Shell Scripting
Azure Functions, Logic Apps, VMs, and Key Vault
Informatica PowerCenter