Job Description
Role & responsibilities
- Design, develop, and maintain ETL processes using Azure Data Factory (ADF), SSIS, and Databricks.
- Collaborate with data architects, analysts, and other stakeholders to understand data requirements and translate them into technical solutions.
- Build and optimize data pipelines to ensure efficient data ingestion, transformation, and storage (a brief illustrative sketch follows this list).
- Monitor and troubleshoot data integration processes to ensure data quality and performance.
- Implement data governance and security best practices within Azure and ETL frameworks.
- Conduct data profiling and analysis to identify trends and anomalies.
- Document processes, data flows, and technical specifications for reference and compliance.
- Stay current with Azure updates and emerging technologies to recommend enhancements.
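To give a concrete sense of the pipeline work described above, below is a minimal PySpark sketch of an ingest-transform-load step as it might run in a Databricks notebook. This is an illustrative example only, not part of the role's actual codebase; the storage path, column names, and table name (curated.orders) are hypothetical placeholders.

# Minimal PySpark ETL sketch (all names are hypothetical placeholders).
# Reads raw CSV files from Azure Blob Storage, applies a simple
# transformation, and writes the result to a Delta table.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl_sketch").getOrCreate()

# Ingest: raw CSV files landed in a Blob Storage container (placeholder path).
raw = (
    spark.read
    .option("header", "true")
    .csv("wasbs://raw@examplestorageaccount.blob.core.windows.net/orders/")
)

# Transform: type casting, basic cleansing, and de-duplication.
curated = (
    raw.withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .filter(F.col("order_id").isNotNull())
       .dropDuplicates(["order_id"])
)

# Load: persist as a Delta table for downstream consumers.
curated.write.format("delta").mode("overwrite").saveAsTable("curated.orders")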
Preferred Candidate Profile
- 5+ years of experience in data engineering or related roles, with a focus on Azure and ETL technologies.
- Proficiency in Azure Data Factory, Azure Databricks, and SQL Server Integration Services (SSIS).
- Strong understanding of ETL concepts and best practices.
- Experience writing and optimizing T-SQL stored procedures.
- Experience with data modeling, data warehousing, and big data technologies.
- Knowledge of programming languages such as SQL and Python, and experience with Spark.
- Familiarity with Azure cloud services (e.g., Azure SQL Database, Azure Blob Storage).
Skills: T-SQL stored procedures, SSIS, big data, ADF, Azure Databricks (ADB), Azure, ETL