4-9 years of experience in a Data Engineering role working with Databricks and Azure cloud technologies.
Bachelor's degree in Computer Science, Information Technology, or a related field.
Strong proficiency in PySpark, Python, and SQL.
Strong experience in data modeling, ETL/ELT pipeline development, and automation.
Proficient with Azure cloud components such as Azure Data Factory, Azure Databricks, and Azure Data Lake.
Experience with Delta Lake and data warehousing.
Experience with Delta Live Tables, Auto Loader, and Unity Catalog.
Preferred: Knowledge of the insurance industry and its data requirements.
Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
Excellent problem-solving skills and ability to work under tight deadlines.
Hands-on experience with performance tuning of data pipelines and workflows.
Excellent communication skills and the ability to work effectively with diverse teams.