
EXL

Azure Data Engineer With Insurance Domain


Job Description

Key Responsibilities:

Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT.

Build and maintain data integration workflows from various data sources to Snowflake.

Write efficient and optimized SQL queries for data extraction and transformation.
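
To illustrate the kind of "optimized SQL transformation" this bullet refers to, here is a small sketch using Python's built-in sqlite3 as a stand-in for a warehouse (the table and column names, such as policy_claims and claim_amount, are hypothetical, not from the job description):

```python
import sqlite3

# In-memory SQLite as a warehouse stand-in; schema and data are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE policy_claims (
        policy_id TEXT,
        claim_amount REAL,
        claim_status TEXT
    );
    INSERT INTO policy_claims VALUES
        ('P-100', 2500.0, 'APPROVED'),
        ('P-100', 400.0,  'DENIED'),
        ('P-200', 1200.0, 'APPROVED');
""")

# A typical transformation query: filter before aggregating so less data
# is scanned and grouped, then sum approved claim amounts per policy.
rows = conn.execute("""
    SELECT policy_id, SUM(claim_amount) AS total_approved
    FROM policy_claims
    WHERE claim_status = 'APPROVED'
    GROUP BY policy_id
    ORDER BY policy_id
""").fetchall()

print(rows)  # [('P-100', 2500.0), ('P-200', 1200.0)]
```

The same filter-then-aggregate pattern applies in Snowflake SQL, where pruning rows early also reduces compute costs.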

Work with stakeholders to understand business requirements, especially within insurance processes such as policy, claims, underwriting, billing, and customer data, and translate them into technical solutions.

Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
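
As a rough sketch of pipeline monitoring logic, the following flags runs whose duration or landed row count suggests a performance or reliability problem; the pipeline names, thresholds, and run format are assumptions for illustration only:

```python
# Hypothetical monitoring check: flag pipeline runs that ran too long
# or landed too few rows. Thresholds are illustrative defaults.
def flag_runs(runs, max_minutes=60, min_rows=1):
    flagged = []
    for run in runs:
        if run["minutes"] > max_minutes or run["rows"] < min_rows:
            flagged.append(run["name"])
    return flagged

runs = [
    {"name": "load_claims",   "minutes": 35, "rows": 10_000},
    {"name": "load_policies", "minutes": 90, "rows": 8_000},   # too slow
    {"name": "load_billing",  "minutes": 12, "rows": 0},       # no data landed
]
print(flag_runs(runs))  # ['load_policies', 'load_billing']
```

In practice the run metadata would come from Azure Data Factory's monitoring views or activity run logs rather than a hard-coded list.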

Maintain and enforce data quality, governance, and documentation standards.
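
A minimal sketch of the sort of data-quality rule this bullet implies, checking for missing required fields and duplicate keys; the record layout and rules are assumptions, not an EXL standard:

```python
# Minimal data-quality check: count records with missing required fields
# and records whose key has already been seen. Field names are hypothetical.
def check_quality(records, required_fields, key_field):
    issues = {"missing": 0, "duplicates": 0}
    seen = set()
    for rec in records:
        if any(rec.get(f) in (None, "") for f in required_fields):
            issues["missing"] += 1
        key = rec.get(key_field)
        if key in seen:
            issues["duplicates"] += 1
        seen.add(key)
    return issues

claims = [
    {"claim_id": "C1", "policy_id": "P-100", "amount": 250.0},
    {"claim_id": "C1", "policy_id": "P-100", "amount": 250.0},  # duplicate key
    {"claim_id": "C2", "policy_id": "", "amount": 90.0},        # missing field
]
print(check_quality(claims, ["claim_id", "policy_id", "amount"], "claim_id"))
# {'missing': 1, 'duplicates': 1}
```

In a DBT-based stack, checks like these are usually expressed declaratively as tests (e.g. not_null, unique) rather than hand-written code.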

Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.

Must-Have Skills:

Strong experience with Azure Cloud Platform services.

Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.

Proficiency in SQL for data analysis and transformation.

Hands-on experience with Snowflake and SnowSQL for data warehousing.

Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse.

Experience working in cloud-based data environments with large-scale datasets.

Mandatory: Strong insurance domain knowledge, including understanding of policy administration, claims processing, underwriting workflows, actuarial data, and regulatory/compliance standards (e.g., IRDAI, HIPAA where applicable).

Good-to-Have Skills:

Experience with DataStage, Netezza, Azure Data Lake, Azure Synapse, or Azure Functions.

Familiarity with Python or PySpark for custom data transformations.
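
The map/filter style of transformation a PySpark job would apply can be sketched in plain Python (PySpark itself is omitted so the example runs without a cluster; the field names are hypothetical):

```python
# Plain-Python equivalent of a PySpark withColumn(...).filter(...) chain:
# cast premium to float and keep only active policies.
policies = [
    {"policy_id": "P-100", "premium": "1200", "state": "active"},
    {"policy_id": "P-200", "premium": "950",  "state": "lapsed"},
    {"policy_id": "P-300", "premium": "1800", "state": "active"},
]

active = [
    {**p, "premium": float(p["premium"])}
    for p in policies
    if p["state"] == "active"
]
total_premium = sum(p["premium"] for p in active)
print(total_premium)  # 3000.0
```

The same logic in PySpark would operate lazily on a distributed DataFrame, which is what makes it suitable for the large-scale datasets mentioned above.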

Understanding of CI/CD pipelines and DevOps for data workflows.

Exposure to data governance, metadata management, or data catalog tools.

Knowledge of business intelligence tools (e.g., Power BI, Tableau).

Qualifications:

Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.

6+ years of experience in data engineering roles using Azure and Snowflake.

Strong problem-solving, communication, and collaboration skills.


Job ID: 135798569