Insurance Cloud Architect

Posted 13 hours ago

Job Description

Senior Data Engineer (Databricks | Insurance | Data Lakehouse)

Location: Mumbai (Preferred) / Bangalore

Experience: 7–12 years

Start: Immediate Joiner Preferred

Interested candidates, please fill in the form below:

https://forms.cloud.microsoft/r/g7syaaUAxZ

Please send your resume to [Confidential Information]

Role Overview

We are looking for a highly skilled Senior Manager / Manager / Senior Data Engineer with deep expertise in Databricks data management, logical and physical data modelling, and insurance domain data workflows. The candidate will work on a strategic data migration initiative for a leading UK-based insurance company, moving data from Guidewire into the Databricks Silver and Gold layers with strong governance, lineage, and scalability standards.

Key Responsibilities

Databricks Data Engineering & Management

  • Design, build, and optimize Silver and Gold layer data pipelines in Databricks using PySpark, SQL, Delta Lake, and Workflow orchestration (a minimal sketch follows this list).
  • Implement data quality, lineage, schema evolution, and governance controls across curated layers.
  • Optimize Databricks jobs for performance, scalability, and cost efficiency.
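
For illustration, a minimal PySpark sketch of the kind of Silver-to-Gold pipeline described above. The table and column names (silver.policy, gold.policy_summary, written_premium) are hypothetical placeholders, not actual project or client objects.

    # Minimal illustrative Silver -> Gold pipeline; all names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("silver_to_gold_policy").getOrCreate()

    # Read the curated Silver-layer policy table (stored as Delta).
    policies = spark.read.table("silver.policy")

    # Simple data-quality gate: keep only rows with a valid policy key.
    clean = policies.filter(F.col("policy_id").isNotNull())

    # Aggregate into a business-ready Gold-layer summary.
    summary = (
        clean.groupBy("product_line", "underwriting_year")
             .agg(
                 F.count("policy_id").alias("policy_count"),
                 F.sum("written_premium").alias("total_written_premium"),
             )
    )

    # Write as a managed Delta table; overwrite keeps the sketch idempotent.
    summary.write.format("delta").mode("overwrite").saveAsTable("gold.policy_summary")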

Guidewire → Databricks Migration

  • Lead the end-to-end migration of large-scale insurance data from Guidewire PolicyCenter/ClaimCenter/BillingCenter into Databricks.
  • Map and transform complex Guidewire entity structures into normalized and star-schema models (see the illustrative mapping after this list).
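
For illustration, a hypothetical sketch of flattening a Guidewire-style policy extract into a star-schema dimension. The source table (bronze.pc_policy_extract) and its columns are illustrative assumptions, not the real Guidewire PolicyCenter schema.

    # Hypothetical Guidewire-style extract -> dimension table; names are assumptions.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("guidewire_policy_dim").getOrCreate()

    extract = spark.read.table("bronze.pc_policy_extract")

    dim_policy = (
        extract.select(
            F.col("PublicID").alias("policy_nk"),         # natural key from the source
            F.col("PolicyNumber").alias("policy_number"),
            F.col("ProductCode").alias("product_code"),
            F.col("EffectiveDate").alias("effective_date"),
            F.col("ExpirationDate").alias("expiration_date"),
        )
        .dropDuplicates(["policy_nk"])
        # Surrogate key for star-schema joins; monotonically_increasing_id() is a
        # simple stand-in for a proper key-management strategy.
        .withColumn("policy_sk", F.monotonically_increasing_id())
    )

    dim_policy.write.format("delta").mode("overwrite").saveAsTable("silver.dim_policy")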

Data Modelling & Architecture

  • Develop robust logical and physical data models aligned to insurance business processes.
  • Build high-quality curated data marts (Gold) for analytics, reporting, pricing, underwriting, and claims.
  • Define standards for metadata, naming conventions, partitioning, and model documentation (an example table definition follows this list).
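
For illustration, an example Delta table definition showing how naming, partitioning, and documentation standards might be expressed in code. The table, columns, and partitioning choice are assumptions for the sketch, not project standards.

    # Illustrative Gold-layer table definition; all names are hypothetical.
    from delta.tables import DeltaTable
    from pyspark.sql import SparkSession
    from pyspark.sql.types import DateType, DecimalType, StringType

    spark = SparkSession.builder.appName("gold_fact_claims_ddl").getOrCreate()

    (
        DeltaTable.createIfNotExists(spark)
        .tableName("gold.fact_claims")
        .comment("Business-ready claims fact table for reporting and pricing analytics")
        .addColumn("claim_sk", StringType(), comment="Surrogate key for the claim")
        .addColumn("policy_sk", StringType(), comment="Foreign key to gold.dim_policy")
        .addColumn("loss_date", DateType(), comment="Date of loss")
        .addColumn("incurred_amount", DecimalType(18, 2), comment="Total incurred amount")
        .partitionedBy("loss_date")
        .execute()
    )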

Insurance Domain Expertise

  • Understand core insurance data entities such as policy, claims, billing, customer, underwriting, rating, and product hierarchies.
  • Apply domain knowledge to rationalize Guidewire data structures and create business-ready datasets.

Solutioning & Ideation

  • Collaborate with client SMEs, architects, and business analysts to shape data solutions and propose design improvements.
  • Ideate, simplify complex data flows, and contribute to the overall solution architecture.

Required Skills & Experience

Technical

  • 7–12 years of experience in data engineering, data modelling, and data management.
  • Strong hands-on experience in Databricks, Delta Lake, PySpark, Spark SQL, and ETL/ELT pipelines.
  • Expertise in logical & physical data modelling (3NF, Star Schema, Data Vault preferred).
  • Practical knowledge of the Guidewire data model and prior migration experience (mandatory).
  • Experience working with large-scale insurance datasets.
  • Strong understanding of data quality frameworks, lineage, cataloging, and governance.

Soft Skills

  • Strong problem-solving, conceptualization, and ideation capability.
  • Excellent communication and stakeholder-management skills for a UK client environment.
  • Ability to work in fast-paced delivery tracks with cross-functional global teams.

Preferred Qualifications

  • Certifications in Databricks, Azure/AWS, and Data Warehousing are an added advantage.
  • Experience delivering enterprise-grade data lakes or lakehouse architectures.

Why Join This Role

  • Work on a flagship insurance data modernisation project for a top UK carrier.
  • Opportunity to shape enterprise-scale data models on the Databricks Lakehouse.
  • High-visibility role with strong career growth in insurance data engineering.

Job ID: 146835683
