Job Title: Director – Data Engineering | Azure Databricks | Healthcare GCC | Hyderabad
About the Company
We are hiring on behalf of a fast-growing US-based Revenue Cycle Management (RCM) company establishing its Global Capability Centre (GCC) in Hyderabad. This is a rare opportunity to join as a founding engineering leader and build the data engineering function from the ground up, working directly with US leadership.
About the Role
We are looking for a Director – Data Engineering to own and scale the enterprise data platform for a healthcare RCM GCC in Hyderabad. You will lead a 40–60-person engineering organisation, manage Engineering Managers, and be accountable for the end-to-end data platform — from lakehouse architecture and pipeline reliability to governance, quality, and AI/ML enablement.
This is a hands-on leadership role. You will set the technical direction, define engineering standards, and work closely with US counterparts to align the India platform with enterprise data strategy.
What You Will Do
- Own the enterprise data platform built on Azure + Databricks Medallion Lakehouse architecture — reliability, performance, cost, and roadmap
- Lead and grow a team of 40–60 data engineers across multiple squads, with Engineering Managers as direct reports
- Define and enforce data engineering standards — Clean Architecture, code review culture, CI/CD pipelines, technical debt management
- Drive data governance using Unity Catalog, data lineage, data quality frameworks, and HIPAA-compliant data management
- Architect and optimise ETL/ELT pipelines on Azure Data Factory, Databricks, and PySpark at enterprise scale
- Enable AI/ML and GenAI workloads — feature stores, vector databases, RAG pipelines — on the data platform
- Partner with US Data Engineering, Product, and AI/ML leadership to align platform capabilities with RCM business outcomes
- Own FinOps for the data platform — Databricks cluster optimisation, cost governance, workload isolation
- Hire, develop, and retain top data engineering talent as the GCC scales
What We Are Looking For
Must Have
- 15–22 years of experience, including at least 5 years in a Director or Senior Director – Data Engineering role
- Hands-on production ownership of Azure + Databricks — Medallion Lakehouse, Delta Lake, Unity Catalog
- Proven experience managing Engineering Managers and teams of 40+ engineers
- Deep expertise in ETL/ELT pipelines — Azure Data Factory, PySpark, Airflow, Kafka
- Experience in Healthcare, Pharma, or RCM data environments — HIPAA compliance, HL7/FHIR/EDI preferred
- Strong data governance background — Unity Catalog, Collibra, data lineage, data quality at scale
- Experience building or scaling a GCC, CoE, or offshore engineering centre
Preferred
- Familiarity with AI/ML enablement on data platforms — feature engineering, MLflow, vector databases
- Exposure to healthcare RCM workflows — claims, denials, prior auth, 837/835 EDI files
- Experience with FinOps for Databricks workloads
Why This Role
- Ground-floor opportunity to build a data engineering organisation from scratch
- Direct visibility with US C-suite leadership
- Hyderabad GCC with strong investment and growth mandate
Location: Hyderabad, India (On-site)
Experience: 15–22 years
Employment Type: Full-Time, Permanent