

Job Description

Position Summary

This role leads end-to-end delivery of modern data engineering solutions, combining hands-on technical expertise with delivery and team leadership. It is responsible for designing, building and migrating enterprise-scale data pipelines from legacy platforms (Informatica BDM, TIDAL) to modern cloud-native architectures (Databricks, Apache Airflow) while ensuring data quality, governance and operational stability. The role serves as a critical link between architecture, engineering and delivery teams, ensuring high-quality, timely execution of data initiatives across global programs and upholding strong data engineering practices and governance adherence.

Job Responsibilities

  • Lead design, development and implementation of scalable data pipelines on Databricks.
  • Build and migrate data pipelines from Informatica BDM to Databricks.
  • Migrate job scheduling from TIDAL to Apache Airflow.
  • Develop data processing logic using PySpark, SQL and Python.
  • Ensure pipelines are reliable, fast and easy to support.
  • Build and manage Airflow DAGs for job scheduling and monitoring.
  • Follow best practices while working with Databricks and Delta Lake.
  • Add data quality checks and basic reconciliation controls.
  • Maintain clear documentation such as designs, mappings and runbooks.
  • Guide and review the work of data engineers.
  • Help break down work into tasks and estimates.
  • Track progress using JIRA and Confluence.
  • Support sprint planning, reviews and delivery tracking.
  • Serve as a technical SME for commercial pharma datasets.
  • Mentor junior engineers and enforce coding and delivery best practices.
  • Coordinate with business analysts (BAs) and project managers (PMs) to align business priorities with technical delivery.
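
The data quality and reconciliation responsibilities above can be sketched as a minimal row-count reconciliation check. This is an illustrative, hypothetical helper (the function name, inputs and thresholds are assumptions, not part of any actual codebase for this role), of the kind typically run after migrating tables from a legacy platform to Databricks:

```python
def reconcile_row_counts(source_counts, target_counts, tolerance=0):
    """Compare per-table row counts between a legacy source (e.g. Informatica
    BDM output) and a migrated target (e.g. Databricks Delta tables).

    Returns a list of (table, source_count, target_count) tuples for tables
    whose absolute count difference exceeds `tolerance`.
    """
    mismatches = []
    for table, src in source_counts.items():
        tgt = target_counts.get(table, 0)  # missing target table counts as 0
        if abs(src - tgt) > tolerance:
            mismatches.append((table, src, tgt))
    return mismatches

# Hypothetical example: two tables match, one drifted during migration.
source = {"sales": 1_000, "rx_claims": 5_400, "hcp_master": 320}
target = {"sales": 1_000, "rx_claims": 5_395, "hcp_master": 320}
print(reconcile_row_counts(source, target))  # [('rx_claims', 5400, 5395)]
```

In practice such a check would run as a post-migration validation step, with mismatches logged to a reconciliation report rather than printed.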

Education

BE/B.Tech

Master of Computer Applications (MCA)

Work Experience

  • 8–11 years of experience in data engineering, with 2–3 years in a Technical Lead / Manager role.
  • Strong hands-on experience with Databricks (PySpark, Spark SQL, Delta Lake), Informatica BDM, Informatica IICS/IDMC, Apache Airflow, SQL and Python.
  • Proven experience in Informatica BDM–to–Databricks migration initiatives.
  • Proven experience migrating job scheduling/orchestration from TIDAL to Airflow.
  • Experience with CI/CD pipelines, Git-based version control and environment promotion strategies.
  • Strong understanding of data pipeline performance tuning, error handling and operational support.
  • Experience managing Agile delivery, sprint backlogs and delivery tracking using JIRA.
  • Experience working on large-scale data modernization or cloud migration programs.
  • Familiarity with pharma data standards, KPIs and reporting constructs.
  • Prior experience in customer-facing global delivery engagements.
  • Exposure to multi-vendor or onshore-offshore delivery models.
  • Knowledge of data reconciliation frameworks and audit readiness in regulated environments.

Behavioural Competencies

Teamwork & Leadership

Motivation to Learn and Grow

Ownership

Cultural Fit

Talent Management

Technical Competencies

Problem Solving

Life Sciences Knowledge

Communication

Project Management

Capability Building / Thought Leadership

Informatica

Databricks


Job ID: 138098901