Role - Associate Director - Delivery
Lead a team supporting global pharma R&D, Commercial, and Manufacturing use cases. Own data-strategy execution, modernize data-platform development, and drive delivery excellence for a Global Capability Center (GCC). Deliver scalable data products, governance, and analytics solutions.
Location - Hyderabad
Leadership & Strategy
- Own Data Engineering roadmap for enterprise data transformation; align with global data & digital strategy.
- Lead multi-squad engineering teams (Data Engineering, MLOps, Reporting).
- Establish standards for ingestion, transformation, quality, cataloging, and lineage.
- Partner with global product owners, business stakeholders, and platform leads across R&D, HEOR, Commercial, Supply Chain.
Technical Delivery
- Architect and scale cloud-native data platforms.
- Oversee ingestion pipelines across structured (clinical, commercial, manufacturing) and unstructured data (medical notes, literature, NLP outputs).
- Ensure high-performance ELT/ETL, orchestration, CI/CD, and MLOps practices.
- Manage lakehouse / warehouse platforms (Databricks, Snowflake, Dataiku).
- Implement robust data-quality KPIs and reconciliation frameworks.
Domain Expectations
- Deep understanding of pharma data:
  - Commercial data (IMS/IQVIA, specialty pharmacy feeds)
  - Clinical trial data (CDISC, SDTM, ADaM)
  - RWE/claims/EHR and patient support program data
  - Manufacturing & QC datasets
- Experience delivering data foundations for regulatory submissions, pharmacovigilance, digital biomarkers, omnichannel analytics.
GCC Focus
- Run onsite-offshore governance, delivery dashboards, and stakeholder reporting.
- Drive talent development, vendor management, and operating-model improvements.
- Set up reusable accelerators, playbooks, and delivery frameworks tailored for GCC scale.
- Engage with GCC and global stakeholders to deliver customer delight.
Required Experience
- 12-15 years in data analytics / data engineering, with 5+ years in pharma/biotech.
- 3-5 years in a leadership role managing 15-20 engineers.
- Proven track record delivering multi-million-dollar programs in global matrixed environments or GCCs.
- Certification or hands-on experience in Snowflake, Azure/AWS, and modern orchestration (Airflow, ADF).
Technical Skillset
- Core: Python, Spark, SQL (advanced), Snowflake, Airflow / ADF / Dagster
- Tools: Soda, Collibra, Informatica DQ, Jenkins, Alteryx
Bonus
- Generative AI pipelines, vector databases, feature stores
- Experience enabling ML models for clinical analytics or commercial forecasting
- Understanding of FAIR data principles, data mesh, and metadata-first architectures
Education
- Bachelor's/Master's degree in Engineering, Computer Science, Data Science, or equivalent.
- Certifications in cloud/data platforms preferred.