Job Title: Databricks - Associate Director
Level: Associate Director
Location: Bangalore
Experience: 11+ years
Role Summary
We are looking for a highly skilled Associate Director - Databricks with 11+ years of strong experience in Data Engineering, Cloud Data Platforms, and Enterprise-scale Data Solutions. The ideal candidate will be an expert in Databricks, hands-on with modern data architectures, and experienced in leading teams, managing stakeholders, and delivering end-to-end data programs.
Key Responsibilities
- Lead the design, development, and optimization of data pipelines using Databricks, Spark, Python, and SQL.
- Architect and implement Lakehouse & Medallion architecture, Delta Lake frameworks, and scalable ETL/ELT workflows.
- Manage Delta tables, time-travel operations, Unity Catalog governance, and high-volume data processing.
- Drive data modelling, data warehousing best practices, Slowly Changing Dimension (SCD) implementation, and performance tuning.
- Lead cross-functional data engineering teams, provide technical guidance, and oversee execution of large-scale client engagements.
- Own effort estimations, sprint planning, project governance, and delivery quality.
- Support solution architecture, proposal submissions, client presentations, and presales activities.
- Partner with senior stakeholders to define data strategies, roadmap, and future-state architecture.
Required Skills & Expertise
- 11+ years of total experience in Data Engineering.
- Strong programming skills in Python, SQL, and Apache Spark.
- Deep hands-on experience with Databricks (pipelines, notebooks, jobs, workflows, Lakehouse, Delta Lake, Unity Catalog).
- Solid understanding of data modelling, data warehousing, and SCD concepts.
- Proven experience in leading large teams/pods, mentoring engineers, and managing enterprise data programs.
- Excellent communication, client interaction, and stakeholder management skills.
- Experience in proposal creation, solutioning, and presales is a strong advantage.
Preferred Qualifications
- Databricks certifications (Data Engineer Associate/Professional, Lakehouse Architect, etc.)
- Experience working with Azure/AWS/GCP cloud environments
- Familiarity with CI/CD for data engineering, orchestration tools, and cloud governance