
Senior Tech Lead - Azure Databricks / Microsoft Fabric

10-18 Years
0.05 - 1.05 LPA

Job Description

Job Title: Senior Tech Lead - Azure Databricks or Microsoft Fabric

Total Exp: 10-18 yrs

Role Overview:
The Senior Tech Lead - Databricks leads the design, development, and implementation of advanced data solutions. The jobholder has extensive experience in Databricks, cloud platforms, and data engineering, with a proven ability to lead teams and deliver complex projects.

Responsibilities:
• Lead the design and implementation of Databricks-based data solutions.
• Architect and optimize data pipelines for batch and streaming data (a minimal batch sketch follows this list).
• Provide technical leadership and mentorship to a team of data engineers.
• Collaborate with stakeholders to define project requirements and deliverables.
• Ensure best practices in data security, governance, and compliance.
• Troubleshoot and resolve complex technical issues in Databricks environments.
• Stay updated on the latest Databricks features and industry trends.
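
For illustration only, here is a minimal PySpark sketch of the kind of batch ingestion this role covers on Azure Databricks. The landing path, table name, and file format are assumptions for the example, not details from this posting.

```python
# Minimal batch-ingestion sketch for a Databricks notebook (illustrative only).
# The landing path and target table name below are assumptions, not from the posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()   # already provided in Databricks notebooks

raw_path = "/mnt/raw/orders/"    # assumed landing zone in Azure Data Lake Storage
bronze_table = "bronze.orders"   # assumed Delta table in the medallion bronze layer

# Read the raw CSV files as-is and stamp each row with a load timestamp for auditing.
df = (
    spark.read
    .option("header", "true")
    .csv(raw_path)
    .withColumn("_ingested_at", F.current_timestamp())
)

# Persist to Delta; a full overwrite keeps this example minimal and idempotent.
df.write.format("delta").mode("overwrite").saveAsTable(bronze_table)
```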

Key Technical Skills & Responsibilities:
• 10+ years of experience in data engineering using Databricks or Apache Spark-based platforms.
• Proven track record of building and optimizing ETL/ELT pipelines for batch and streaming data ingestion.
• Hands-on experience with Azure services such as Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Synapse Analytics, or Azure SQL Data Warehouse.
• Proficiency in programming languages such as Python, Scala, and SQL for data processing and transformation.
• Expertise in Spark (PySpark, Spark SQL, or Scala) and Databricks notebooks for large-scale data processing.
• Familiarity with Delta Lake, Delta Live Tables, and the medallion architecture for data lakehouse implementations.
• Experience with orchestration tools such as Azure Data Factory or Databricks Jobs for scheduling and automation.
• Design and implement Azure Key Vault integration and scoped credentials.
• Knowledge of Git for source control and CI/CD integration for Databricks workflows, along with cost optimization and performance tuning.
• Familiarity with Unity Catalog, RBAC, or enterprise-level Databricks setups.
• Ability to create reusable components, templates, and documentation to standardize data engineering workflows is a plus.
• Ability to define best practices, support multiple projects, and, at times, mentor junior engineers is a plus.
• Must have experience working with streaming data sources; Kafka preferred (see the streaming sketch after this list).
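
As a companion illustration for the streaming and Kafka points above, the sketch below shows one plausible Structured Streaming ingestion into a bronze Delta table on Azure Databricks, with the Kafka credential read from an Azure Key Vault-backed secret scope via dbutils.secrets. The broker address, topic, secret scope and key names, checkpoint path, and table name are all assumed for illustration.

```python
# Hypothetical streaming-ingestion sketch: Kafka -> bronze Delta table on Azure Databricks.
# Broker, topic, secret scope/key, checkpoint path, and table name are assumptions.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()   # provided in Databricks notebooks

# dbutils is available in Databricks notebooks; the scope is assumed to be Key Vault-backed.
kafka_password = dbutils.secrets.get(scope="kv-backed-scope", key="kafka-password")

raw_stream = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1.example.com:9093")   # assumed broker
    .option("subscribe", "orders-events")                            # assumed topic
    .option("kafka.security.protocol", "SASL_SSL")
    .option(
        "kafka.sasl.jaas.config",
        "kafkashaded.org.apache.kafka.common.security.plain.PlainLoginModule "
        f'required username="svc-databricks" password="{kafka_password}";',
    )
    .load()
)

# Keep the payload raw in the bronze layer; the checkpoint makes the stream restartable.
(
    raw_stream
    .selectExpr("CAST(key AS STRING) AS key", "CAST(value AS STRING) AS value", "timestamp")
    .writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/bronze/_checkpoints/orders_events")   # assumed path
    .toTable("bronze.orders_events")                                          # assumed table
)
```

Downstream silver and gold tables would typically be derived from this bronze table, for example with Delta Live Tables or scheduled Databricks Jobs, in line with the medallion approach mentioned above.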

Eligibility Criteria:
• Bachelor's degree in Computer Science, Data Engineering, or a related field.
• Extensive experience with Databricks, Delta Lake, PySpark, and SQL.
• Databricks certification (e.g., Certified Data Engineer Professional).
• Experience with machine learning and AI integration in Databricks.
• Strong understanding of cloud platforms (AWS, Azure, or GCP).
• Proven leadership experience in managing technical teams.
• Excellent problem-solving and communication skills.

Location: Mumbai, Pune, Bangalore
Primary Skills: Azure Databricks, Microsoft Fabric, Python/PySpark
Secondary Skills: CoE/practice experience, RFP, presales
Notice Period: Immediate to 60 days



About Company

Antal International is a global executive search organisation with over 130 offices in more than 30 countries. We have a network of over 800 people operating under the Antal brand, successfully placing talent for professional positions in over 75 countries around the world. We believe our value and uniqueness lie in our skill base and industry

Job ID: 135938411