Job Title
Data Lead – Azure Data Factory (ADF), Databricks
Experience
7 to 9 years
Location
Pune, India
Client Industry
Information Technology / Multinational Technology Solutions
Overview of the Role
As a Data Lead specializing in Azure Data Factory (ADF), Databricks, and Snowflake, you will play a pivotal role in the design, implementation, and optimization of enterprise data solutions. You will lead technical initiatives to ensure seamless data integration, high-quality data pipelines, and advanced analytics capabilities, directly supporting strategic objectives and digital transformation for HCLTech's clients. This is a highly visible role, offering the opportunity to shape the data landscape within a fast-paced, innovative environment.
Detailed Responsibilities
- Lead the end-to-end design, development, and deployment of data solutions utilizing Snowflake, Azure Data Factory (ADF), and Databricks.
- Collaborate with cross-functional teams to gather business and technical requirements, architect scalable data pipelines, and ensure data quality and integrity.
- Provide hands-on technical leadership, guidance, and mentorship to team members, promoting best practices in data engineering and platform utilization.
- Troubleshoot and resolve complex issues related to data pipelines, transformation logic, and data storage across Snowflake, ADF, and Databricks environments.
- Stay abreast of industry trends, emerging technologies, and best practices; proactively recommend and implement enhancements to existing data processes and architectures.
- Ensure adherence to project timelines, deliverables, and quality standards, while managing stakeholder expectations effectively.
- Drive process improvements and automation initiatives to increase operational efficiency and reliability.
- Maintain comprehensive documentation and participate in code reviews to ensure maintainability and knowledge sharing.
Skill Requirements
Technical Skills & Experience
- Proven expertise in Snowflake architecture, implementation, and performance optimization.
- Hands-on experience with Azure Data Factory (ADF) for data pipeline development, integration, and monitoring.
- Strong proficiency with Azure Databricks for big data processing, data engineering workflows, and machine learning solutions.
- In-depth understanding of data warehousing concepts, ETL processes, and data modeling techniques.
- Strong analytical and problem-solving abilities, with a track record of delivering solutions in dynamic, fast-paced environments.
- Excellent communication and leadership skills, with experience collaborating across technical and non-technical teams.
Qualifications
- Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
- Relevant certifications in Azure, Snowflake, or Databricks (preferred).
Other Requirements
- Ability to work from the office 5 days a week for the initial 6 months (Office Location: WTC, Kharadi, Pune).
- Experience working in a multinational or large-scale enterprise environment is a plus.
- Exposure to Agile methodologies and DevOps practices is advantageous.