About Birlasoft:
Birlasoft, a global leader at the forefront of Cloud, AI, and Digital technologies, seamlessly blends domain expertise with enterprise solutions. The company's consultative, design-thinking approach empowers societies worldwide and enhances the efficiency and productivity of businesses. As part of the multibillion-dollar diversified CK Birla Group, Birlasoft, with its 12,000+ professionals, is committed to continuing the Group's 170-year heritage of building sustainable communities.
Job Title – Lead Data Engineer / ETL Lead
Experience – 6-10 Years
Location – Any Birlasoft (BSL) location
Education – B.E./B.Tech
Job Summary
We are seeking a highly experienced Lead Data Engineer to design and build large-scale data integration, migration, and analytics platforms, and to lead the teams that deliver them. The role requires deep expertise in enterprise ETL tools, modern cloud-native data architectures, and real-time data processing, along with strong leadership and delivery management skills.
Key Responsibilities
- Lead the design, development, and optimization of end-to-end data pipelines across batch and real-time processing
- Architect and implement enterprise-grade ETL solutions using Informatica, Ab Initio, and cloud-native services
- Drive large-scale data migration and conversion initiatives, including mock runs, reconciliation, validation, and production cutovers
- Design and manage cloud-based data platforms, leveraging Snowflake and AWS analytics services
- Build and optimize PySpark-based data processing frameworks for high-volume datasets
- Implement real-time data ingestion and transformation pipelines using Kafka and streaming technologies
- Own performance tuning, scalability, cost optimization, and SLA adherence for data workloads
- Collaborate closely with business, functional, and architecture teams to translate requirements into robust technical solutions
- Lead and mentor development teams, conducting code reviews and enforcing engineering best practices
- Oversee CI/CD, scheduling, monitoring, and operational stability of data pipelines
- Support Agile delivery by participating in planning, backlog grooming, execution, and retrospectives
Required Skills & Qualifications
Core Data Engineering
- Strong hands-on experience with Ab Initio and Informatica PowerCenter / IICS
- Advanced expertise in data warehousing concepts, dimensional modeling, and ETL design patterns
- Proven experience with Snowflake data platform capabilities
Cloud & Big Data
- Extensive hands-on experience with AWS data services (Glue, Redshift, Athena, EMR, S3) and with CI/CD pipelines for data workloads
- Proficiency in PySpark, Spark SQL, Hive, and the Hadoop ecosystem
- Experience with Kafka-based streaming architectures and KSQL
Programming & Databases
- Advanced SQL and PL/SQL
- Strong Python and UNIX shell scripting skills for automation
- Experience optimizing large-scale database and ETL workloads
Delivery & Leadership
- Experience leading multi-team, offshore-onsite delivery models
- Strong Agile/Scrum execution experience
- Excellent communication, stakeholder management, and problem-solving skills
Preferred Certifications
- Cloud and database-related professional certifications
- ETL or data platform certifications