We are a technology company in the Data Engineering & Big Data Analytics sector, delivering scalable data platforms, real-time streaming pipelines, and enterprise-grade analytics solutions for clients across finance, retail, and telecom. We build production data products that enable data-driven decisions and automate large-scale ETL/ELT workflows.
Primary Title: Big Data Engineer
Location: Pune, India
Role & Responsibilities
- Design, develop, and optimize Big Data pipelines.
- Work on large-scale distributed data processing systems.
- Build and manage workflows using Airflow.
- Develop scalable data solutions on GCP.
- Write efficient SQL queries and data transformation scripts.
- Collaborate with cross-functional teams for data integration and migration projects.
- Troubleshoot and optimize performance of data pipelines.
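As a flavor of the SQL and data-transformation work described above, here is a minimal sketch using Python's standard-library sqlite3 module; the table and column names are invented purely for illustration, and production pipelines would run against warehouse engines such as Hive or BigQuery instead.

```python
import sqlite3

# Hypothetical example: aggregate raw order rows per region,
# a typical transformation step in an ETL job.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("APAC", 120.0), ("APAC", 80.0), ("EMEA", 200.0)],
)

# Roll up totals per region, ordered for deterministic output.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('APAC', 200.0), ('EMEA', 200.0)]
```

The same GROUP BY pattern scales from a local prototype like this to distributed engines, which is why strong SQL fundamentals are listed as a must-have below.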
Skills & Qualifications
Must-Have
- Strong experience in Hive, HDFS, HBase
- Good working knowledge of Big Data platforms
- Hands-on experience with Scala
- Strong proficiency in Python
- Experience working on GCP (Google Cloud Platform)
- Strong command over SQL
- Hands-on experience with Apache Airflow
- Minimum 4 years of relevant experience
Preferred
- Experience working in Unix/Linux environments
- Shell scripting
- Experience with any ETL tool
Benefits & Culture Highlights
- Hands-on ownership of end-to-end data systems and visibility into business impact.
- Support for technical growth: training and certification sponsorships.
- Collaborative, engineering-first culture with opportunities to scale solutions across enterprise customers.
To apply: bring strong Big Data domain expertise, a bias for measurable outcomes, and readiness to contribute on-site in India to accelerate data platform delivery.
Skills: Python, Airflow, Hadoop, Big Data