We are seeking a Big Data Developer (internally known as Analyst - Big Data, Analytics and ML) with experience in Python, Java, cloud-based data engineering, and modern Big Data ecosystems. The ideal candidate will have a background in building batch and real-time integration frameworks, optimizing large-scale distributed systems, and working with cloud-native services such as Cloud Run Jobs, Cloud Composer (Apache Airflow), and BigQuery.
Job Requirements:
Big Data Skills:
- 1+ years of hands-on experience implementing batch and real-time Big Data integration frameworks and/or applications in private or public cloud environments, preferably GCP, using technologies such as Python, Cloud Run Jobs, and Cloud Composer (Apache Airflow), including debugging, performance tuning, and bottleneck analysis
- 1+ years of experience working in a Linux environment, with the ability to interface with the OS using system tools, scripting languages, and integration frameworks
- 1+ years of hands-on experience in one or more modern object-oriented programming languages (Java, Scala, Python), with the ability to code in more than one language
- Hands-on experience applying schema design principles, best practices, and trade-offs across Big Data technologies, including designing data lakehouse architectures and five-layer data architectures
Preferred Qualifications:
- Experience working in agile, distributed teams
- Familiarity with cloud-native CI/CD and DevOps practices
- Strong problem-solving, analytical thinking, and communication skills
Education:
- Graduate Engineer or equivalent qualification with a minimum of 1-3 years of successful experience at a recognized, global IT services / consulting company