About the Company
We are hiring Sr. Data Engineers with a Scala & Java combination for our Bangalore location. If you are interested, please share your resume at [Confidential Information].
Responsibilities
- Experience with Scala, Java, Airflow, Spark on Kubernetes, YARN, Oozie, Hadoop, Kafka, and Spark Structured Streaming.
- Advanced Scala experience (e.g. functional programming, case classes, complex data structures & algorithms).
- Proficient in developing automated frameworks for unit & integration testing.
- Experience with Docker, Helm, and related container technologies.
- Design & implement robust, scalable, batch & real-time data engineering solutions using Apache Spark (Scala) & Spark Structured Streaming.
- Architect well-structured Scala projects with reusable, modular, and testable codebases aligned with SOLID principles and clean architecture practices.
- Develop, deploy & manage Spark jobs on Kubernetes clusters, ensuring efficient resource utilization, fault tolerance, and scalability.
- Orchestrate data workflows using Apache Airflow: manage DAGs, task dependencies, retries, and SLA alerts.
- Write and maintain comprehensive unit and integration tests for the pipelines and utilities developed.
- Work on performance tuning, partitioning strategies, and data quality validation.
- Use and enforce version control best practices (branching, PRs, code review) and CI/CD pipelines for automated testing and deployment.
- Write clear, maintainable documentation (README, inline docs, docstrings).
- Participate in design reviews and provide technical guidance to peers and junior engineers.
Required Skills
Scala, Java, Airflow, Spark, Kubernetes, Docker, Helm, Hadoop, Kafka, Spark Structured Streaming.