Software Requirements:
- Proficiency in Java (version 1.8 or higher), with a solid understanding of object-oriented programming and design patterns.
- Experience with Big Data technologies including Hadoop, Spark, Hive, HBase, and Kafka (a brief Spark-on-Java sketch follows this list).
- Strong knowledge of SQL and NoSQL databases; experience with Oracle is preferred.
- Familiarity with common data serialization and storage formats such as JSON, Avro, and Parquet.
- Proficiency in Linux shell scripting and a working knowledge of Unix operating systems.
- Experience with version control tools such as Git and project management tools such as JIRA.
- Familiarity with CI/CD tools such as Jenkins or TeamCity, and build tools such as Maven.
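For illustration only (not an additional requirement), the Spark and Parquet items above typically translate into code along the following lines. This is a minimal sketch using the public Spark Java API; the application name, file path, and "eventType" column are hypothetical placeholders.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    public class EventSummaryJob {
        public static void main(String[] args) {
            // Start (or attach to) a Spark session for this batch job.
            SparkSession spark = SparkSession.builder()
                    .appName("event-summary-job")
                    .getOrCreate();

            // Read a Parquet dataset; the path and the "eventType" column are placeholders.
            Dataset<Row> events = spark.read().parquet("hdfs:///data/events");

            // Aggregate record counts per event type and print a sample to the console.
            events.groupBy("eventType")
                  .count()
                  .orderBy("eventType")
                  .show();

            spark.stop();
        }
    }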
Overall Responsibilities:
- Translate application storyboards and use cases into functional applications while ensuring high performance and responsiveness.
- Design, build, and maintain efficient, reusable, and reliable Java code.
- Develop high-performance, low-latency components that run on Spark clusters and support Big Data platforms.
- Identify and resolve bottlenecks and bugs, and propose best practices and coding standards.
- Collaborate with global teams to ensure project alignment and successful execution.
- Test software prototypes and facilitate their handover to operations teams.
- Conduct analysis of large data sets to derive actionable insights and contribute to advanced analytical model building.
- Mentor junior developers and assist in designing solution strategies.
Category-wise Technical Skills:
Core Development Skills:
- Strong Core Java and multithreading experience.
- Knowledge of concurrency patterns and scalable application design principles (see the concurrency sketch after this list).
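As a minimal sketch of the concurrency patterns referenced above (class, method, and data names here are hypothetical), a bounded thread pool combined with CompletableFuture is one common way to fan work out across records and join the results; it compiles on Java 8.

    import java.util.Arrays;
    import java.util.List;
    import java.util.concurrent.CompletableFuture;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.stream.Collectors;

    public class ParallelEnrichment {
        public static void main(String[] args) {
            // A fixed-size pool bounds concurrency; the size of 4 is arbitrary for illustration.
            ExecutorService pool = Executors.newFixedThreadPool(4);

            List<String> recordIds = Arrays.asList("r1", "r2", "r3");

            // Fan out one asynchronous task per record, then wait for all of them to complete.
            List<CompletableFuture<String>> futures = recordIds.stream()
                    .map(id -> CompletableFuture.supplyAsync(() -> enrich(id), pool))
                    .collect(Collectors.toList());

            CompletableFuture.allOf(futures.toArray(new CompletableFuture[0])).join();
            futures.forEach(f -> System.out.println(f.join()));

            pool.shutdown();
        }

        // Placeholder for a real enrichment step (e.g., a lookup against a reference store).
        private static String enrich(String id) {
            return id + ":enriched";
        }
    }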
Big Data Technologies:
- Proficiency in Hadoop ecosystem components (HDFS, Hive, HBase, Apache Spark).
- Experience building self-service, platform-agnostic data access APIs (illustrated in the interface sketch below).
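The interface below is a hypothetical sketch of what a self-service, platform-agnostic data access API might expose; the names and methods are illustrative assumptions, not an existing internal API.

    import java.util.List;
    import java.util.Map;

    // Callers depend on this abstraction rather than on HDFS, Hive, or HBase directly,
    // so the underlying storage engine can change without breaking consumers.
    public interface DataAccessClient {

        // Fetch a single record by key from a named dataset (for example, an HBase table).
        Map<String, String> getRecord(String dataset, String key);

        // Run a bounded query and return rows as simple column-name/value maps.
        List<Map<String, String>> query(String dataset, String filterExpression, int limit);
    }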
Analytical Skills:
- Demonstrated ability to analyze large data sets and derive insights.
- Strong systems analysis, design, and architecture fundamentals.
Testing and CI/CD:
- Experience with unit testing and other SDLC activities (a minimal unit-test sketch follows this list).
- Familiarity with Agile/Scrum methodologies for project management.
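For illustration, a unit test in this setting usually looks like the JUnit 5 sketch below; the parsing logic under test is a hypothetical stand-in kept inside the test class so the example is self-contained.

    import static org.junit.jupiter.api.Assertions.assertEquals;

    import java.util.Arrays;
    import java.util.List;

    import org.junit.jupiter.api.Test;

    class RecordParserTest {

        // Minimal parsing logic under test; in practice this would live in production code.
        static List<String> parse(String line, String delimiter) {
            return Arrays.asList(line.split(delimiter));
        }

        @Test
        void splitsDelimitedRecordIntoFields() {
            assertEquals(Arrays.asList("a", "b", "c"), parse("a,b,c", ","));
        }

        @Test
        void singleFieldRecordYieldsOneElement() {
            assertEquals(1, parse("only", ",").size());
        }
    }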
Experience:
- 5 to 9 years of experience in software development, with a strong emphasis on Java and Big Data technologies.
- Proven experience in performance tuning and troubleshooting applications in a Big Data environment.
- Experience working in a collaborative global team setting.
Day-to-Day Activity:
- Collaborate with cross-functional teams to understand and translate functional requirements into technical designs.
- Write and maintain high-quality, performant code while enforcing coding standards.
- Conduct regular code reviews and provide constructive feedback to team members.
- Monitor application performance and address issues proactively.
- Engage in daily stand-ups and sprint planning sessions to ensure alignment with team goals.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Relevant certifications in Big Data technologies or Java development are a plus.
Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration skills, with the ability to work effectively in a team.
- Ability to mentor others and share knowledge within the team.
- Strong organizational skills and attention to detail, with the ability to manage multiple priorities.