We are looking for a highly skilled Technology Engineer with strong expertise in streaming data platforms and cloud-native technologies. The ideal candidate will have hands-on experience with Kafka, CI/CD pipelines, and cloud environments, and will play a critical role in building scalable, event-driven data systems.
Key Responsibilities
- Design, implement, and manage high-throughput Kafka clusters in production environments
- Build and maintain event-driven and streaming data architectures (see the sketch after this list)
- Develop and manage CI/CD pipelines with tools such as Jenkins and ArgoCD, following GitOps practices
- Work on cloud-native applications across AWS, Azure, or OCI
- Automate infrastructure and deployment processes using modern cloud automation tools
- Collaborate with cross-functional teams to deliver scalable and resilient data solutions
- Ensure best practices in code quality, testing, and deployment automation
- Implement design patterns for scalable and maintainable systems
- Perform unit testing and integration testing for robust application delivery
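For illustration only (not part of the role description): the core of most event-driven work here is a consume-process loop. The minimal sketch below assumes Python with the confluent-kafka client, a broker at localhost:9092, and a hypothetical `orders` topic; none of these specifics come from this posting.

```python
from confluent_kafka import Consumer, KafkaError

# Hypothetical settings: broker address, group id, and topic name
# are placeholders, not details from this job description.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "orders-processor",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])

try:
    while True:
        msg = consumer.poll(timeout=1.0)  # wait up to 1s for the next event
        if msg is None:
            continue
        if msg.error():
            # End-of-partition is informational; anything else is a real error.
            if msg.error().code() != KafkaError._PARTITION_EOF:
                raise RuntimeError(msg.error())
            continue
        # Process one event; a real handler would deserialize and route it.
        print(f"{msg.topic()}[{msg.partition()}]@{msg.offset()}: {msg.value()!r}")
finally:
    consumer.close()  # commit final offsets and leave the group cleanly
```

A production version of this loop would typically add schema-aware deserialization, dead-letter handling for bad events, and explicit offset commits.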
Required Skills & Experience
- Strong experience in Apache Kafka setup, configuration, and performance tuning in high-volume environments
- Proficiency in Git (source control) and CI/CD tools such as Jenkins and ArgoCD
- Hands-on experience with public cloud platforms (AWS / Azure / OCI)
- Experience working with large-scale streaming datasets
- Good understanding of event-driven architecture and microservices
- Exposure to GitOps practices and DevOps culture
- Experience with cloud automation tools
- Strong knowledge of software design patterns and system design principles
- Experience in automated testing (unit & integration; see the sketch after this list)
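To make the testing expectation concrete, here is a minimal pytest-style sketch. The `parse_event` helper, its fields, and its validation rule are hypothetical stand-ins, not requirements from this posting.

```python
import json

import pytest


def parse_event(raw: bytes) -> dict:
    """Hypothetical helper: decode one streamed message into an event dict."""
    event = json.loads(raw)
    if "order_id" not in event:
        raise ValueError("event is missing order_id")
    return event


def test_parse_event_returns_decoded_payload():
    # Unit test: a well-formed payload round-trips into a dict.
    raw = json.dumps({"order_id": 42, "amount": 9.99}).encode("utf-8")
    assert parse_event(raw)["order_id"] == 42


def test_parse_event_rejects_payload_without_order_id():
    # Unit test: malformed payloads fail loudly rather than silently.
    with pytest.raises(ValueError):
        parse_event(json.dumps({"amount": 9.99}).encode("utf-8"))
```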
Preferred Qualifications
- Experience with OpenShift / Kubernetes platforms
- Familiarity with real-time streaming frameworks and data pipelines
- Prior experience in the banking or financial services domain