Job Summary
We are hiring a Big Data Architect with 10+ years of experience for one of our GCC clients. The candidate should have deep knowledge of Hadoop, Spark, NoSQL databases, cloud platforms (AWS/Azure/GCP), ETL, and data modeling, along with strong leadership skills to mentor teams and collaborate with business stakeholders.
Job Roles & Responsibilities
- Design scalable batch & streaming data architectures
- Develop efficient data models and storage strategies
- Ensure data security, integrity, and compliance
- Integrate data from diverse sources, including legacy systems
- Optimize performance of data infrastructure
- Collaborate with cross-functional teams to deliver data solutions
- Evaluate and adopt emerging technologies
- Mentor junior team members
- Solve complex data engineering challenges
- Contribute to sales and client engagement processes
Required Skills & Qualifications
- 10+ years in data engineering, 2+ years as Big Data Architect
- Proficient in Hadoop, Spark, NoSQL, cloud data services (AWS/Azure/GCP)
- Strong in Python, Java, Scala
- Expertise in SQL/NoSQL and data modeling
- Hands-on with ETL design and implementation
- Knowledge of data security, governance, and authentication (LDAP, SAML, etc.)
- Experience with DevOps tools and automation
- Excellent problem-solving, communication, and collaboration skills
About Kodiva
Kodiva.ai leverages artificial intelligence to unify skill intelligence, data, and automation. Our mission is to revolutionise hiring by enabling organisations to make smarter recruitment decisions while helping candidates reach better career opportunities. By integrating cutting-edge technology, we create efficient, innovative solutions that benefit both employers and job seekers.