
Incedo Inc.

SDET - Kafka

  • Posted 5 hours ago
  • Be among the first 10 applicants

Job Description

Kafka Testing | ETL PySpark

Key Responsibilities

Kafka Testing

• Design and execute automated test scripts for Kafka producers, consumers, and topics.
• Validate message delivery, ordering, and data integrity in streaming pipelines.
• Test consumer group management, offset tracking, and error-handling scenarios.
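As an illustration of the ordering and integrity checks described above, here is a minimal self-contained Python sketch. The record shape (`partition`, `offset`, `payload`, `checksum`) is a hypothetical stand-in for messages polled from a real consumer (e.g. via a client library such as kafka-python); a production test would assert the same invariants against live broker output.

```python
import hashlib
from collections import defaultdict

def validate_stream(records):
    """Check per-partition offset ordering and payload integrity.

    `records`: list of dicts with keys partition, offset,
    payload (bytes), and checksum (hex SHA-256 of payload) --
    a simplified stand-in for consumed Kafka messages.
    Returns a list of human-readable violation strings.
    """
    last_offset = defaultdict(lambda: -1)
    errors = []
    for r in records:
        # Ordering: offsets within one partition must strictly increase.
        if r["offset"] <= last_offset[r["partition"]]:
            errors.append(
                f"out-of-order offset {r['offset']} in partition {r['partition']}"
            )
        last_offset[r["partition"]] = r["offset"]
        # Integrity: recompute the checksum and compare to the one shipped
        # alongside the payload.
        if hashlib.sha256(r["payload"]).hexdigest() != r["checksum"]:
            errors.append(f"corrupt payload at offset {r['offset']}")
    return errors
```

An empty return value means the stream passed both checks; each violation string doubles as a defect reproduction note.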

ETL & PySpark Validation

• Test PySpark ETL jobs integrated with Kafka topics.
• Validate data transformations, schema compliance, and SLA adherence.
• Perform end-to-end data validation across ingestion → transformation → consumption workflows.
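End-to-end validation of this kind typically reconciles what entered the pipeline against what came out. PySpark itself needs a running Spark session, so the sketch below illustrates the logic with plain Python dicts; in a real suite the same checks would run against Spark DataFrames (e.g. `df.count()` and `df.exceptAll(other)`). Record fields here are hypothetical.

```python
def reconcile(source_rows, target_rows, key="id"):
    """Minimal ingestion-vs-consumption reconciliation sketch.

    Compares row counts and key coverage between the rows fed into
    an ETL job and the rows it produced. Both inputs are lists of
    dicts keyed by `key`.
    """
    src = {r[key]: r for r in source_rows}
    tgt = {r[key]: r for r in target_rows}
    return {
        # Quick headline check: did we lose or invent rows?
        "count_match": len(src) == len(tgt),
        # Keys present at ingestion but absent after transformation.
        "missing_in_target": sorted(src.keys() - tgt.keys()),
        # Keys that appeared out of nowhere downstream.
        "unexpected_in_target": sorted(tgt.keys() - src.keys()),
    }
```

Per-column value comparisons (transformations, type coercions) would layer on top of this key-level diff.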

Test Automation

• Build robust automation frameworks using Python, Java, or relevant tools.
• Integrate tests into CI/CD pipelines (Jenkins, GitLab, etc.).
• Develop reusable test utilities for Kafka payload validation, schema checking, and data quality.
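A reusable payload-validation utility of the sort mentioned above can be sketched in a few lines of plain Python. The schema here (field names and types) is purely hypothetical; teams using Avro or a schema registry would delegate this to those tools instead.

```python
import json

# Hypothetical expected schema for a JSON Kafka payload:
# required field name -> required Python type after parsing.
REQUIRED = {"event_id": str, "ts": int, "amount": float}

def check_payload(raw: bytes, schema=REQUIRED):
    """Return a list of schema violations for one raw Kafka payload.

    An empty list means the payload parsed as JSON and every required
    field was present with the expected type.
    """
    try:
        msg = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    problems = []
    for field, typ in schema.items():
        if field not in msg:
            problems.append(f"missing field: {field}")
        elif not isinstance(msg[field], typ):
            problems.append(
                f"{field}: expected {typ.__name__}, got {type(msg[field]).__name__}"
            )
    return problems
```

Because it returns plain strings, the same utility drops cleanly into pytest assertions or a CI report step.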

Quality Assurance

• Conduct smoke, regression, integration, and performance testing.
• Identify, document, and track defects with clear reproduction steps.
• Collaborate with Data Engineers and Platform teams to define acceptance criteria.

Shift

1:00 PM - 10:00 PM IST

Work Mode

4 days WFO (Work From Office) + 1 day WFH (Work From Home)

Location

Hyderabad

Work Days

Mon-Fri (flexibility on WFH day)

What We Offer

• Competitive salary package (commensurate with experience)
• Flexible work arrangement (4 days onsite, 1 day remote)
• Health insurance & wellness benefits
• Professional development opportunities
• Exposure to cutting-edge data streaming architecture
• Collaborative, agile team environment
• Career growth in a high-demand data engineering ecosystem

Job Type: Full-Time | Experience: 5-8 Years | Shift: 1 PM - 10 PM IST

Eligible: Kafka + PySpark + QA Automation Professionals


Job ID: 147475979