
About Company :
Our client is a trusted global innovator of IT and business services. They help clients transform through consulting, industry solutions, business process services, digital and IT modernization, and managed services, enabling them, as well as society, to move confidently into the digital future. They are committed to their clients' long-term success and combine global reach with local client attention to serve them in over 50 countries around the globe.
· Job Title: ETL Test Engineer / Data Quality Engineer
· Location: Gurgaon / Bangalore
· Experience: 6+ years
· Job Type: Contract-to-hire
· Notice Period: Immediate joiners
Job Overview
We are looking for an experienced ETL Test Engineer / Data Quality Engineer to ensure the quality, accuracy, and reliability of large-scale data pipelines.
The ideal candidate will have strong expertise in ETL testing, data validation, and automation, along with hands-on experience in the Google Cloud Platform (GCP) ecosystem.
Primary Skills (Must Have)
5–10 years of experience in ETL Testing / Data Quality Engineering / Data Testing
Strong hands-on experience with GCP:
BigQuery
DataProc (Spark/PySpark)
Cloud Composer (Airflow)
Proficiency in Python and PySpark
Advanced SQL skills for large-scale data validation (BigQuery preferred)
Hands-on experience in ETL pipeline testing and validation
Strong understanding of Software Testing Life Cycle (STLC)
Experience in building or working with test automation frameworks
Experience in test planning and strategy design
Secondary Skills (Good to Have)
Knowledge of data warehousing concepts and data modeling
Exposure to CI/CD pipelines in data engineering/testing
Knowledge of GCP architecture
Key Responsibilities:
Design, develop, and execute test strategies for ETL/data pipelines
Perform end-to-end data validation across multiple systems
Validate large datasets using SQL (BigQuery) and automation frameworks
Develop and maintain test automation frameworks using Python and PySpark
Analyze and validate ETL pipelines built on GCP (BigQuery, DataProc, Cloud Composer/Airflow)
Collaborate with data engineers and stakeholders to ensure data accuracy
Identify, log, and track defects using tools like Jira
Participate in Agile ceremonies (sprint planning, stand-ups, retrospectives)
Contribute to test planning, strategy design, and execution
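For illustration only (this sketch is not part of the client's requirements), the kinds of checks listed above — row-count reconciliation, null checks, and duplicate-key detection — can be expressed in plain Python. In practice the rows would come from BigQuery query results or PySpark DataFrames; the datasets and column names here are hypothetical.

```python
# Minimal data-quality checks of the kind automated in ETL testing.
# Datasets are represented as lists of dicts for illustration.

def row_count_match(source, target):
    """Reconcile row counts between source and target datasets."""
    return len(source) == len(target)

def null_check(rows, column):
    """Return rows where a required column is missing or null."""
    return [r for r in rows if r.get(column) is None]

def duplicate_keys(rows, key):
    """Return key values that appear more than once (primary-key check)."""
    seen, dupes = set(), set()
    for r in rows:
        k = r[key]
        if k in seen:
            dupes.add(k)
        seen.add(k)
    return dupes

if __name__ == "__main__":
    # Hypothetical source and target extracts from an ETL pipeline.
    source = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}]
    target = [{"id": 1, "amt": 10}, {"id": 2, "amt": None}, {"id": 2, "amt": 20}]

    print(row_count_match(source, target))  # False: target has an extra row
    print(null_check(target, "amt"))        # row(s) with a null amount
    print(duplicate_keys(target, "id"))     # {2}
```

In a real framework these checks would run as automated test cases (e.g., via pytest) against staged and production tables, with failures logged as defects in Jira.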
Job ID: 147139479