
Design and develop automation scripts for:
- API testing (REST/JSON)
- ETL pipeline validation
- Data transformation checks in Databricks
Build and maintain reusable automation components using Python and PySpark.
Automate regression suites covering DBT model outputs, schema validations, and upstream/downstream data layers.
Implement automated data validation across:
- Databricks Delta Lake
- SQL Server sources
- AWS S3 raw layers
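Cross-layer validation of this kind typically reduces to comparing extracts from two layers. A minimal sketch, assuming rows have already been pulled into Python from the source and target (all names here are illustrative, not part of any specific framework):

```python
import hashlib

def row_fingerprint(row):
    """Stable fingerprint for one record (order-insensitive over columns)."""
    canonical = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(canonical.encode()).hexdigest()

def reconcile(source_rows, target_rows):
    """Compare two extracts: row counts plus per-row content fingerprints."""
    src = {row_fingerprint(r) for r in source_rows}
    tgt = {row_fingerprint(r) for r in target_rows}
    return {
        "count_match": len(source_rows) == len(target_rows),
        "missing_in_target": len(src - tgt),
        "unexpected_in_target": len(tgt - src),
    }

# Hypothetical example: rows from a SQL Server source vs. a Delta Lake target.
source = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 20.0}]
target = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 20.5}]
result = reconcile(source, target)
```

Counts match here, but the fingerprints flag one drifted row in each direction, which is the kind of evidence a defect report needs.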
Data Engineering Test Automation
- Automate validation of DBT transformations (tests, snapshots, seed data checks).
- Build SQL-based and script-based automation for:
  - Data reconciliation
  - Aggregation validation
  - Schema evolution testing
  - Data freshness checks
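The SQL-based checks above follow a common pattern: aggregate totals and EXCEPT-based row diffs between a raw layer and a DBT model output. A minimal sketch using in-memory SQLite as a stand-in for the warehouse (table names are illustrative; the same SQL runs in Databricks SQL or SQL Server with minor dialect changes):

```python
import sqlite3

# In-memory SQLite stands in for the warehouse engine.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (order_id INTEGER, amount REAL);
    CREATE TABLE dbt_orders (order_id INTEGER, amount REAL);
    INSERT INTO raw_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO dbt_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
""")

# Aggregation validation: totals must match between raw layer and model output.
src_total, tgt_total = conn.execute("""
    SELECT (SELECT SUM(amount) FROM raw_orders),
           (SELECT SUM(amount) FROM dbt_orders)
""").fetchone()

# Reconciliation: rows present in one layer but not the other.
drift = conn.execute("""
    SELECT COUNT(*) FROM (
        SELECT order_id, amount FROM raw_orders
        EXCEPT
        SELECT order_id, amount FROM dbt_orders
    )
""").fetchone()[0]
```

A zero `drift` count and equal totals are the pass criteria; either query slots directly into a pytest assertion.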
- Use Databricks APIs or automation tools to validate notebook runs and workflows.
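Validating a workflow run usually means polling the Databricks Jobs API and asserting on the run state. A minimal sketch of the assertion side, assuming the response shape of the Jobs API 2.1 `GET /api/2.1/jobs/runs/get` endpoint (the authenticated HTTP call itself is omitted; a canned payload illustrates the check):

```python
def assert_run_succeeded(run: dict) -> None:
    """Fail if a Databricks job run did not terminate successfully.

    Assumes a Jobs API 2.1 response: a `state` object carrying
    `life_cycle_state` and `result_state`.
    """
    state = run.get("state", {})
    if state.get("life_cycle_state") != "TERMINATED":
        raise AssertionError(f"run still in state {state.get('life_cycle_state')}")
    if state.get("result_state") != "SUCCESS":
        raise AssertionError(f"run finished with {state.get('result_state')}")

# Canned payload standing in for the real API response.
sample = {"state": {"life_cycle_state": "TERMINATED", "result_state": "SUCCESS"}}
assert_run_succeeded(sample)
```

In practice the payload would come from `requests.get(...)` with a bearer token, wrapped in a retry loop until the run leaves `RUNNING`.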
API & Integration Automation
- Develop API automation scripts for:
  - Data ingestion
  - Data consumption
  - Metadata services
- Use tools like Postman/Newman or Python requests for automated web/API testing.
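A typical requests-based API test separates the HTTP call from the response validation so the latter is unit-testable. A minimal sketch of the validation half (the endpoint and field names are hypothetical, not from any specific service):

```python
def validate_ingestion_response(status_code, payload,
                                required_fields=("job_id", "status")):
    """Return a list of validation errors for an ingestion API response.

    With `requests`, the call side would be:
        resp = requests.post(url, json=body)
        errors = validate_ingestion_response(resp.status_code, resp.json())
    """
    errors = []
    if status_code != 200:
        errors.append(f"unexpected status {status_code}")
    for field in required_fields:
        if field not in payload:
            errors.append(f"missing field: {field}")
    return errors

# A well-formed response produces no errors; a malformed one lists each gap.
ok_errors = validate_ingestion_response(200, {"job_id": "abc", "status": "QUEUED"})
bad_errors = validate_ingestion_response(500, {})
```

Returning a list (rather than raising on the first failure) lets a single test report every problem with a response at once.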
CI/CD Integration
- Integrate automation suites with AWS CodePipeline, GitHub Actions, Jenkins, or GitLab CI.
- Configure pipelines to run tests automatically on every code push, DBT model change, or Databricks workflow update.
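In GitHub Actions, for example, such a trigger is a path-filtered `push` event. An illustrative workflow sketch (repository layout, paths, and file names are placeholders):

```yaml
# Illustrative workflow: run the test suite on pushes touching models or tests.
name: data-quality-checks
on:
  push:
    paths:
      - "models/**"   # DBT model changes
      - "tests/**"    # automation suite changes
jobs:
  run-automation:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install -r requirements.txt
      - run: pytest tests/ --maxfail=1
```

The equivalent in Jenkins or AWS CodePipeline is a webhook-triggered stage running the same `pytest` command.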
Agile Collaboration
- Participate actively in Agile ceremonies: stand-ups, sprint planning, backlog grooming, and retrospectives.
- Work closely with Data Engineers, DBT Developers, Cloud Engineers, and Product Owners.
- Provide automation insights, effort estimates, and feasibility assessments.
Defect Management
- Log defects in Jira with clear supporting data evidence.
- Collaborate with teams to identify root causes (pipeline logic, DBT model, AWS service failure, etc.).
- Maintain traceability between requirements, test cases, and automated scripts.
Quality Improvement & Standards
- Enhance test coverage and reliability by contributing to:
  - Automation strategy
  - Data testing best practices
  - Test data generation utilities
  - Error handling and logging improvements
- Advocate for quality-first development in a data engineering environment.
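A test data generation utility of the kind mentioned above is often just a seeded synthetic-row factory, so every run is reproducible. A minimal sketch (schema and field names are illustrative):

```python
import random

def make_orders(n, seed=42):
    """Generate deterministic synthetic order rows for repeatable test runs."""
    rng = random.Random(seed)  # local RNG: no global random-state side effects
    return [
        {
            "order_id": i,
            "amount": round(rng.uniform(1.0, 500.0), 2),
            "status": rng.choice(["NEW", "SHIPPED", "CANCELLED"]),
        }
        for i in range(1, n + 1)
    ]

rows = make_orders(5)
```

Fixing the seed means a failing assertion reproduces identically in CI and locally, which keeps defect reports actionable.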
Technical Skills:
- Strong experience in API and automation testing.
- Hands-on experience automating data validations for:
  - Snowflake
  - Databricks (SQL/PySpark)
  - SQL Server
- Understanding of DBT (models, tests, documentation, lineage).
- Strong SQL skills (joins, CTEs, window functions, reconciliations).
- Experience with Python automation frameworks (pytest, unittest) or Java frameworks (TestNG, JUnit).
- Exposure to AWS data services: S3, Glue, Lambda, Step Functions, Athena, EMR (optional).
- CI/CD exposure (GitHub Actions, Jenkins, AWS CodePipeline).
- Experience with test management tools (Zephyr, Jira).
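The window-function skill listed above shows up constantly in validation work, for example deduplicating to the latest record per key before reconciling. A minimal sketch, again using in-memory SQLite as a stand-in engine (table and column names are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (id INTEGER, loaded_at TEXT);
    INSERT INTO events VALUES (1, '2024-01-01'), (1, '2024-01-02'), (2, '2024-01-01');
""")

# Keep only the latest row per id: ROW_NUMBER() over a per-key window.
latest = conn.execute("""
    SELECT id, loaded_at FROM (
        SELECT id, loaded_at,
               ROW_NUMBER() OVER (PARTITION BY id ORDER BY loaded_at DESC) AS rn
        FROM events
    ) WHERE rn = 1
    ORDER BY id
""").fetchall()
```

The same `ROW_NUMBER() OVER (PARTITION BY ...)` pattern runs unchanged in Databricks SQL and SQL Server.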
Perks and Benefits for Irisians
Iris provides world-class benefits for a personalized employee experience. These benefits are designed to support the financial, health, and well-being needs of Irisians for holistic professional and personal growth.
A strategic partner that transformational leaders can trust to realize the full potential of technology-enabled transformation. As a trusted technology partner, we focus our highly experienced talent and right-sized teams on developing complex, mission-critical applications and solutions for leading enterprises across financial services; life sciences, including pharmaceutical, CROs, and medical devices; manufacturing and logistics; and educational services.
Job ID: 142349013