Job Description
Key Responsibilities
- Develop and maintain ETL/ELT pipelines using dbt and Snowflake
- Implement data models and transformations based on the defined architecture
- Write efficient, optimized SQL queries
- Support data validation, quality checks, and troubleshooting
- Work with senior team members on architecture and design implementation
- Contribute to CI/CD pipeline setup and enhancements
- Collaborate with the Europe team on delivery alignment
- Ensure proper documentation and code quality standards
Mandatory Skills
- Hands-on experience with Snowflake and dbt; strong SQL
- Good understanding of data warehousing concepts
- Experience in ETL/ELT development
- Exposure to CI/CD tools and version control (Git)
- Strong analytical and problem-solving skills
Preferred Skills
- Exposure to cloud platforms (AWS/Azure/GCP)
- Basic understanding of data architecture principles
- Familiarity with Airflow or other orchestration tools
- Experience with performance tuning
Key Expectations
- Strong hands-on coding and delivery focus
- Ability to work in an offshore/onshore model
- Willingness to learn and grow into architect/lead roles
Role: Data Engineering with Snowflake, AWS, dbt, Python, and SQL
Keywords: Snowflake, dbt, AWS, SQL, Python, CI/CD, DWH