Agam Technologies

ETL Data Engineer

Job Description

Experience: 2 to 4 Years

Location: Coimbatore (Work from Office)

Employment Type: Full-time

Availability: Immediate Joiner Preferred

Job Summary

We are looking for a motivated ETL Data Engineer with 2–3 years of experience building and maintaining data pipelines. The ideal candidate has hands-on experience with Snowflake and strong expertise in ETL/ELT processes, data warehousing, and SQL. You will play a key role in turning raw data into reliable, scalable, high-quality datasets for analytics and business intelligence.

Key Responsibilities

  • Design, develop, and maintain ETL/ELT pipelines using Snowflake
  • Design, develop, and maintain modular dbt models (SQL and Python) following best practices such as DRY (Don't Repeat Yourself) and version control; a sketch of such a model follows this list
  • Extract data from multiple sources (databases, APIs, flat files, cloud systems)
  • Transform and load data into the Snowflake data warehouse efficiently; a loading sketch also follows this list
  • Write optimized SQL queries for data processing and transformation
  • Develop and maintain data models (star schema, snowflake schema)
  • Ensure data quality, integrity, and consistency across pipelines
  • Monitor, troubleshoot, and optimize ETL workflows and job performance
  • Collaborate with data analysts, BI developers, and business stakeholders
  • Implement data validation, error handling, and logging mechanisms
  • Maintain documentation for data pipelines, workflows, and architecture
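
For context, a minimal sketch of the kind of modular dbt model described above. The model and column names (stg_orders, order_date, amount) are hypothetical; ref() and config() are standard dbt constructs.

    -- models/marts/fct_daily_orders.sql  (hypothetical model)
    {{ config(materialized='table') }}   -- materialize as a table in Snowflake

    with orders as (
        -- reuse the staging model via ref() instead of repeating source logic (DRY)
        select * from {{ ref('stg_orders') }}
    )

    select
        order_date,
        count(*)    as order_count,
        sum(amount) as total_amount
    from orders
    group by order_date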
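
And a minimal sketch of loading files into Snowflake through a stage, assuming a hypothetical bucket and target table:

    -- hypothetical external stage over Parquet files
    create or replace stage raw_stage
        url = 's3://example-bucket/orders/'      -- placeholder location
        file_format = (type = parquet);

    -- bulk-load into a target table, mapping Parquet columns by name
    copy into raw.orders
        from @raw_stage
        match_by_column_name = case_insensitive
        on_error = 'abort_statement';            -- fail fast on bad records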

Required Skills & Qualifications

  • 2–3 years of experience in ETL/data engineering
  • Hands-on experience with dbt (Core or Cloud), including macros, packages, and hooks
  • Hands-on experience with Snowflake (tables, stages, warehouses, data loading)
  • Strong proficiency in SQL (joins, window functions, CTEs, performance tuning); a query sketch follows this list
  • Experience with ETL/ELT tools (e.g., Azure Data Factory, Informatica, Talend)
  • Good understanding of data warehousing concepts
  • Knowledge of data modeling techniques (dimensional modeling); a star-schema sketch also follows this list
  • Familiarity with file formats (CSV, JSON, Parquet)
  • Experience handling large datasets and optimizing performance
  • Basic scripting knowledge (Python or shell scripting is a plus)
  • Strong analytical and problem-solving skills
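
For context, a small sketch of the CTE and window-function style this requirement refers to, using a hypothetical raw.orders table:

    -- latest order per customer via a CTE and row_number()
    with ranked_orders as (
        select
            customer_id,
            order_id,
            order_date,
            row_number() over (
                partition by customer_id
                order by order_date desc
            ) as rn
        from raw.orders
    )
    select customer_id, order_id, order_date
    from ranked_orders
    where rn = 1;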
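
And a minimal star-schema sketch with hypothetical dimension and fact tables (note that Snowflake records but does not enforce primary/foreign key constraints):

    -- one dimension and one fact table; surrogate keys via identity columns
    create table dim_customer (
        customer_key  integer identity primary key,
        customer_id   varchar,        -- natural key from the source system
        customer_name varchar,
        region        varchar
    );

    create table fct_orders (
        order_key     integer identity primary key,
        customer_key  integer references dim_customer (customer_key),
        order_date    date,
        amount        number(12, 2)
    );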

Preferred Skills

  • Experience with cloud platforms (Azure/AWS/GCP)
  • Knowledge of Snowflake features such as Time Travel, Cloning, and Data Sharing (see the example after this list)
  • Familiarity with orchestration tools (Airflow, Prefect, etc.)
  • Exposure to CI/CD pipelines and version control (Git)
  • Understanding of Agile methodologies (Scrum/Kanban)
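
For context, a short sketch of Snowflake Time Travel against a hypothetical raw.orders table:

    -- query the table as it was one hour ago (within the retention window)
    select * from raw.orders at (offset => -3600);

    -- recover an accidentally dropped table
    undrop table raw.orders;

    -- zero-copy clone of the table as of a past point in time
    create table raw.orders_backup clone raw.orders
        at (timestamp => '2025-01-01 00:00:00'::timestamp_ltz);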

Education

  • Bachelor's degree in Computer Science, Information Technology, or a related field

Nice to Have

  • Snowflake certification (SnowPro Core or equivalent)


Job ID: 147192889