
Data Engineer

  • Posted 18 hours ago

Job Description

Project Role : Data Engineer

Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.

Must have skills : Data Build Tool (dbt)

Good to have skills : NA

Minimum 3 year(s) of experience is required

Educational Qualification : 15 years of full-time education

Job Title

Senior Data Engineer – DBT Experience

5+ years in Data Engineering / Analytics Engineering

Summary:

We are looking for a Senior Data Engineer with strong hands-on experience in Snowflake and DBT to design, build, and maintain scalable data transformation pipelines. The ideal candidate will work closely with analytics, business, and platform teams to deliver reliable, high-quality data models that follow best practices.

Key Responsibilities

Design, develop, and maintain Snowflake data models, schemas, and views

Build and manage DBT models, tests, snapshots, and documentation

Implement ELT pipelines following analytics engineering best practices

Optimize Snowflake performance (warehouses, clustering, query tuning)

Ensure data quality using DBT tests and reconciliation checks

Orchestrate data workflows using Airflow (DAG monitoring and troubleshooting)

Manage source control, CI/CD pipelines, and code reviews using GitHub and GitHub Actions

Work with cloud storage (AWS S3) for data ingestion and unloading

Collaborate with cross-functional teams on requirements, design, and delivery

Support production issues, root cause analysis (RCA), and continuous improvement initiatives
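The DBT model work described above can be sketched as a minimal staging model. This is an illustrative sketch only: the source, model, and column names (raw.orders, stg_orders, order_ts, etc.) are hypothetical, not taken from this role.

```sql
-- models/staging/stg_orders.sql — a minimal dbt staging model (hypothetical names).
with source as (

    -- dbt resolves {{ source(...) }} against the configured raw schema at compile time
    select * from {{ source('raw', 'orders') }}

),

renamed as (

    select
        order_id,
        customer_id,
        cast(order_ts as timestamp_ntz) as ordered_at,  -- Snowflake timestamp type
        amount
    from source

)

select * from renamed
```

Column-level tests such as unique and not_null on order_id would be declared in an accompanying schema.yml, and dbt snapshots would track slowly changing records — both correspond to the testing and snapshot responsibilities listed above.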

Required Skills

Strong experience with Snowflake (SQL, warehouses, security, performance tuning)

Hands-on experience with DBT (Core)

Advanced SQL skills for analytical transformations

Experience with ELT/ETL pipelines and data warehousing concepts

Familiarity with Airflow for job orchestration

Experience using GitHub for version control and CI/CD

Strong debugging, problem-solving, and communication skills
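The Snowflake performance-tuning skill above typically involves statements like the following; the warehouse, table, and column names are assumptions for illustration only:

```sql
-- Resize a virtual warehouse for a heavy transformation window
-- (warehouse name is hypothetical):
alter warehouse transform_wh set warehouse_size = 'LARGE';

-- Add a clustering key so the optimizer can prune micro-partitions
-- on date-bounded queries (table/column are hypothetical):
alter table analytics.fct_orders cluster by (to_date(ordered_at));

-- Inspect clustering health for that key:
select system$clustering_information('analytics.fct_orders', '(to_date(ordered_at))');
```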

Good to Have

Basic knowledge of AWS S3 (file ingestion, unloads)

Basic understanding of AWS IAM roles and policies

Exposure to GitHub Actions or CI/CD automation

Experience in production support (L2/L3), monitoring, and incident handling

Knowledge of data governance, access control, and security best practices
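The S3 file ingestion and unload items above usually translate into Snowflake stage and COPY commands. The bucket, stage, and table names below are hypothetical, and a real stage would also need credentials or a storage integration, omitted here:

```sql
-- External stage over an S3 prefix (credentials/storage integration omitted):
create stage if not exists raw.orders_stage
    url = 's3://example-bucket/orders/'
    file_format = (type = 'CSV' skip_header = 1);

-- Ingest: load staged files into a raw table
copy into raw.orders
from @raw.orders_stage;

-- Unload: write query results back to S3 as CSV
copy into @raw.orders_stage/exports/orders_
from (select * from analytics.fct_orders)
file_format = (type = 'CSV');
```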

Additional Information:

  • The candidate should have a minimum of 5 years of experience with Data Build Tool (dbt).
  • This position is based at our Chennai office.
  • A minimum of 15 years of full-time education is required.



Job ID: 147194007
