evnek

Senior Data Engineer

5–8 Years
  • Posted 19 hours ago

Job Description

Job Title: Senior Data Engineer

Experience: 5–8 Years

Location: Pune
Notice Period: Immediate Joiners Only

About the Role

We are seeking a highly skilled Senior Data Engineer to design, build, and optimize scalable data pipelines and cloud-based data infrastructure.

The ideal candidate will have strong expertise in AWS, Snowflake, Terraform, and programming skills in SQL, Python, and PySpark.

You will work closely with the Data Tech Lead and Lead Data Engineers to ensure high-quality, reliable, and efficient data workflows that power analytics, machine learning, and business intelligence initiatives.

Key Responsibilities

  • Develop reusable frameworks using cloud technologies like AWS, Snowflake, and Managed Airflow
  • Design and implement scalable ETL/ELT pipelines using Python, SQL, and PySpark
  • Build and manage infrastructure using Terraform (IaC)
  • Develop and maintain data models, transformations, and orchestration workflows
  • Ensure data quality, observability, and lineage tracking across systems
  • Optimize query performance, storage, and compute costs in Snowflake and AWS
  • Implement and maintain CI/CD pipelines for data infrastructure
  • Monitor, troubleshoot, and maintain SLAs for data pipelines and cloud systems
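To make the pipeline work above concrete, here is a minimal, illustrative sketch of the kind of SQL-driven ELT transform such a role builds. The `run_pipeline` helper and table names are hypothetical, and Python's built-in `sqlite3` stands in for a warehouse such as Snowflake; a real pipeline would be orchestrated by Airflow and target warehouse connections instead.

```python
# Illustrative ELT sketch (hypothetical example, not part of the posting):
# extract raw order rows, transform them with SQL inside the "warehouse",
# and load the result into a summary table.
import sqlite3

def run_pipeline(raw_orders):
    """Load raw rows, then derive a per-customer revenue summary via SQL."""
    con = sqlite3.connect(":memory:")  # stand-in for a warehouse connection
    con.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
    con.executemany("INSERT INTO orders VALUES (?, ?)", raw_orders)
    # Transform step: aggregate in SQL, as an ELT job would in-warehouse.
    con.execute(
        """CREATE TABLE revenue AS
           SELECT customer, SUM(amount) AS total
           FROM orders
           GROUP BY customer"""
    )
    return dict(con.execute("SELECT customer, total FROM revenue"))

summary = run_pipeline([("acme", 10.0), ("acme", 5.0), ("globex", 7.5)])
```

The same extract/transform/load shape carries over when the transform runs as Snowflake SQL or a PySpark job; only the execution engine changes.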

Required Skills & Qualifications

  • 5–8 years of experience in Data Engineering
  • Strong proficiency in SQL and Python
  • Hands-on experience with AWS (S3, Glue, Lambda, Redshift, etc.)
  • Expertise in Snowflake (performance tuning, schema design, Snowflake SQL)
  • Strong experience with Terraform for infrastructure automation
  • Experience with Airflow or other orchestration tools
  • Knowledge of data observability, monitoring, and governance
  • Experience with Git and CI/CD pipelines
  • Strong problem-solving and analytical skills
  • Excellent communication and collaboration abilities
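On the data observability and quality point above, a quality gate can start as a lightweight check run before a table is published. This is a hypothetical sketch: `check_quality` and its field names are invented for illustration, not taken from the posting.

```python
# Illustrative data-quality check (hypothetical example): validate rows
# before publishing, returning human-readable issues instead of raising,
# so an orchestrator can decide whether to fail or alert.
def check_quality(rows, required_fields):
    """Return a list of issues found in `rows` (empty list means clean)."""
    issues = []
    if not rows:
        issues.append("table is empty")
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                issues.append(f"row {i}: missing {field!r}")
    return issues
```

In production this role-shaped work is usually handled by dedicated tooling, but the underlying idea is the same: assert expectations about the data, not just the code.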

Good to Have

  • Experience with Data Mesh architecture
  • Exposure to Docker, Kafka, or Kinesis
  • Knowledge of data security and compliance (GDPR, SOC2)
  • Experience in cost optimization and performance tuning
  • Strong hands-on experience with PySpark

Job ID: 145807939
