Maruti Techlabs

Senior Data Engineer


Job Description

We are looking for a Lead Data Engineer to drive the development of a modern data platform. This role will focus on building scalable and reliable data pipelines using tools like DBT, Snowflake, and Apache Airflow, and will play a key part in shaping data architecture and strategy.

As a technical leader, you'll work closely with cross-functional teams, including analytics, product, and engineering, to deliver clean, accessible, and trustworthy data for business decision-making and machine learning use cases.

Key Responsibilities:

  • Lead the design and implementation of ELT pipelines using DBT and orchestrate workflows via Apache Airflow.
  • Design, implement, and maintain robust data models to support analytics and reporting.
  • Architect and optimize our cloud data warehouse in Snowflake, ensuring performance, scalability, and cost efficiency.
  • Collaborate with data analysts and stakeholders to model and deliver well-documented, production-grade datasets.
  • Establish data engineering best practices around version control, testing, CI/CD, and observability.
  • Build and maintain data quality checks and data validation frameworks.
  • Mentor junior data engineers and foster a strong engineering culture within the team.
  • Collaborate on data governance efforts, including metadata management, data lineage, and access controls.
  • Evaluate and integrate new tools and technologies to evolve our data stack.

Requirements:

  • 8+ years of experience in data engineering with at least 2 years in a lead role.
  • Data modelling experience is a must.
  • Strong experience designing and managing data pipelines with DBT and Airflow.
  • Proven expertise in data modelling techniques (dimensional modelling, star/snowflake schemas, normalization, denormalization) and translating business requirements into scalable data models.
  • Deep understanding of Snowflake, including performance tuning and cost optimization.
  • Strong SQL and Python skills for data transformation and automation.
  • Experience with Git-based workflows and CI/CD for data pipelines.
  • Excellent communication skills and experience working with cross-functional teams.
  • Experience with data cataloguing and lineage tools.
  • Exposure to event-driven architectures and real-time data processing.
  • Understanding of data privacy and security standards.

Education:

  • Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.

Job ID: 135811353
