Algoscale

Lead Data Engineer

  • Posted 22 hours ago

Job Description

Experience: 6+ years

Location: Noida (work from office)

Timings: Mon-Fri; 10:30 AM - 7:30 PM

About The Role

We are looking for an experienced Lead Data Engineer with strong technical expertise and proven leadership capabilities. The ideal candidate has 6+ years of experience building large-scale data systems, is proficient in Python, SQL, PySpark, and Databricks, and has hands-on experience with AWS or Azure cloud environments. This role involves leading a team of data engineers while driving architecture, best practices, and scalable data solutions.

Key Responsibilities

  • Lead and mentor a team of data engineers to deliver end-to-end data solutions.
  • Design, develop, and maintain ETL/ELT pipelines for ingestion, transformation, and analytics.
  • Architect and manage scalable data lake and data warehouse environments.
  • Build and optimize distributed data processing workflows using PySpark and Databricks.
  • Collaborate with analytics, product, and data science teams to understand requirements.
  • Define and implement best practices for coding standards, CI/CD, and data governance.
  • Establish data quality checks and monitoring frameworks to ensure reliability.
  • Troubleshoot performance bottlenecks and provide technical leadership across projects.
  • Evaluate new tools and technologies to strengthen the organization's data capabilities.

Required Skills & Experience

  • 6+ years of professional experience in data engineering.
  • Strong skills in Python, SQL, PySpark, and Databricks.
  • Hands-on experience with cloud platforms such as AWS or Azure.
  • Proven experience leading or mentoring a data engineering team.
  • Strong understanding of distributed computing principles.
  • Experience in building scalable ETL/ELT pipelines.
  • Knowledge of CI/CD processes and version control using Git.
  • Experience with data modeling and data warehousing concepts.

Preferred Qualifications

  • Certifications from Databricks, Snowflake, AWS, or Azure.
  • Experience with orchestration tools such as Airflow, ADF, or Prefect.
  • Familiarity with Delta Lake architecture and modern data stack tools.
  • Experience working in Agile environments.

Job ID: 134385431
