Data Engineer - Python, PySpark, Airflow

Posted 14 days ago

Job Description

Role: Data Engineer - Python, PySpark, Airflow

Experience: 3+ Years

Location: Bangalore | Hyderabad

Immediate Joiners Only

Key Responsibilities

  • Develop and enhance new product features with a strong focus on scalability and performance.
  • Modernize existing system components by redesigning them to align with new architecture paradigms.
  • Deeply understand the business domain, customer needs, and core use cases.
  • Own and deliver complex engineering tasks end-to-end.
  • Ensure adherence to non-functional requirements (stability, scalability, performance).
  • Mentor junior engineers and influence technical decision-making.
  • Work within a Scrum team and collaborate with cross-functional groups to solve challenging technical problems.

Requirements

  • 5-7 years of hands-on experience with Python, Spark, and Airflow (see the illustrative sketch after this list).
  • Strong SQL skills and experience working with relational and NoSQL databases (Snowflake is a plus).
  • Strong understanding of data models, ETL/ELT pipelines, and batch/real-time processing.
  • Practical experience in Scala or Java.
  • Experience with Docker, Kubernetes, and containerized environments.
  • Solid experience building solutions on cloud platforms.
  • Strong understanding of DevOps methodologies.
  • Experience working in an Agile/Scrum environment.
  • Passion for modern data engineering technologies and continuous learning.
  • Ability to quickly dive into development activities and deliver high-quality code.
  • English proficiency: B2 or above.
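
As an illustration of the day-to-day stack named above (Python, PySpark, Airflow), here is a minimal sketch of a daily batch aggregation job. The dataset, paths, and column names are hypothetical and shown only to indicate the kind of work involved.

```python
# Illustrative sketch only: a minimal PySpark batch ETL job.
# Input/output paths, table, and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F


def run_daily_aggregation(input_path: str, output_path: str) -> None:
    spark = SparkSession.builder.appName("daily_orders_aggregation").getOrCreate()

    # Extract: read raw order events (header row assumed, schema kept simple for brevity).
    orders = spark.read.option("header", True).csv(input_path)

    # Transform: cast the amount, derive the order date, and aggregate revenue per customer per day.
    daily = (
        orders
        .withColumn("amount", F.col("amount").cast("double"))
        .withColumn("order_date", F.to_date("order_ts"))
        .groupBy("customer_id", "order_date")
        .agg(
            F.sum("amount").alias("daily_revenue"),
            F.count("*").alias("order_count"),
        )
    )

    # Load: write partitioned Parquet for downstream consumers.
    daily.write.mode("overwrite").partitionBy("order_date").parquet(output_path)
    spark.stop()


if __name__ == "__main__":
    run_daily_aggregation(
        "s3://example-bucket/raw/orders/",
        "s3://example-bucket/curated/daily_revenue/",
    )
```

In production, a job like this would typically be scheduled from an Airflow DAG, for example via the SparkSubmitOperator or a spark-submit command in a BashOperator, with Docker/Kubernetes handling the runtime environment.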

Job ID: 132928061
