
Zorba AI

Big Data Engineer

  • Posted 2 days ago

Job Description

Role Overview

We are looking for two Junior-level Big Data Engineers with strong hands-on experience in AWS, Python, Apache Airflow, and Big Data technologies. The ideal candidates will design, build, and maintain scalable data pipelines and workflows in a cloud-based environment, ensuring data reliability, performance, and security.

Key Responsibilities

  • Design, develop, and maintain ETL data pipelines using Apache Airflow (a minimal DAG sketch follows this list).
  • Build scalable data solutions on AWS, leveraging services such as:
    • Amazon S3
    • EC2
    • AWS Lambda
    • EMR
    • AWS Glue
    • Amazon Redshift
  • Write clean, efficient, and maintainable Python code for data processing and automation.
  • Work with Big Data frameworks such as Spark and Hadoop for large-scale data processing.
  • Optimize workflows for performance, scalability, and cost efficiency.
  • Collaborate with cross-functional teams to gather data requirements and deliver solutions.
  • Ensure data quality, reliability, and security across pipelines and processes.
  • Troubleshoot pipeline failures and resolve performance bottlenecks.
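
To give a sense of the Airflow work described above, here is a minimal ETL DAG sketch in Python. The DAG id, schedule, and task logic are hypothetical placeholders used only for illustration; a production pipeline would typically read from and write to AWS services such as Amazon S3 or Redshift rather than passing small in-memory lists between tasks.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Hypothetical extract step: a real pipeline would pull data from a source
    # such as Amazon S3 or an upstream API instead of returning a literal list.
    return [1, 2, 3]


def transform(**context):
    # Pull the extract step's output via XCom and apply a simple transformation.
    data = context["ti"].xcom_pull(task_ids="extract")
    return [value * 2 for value in data]


def load(**context):
    # Hypothetical load step: a real pipeline might write to Amazon Redshift.
    data = context["ti"].xcom_pull(task_ids="transform")
    print(f"Loading {len(data)} records")


with DAG(
    dag_id="example_etl",            # placeholder DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```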

Required Qualifications

  • 3–5 years of hands-on experience in Data Engineering roles.
  • Strong experience with AWS cloud services for data engineering workloads.
  • Proficiency in Python, including libraries such as Pandas and PySpark (a short PySpark sketch follows this list).
  • Hands-on experience with Apache Airflow, including DAG creation and monitoring.
  • Familiarity with Big Data technologies (Spark, Hadoop, or similar).
  • Solid understanding of ETL processes and data modeling concepts.
  • Strong analytical, problem-solving, and debugging skills.
  • Excellent communication and collaboration abilities.
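
As a small illustration of the Python and PySpark proficiency listed above, the sketch below aggregates a hypothetical event dataset stored on S3 into daily counts. The bucket name, paths, and column names are assumptions for illustration, not details from this posting, and S3 access assumes the cluster (for example, EMR or Glue) already has the appropriate AWS credentials configured.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_event_counts").getOrCreate()

# Hypothetical input: any dataset reachable over s3a with an `event_date` column.
events = spark.read.parquet("s3a://example-bucket/raw/events/")

# Drop rows with a missing event_date, then count events per day.
daily_counts = (
    events
    .filter(F.col("event_date").isNotNull())
    .groupBy("event_date")
    .count()
)

# Write the aggregate back to S3 as Parquet; overwrite keeps the job idempotent
# when it is re-run for the same day by an Airflow schedule.
daily_counts.write.mode("overwrite").parquet(
    "s3a://example-bucket/curated/daily_event_counts/"
)
```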

Preferred Qualifications

  • Experience with CI/CD pipelines and DevOps practices.
  • Knowledge of SQL and relational databases.
  • Familiarity with containerization technologies such as Docker and Kubernetes.
  • Exposure to monitoring and logging tools in cloud environments.

Skills: Python, Airflow, AWS, Big Data

Job ID: 138145919
