
Job Description

Experience Required: 3-7 years

What You'll Do

Design, build, and maintain scalable data pipelines and ETL/ELT workflows to ingest, transform, and process large volumes of structured and semi-structured data.

Develop and optimize data models, tables, and transformations to support analytics, reporting, and downstream data consumption.

Work with large datasets using SQL, PySpark, and modern data platforms such as Snowflake and Databricks to ensure efficient data processing.

Build and manage data workflows using orchestration tools such as Apache Airflow, ensuring reliable and timely data delivery.

Develop automation scripts using Shell Scripting and Python to support data pipeline execution, monitoring, and operational efficiency.

Monitor, troubleshoot, and optimize data pipelines to improve performance, scalability, and reliability across the data ecosystem.

Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and enable data-driven insights.

Ensure adherence to data engineering best practices, including data quality checks, documentation, and pipeline governance.

What We're Looking For

Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field.

3-8 years of experience in data engineering or data platform development within large-scale data environments.

Strong proficiency in SQL, Python, and distributed data processing frameworks such as PySpark.

Hands-on experience with modern data platforms such as Snowflake and Databricks.

Experience building and managing workflow orchestration pipelines using Apache Airflow.

Exposure to Shell Scripting for automation and operational workflow management.

Strong understanding of data modeling, ETL/ELT processes, and scalable data pipeline architecture.

Ability to collaborate effectively with cross-functional teams and global stakeholders.

Nice to Have

Experience working with cloud platforms such as AWS, GCP, or Azure.

Familiarity with data lakehouse architectures, data governance, and modern data platform practices.

Proven ability to work with global stakeholders in cross-functional, matrixed environments.

Job ID: 144561105
