
evnek

Data Engineer

  • Posted 2 months ago

Job Description

This is a remote position.

Job Title: Data Engineer
Experience: 3–5 Years
Location: Remote
Notice Period: Immediate Joiners Only

Role Summary
We are seeking a skilled Data Engineer to design, develop, and maintain scalable and reliable data pipelines. The ideal candidate will have expertise in BigQuery, data ingestion techniques, and orchestration tools, along with a strong command of Python, FastAPI, and PostgreSQL. Experience handling end-to-end data workflows is essential.
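
As an illustration of the BigQuery practices referenced above, a table might be partitioned and clustered as follows. The dataset, table, and column names here are hypothetical, not taken from the posting:

```python
import re

# Illustrative BigQuery DDL applying the optimization practices the role calls
# for: date partitioning to prune scans, clustering to co-locate related rows.
EVENTS_DDL = """
CREATE TABLE IF NOT EXISTS analytics.events (
    event_id   STRING NOT NULL,
    user_id    STRING,
    event_ts   TIMESTAMP NOT NULL,
    payload    JSON
)
PARTITION BY DATE(event_ts)   -- queries filtered on date scan only matching partitions
CLUSTER BY user_id            -- selective user_id filters read fewer blocks
"""

def partition_column(ddl: str) -> str:
    """Extract the column named in a PARTITION BY DATE(...) clause, if any."""
    match = re.search(r"PARTITION BY DATE\((\w+)\)", ddl)
    return match.group(1) if match else ""
```

Partitioning and clustering choices like these are what make the "efficient JOIN operations" below tractable at scale, since both sides of a join can be pruned before shuffling.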

Key Responsibilities
  • Data Pipeline Development: Design and implement scalable data pipelines that can handle increasing data loads without compromising performance.
  • BigQuery Optimization: Write complex SQL transformations and optimize query performance using best practices such as partitioning, clustering, and efficient JOIN operations.
  • Data Ingestion: Develop robust data ingestion processes from various sources, including RESTful APIs and file-based systems, ensuring data integrity and consistency.
  • Workflow Orchestration: Utilize orchestration tools like Prefect or Apache Airflow to schedule and monitor data workflows, ensuring timely and reliable data processing.
  • Tech Stack Proficiency: Leverage Python and FastAPI for building data services and APIs, and manage data storage and retrieval using PostgreSQL.
  • End-to-End Workflow Management: Own the entire data workflow process, from ingestion and transformation to delivery, ensuring data quality and availability.
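
The end-to-end flow described above can be sketched as a minimal ingest → transform → load pipeline. This is a standard-library illustration only: the record shape is hypothetical, and the list-based sink stands in for PostgreSQL or BigQuery; in practice each stage would be a task in an orchestrator such as Prefect or Airflow:

```python
import json

def ingest(raw_records):
    """Parse raw JSON payloads from a source such as a REST API or file drop."""
    return [json.loads(r) for r in raw_records]

def transform(records):
    """Drop records missing required fields and deduplicate on event_id."""
    seen, clean = set(), []
    for rec in records:
        if "event_id" not in rec or "event_ts" not in rec:
            continue  # enforce data integrity before loading
        if rec["event_id"] in seen:
            continue  # idempotent re-runs must not double-load a record
        seen.add(rec["event_id"])
        clean.append(rec)
    return clean

def load(records, sink):
    """Append validated records to the sink (stand-in for PostgreSQL/BigQuery)."""
    sink.extend(records)
    return len(records)

def run_pipeline(raw_records, sink):
    """End-to-end flow: ingest -> transform -> load, as one schedulable unit."""
    return load(transform(ingest(raw_records)), sink)
```

Keeping each stage a pure function with the sink passed in explicitly is what makes the pipeline easy to schedule, retry, and test, which is the point of owning the workflow end to end.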


Job ID: 141113025
