
Evnek

Data Engineer


Job Description

This is a remote position.

Job Title: Data Engineer
Experience: 3-5 Years
Location: Remote
Notice Period: Immediate Joiners Only

Role Summary
We are seeking a skilled Data Engineer to design, develop, and maintain scalable, reliable data pipelines. The ideal candidate will have expertise in BigQuery, data ingestion techniques, and orchestration tools, along with a strong command of Python, FastAPI, and PostgreSQL. Experience handling end-to-end data workflows is essential.

Key Responsibilities
  • Data Pipeline Development: Design and implement scalable data pipelines that can handle increasing data loads without compromising performance.
  • BigQuery Optimization: Write complex SQL transformations and optimize query performance using best practices such as partitioning, clustering, and efficient JOIN operations.
  • Data Ingestion: Develop robust data ingestion processes from various sources, including RESTful APIs and file-based systems, ensuring data integrity and consistency.
  • Workflow Orchestration: Utilize orchestration tools like Prefect or Apache Airflow to schedule and monitor data workflows, ensuring timely and reliable data processing.
  • Tech Stack Proficiency: Leverage Python and FastAPI for building data services and APIs, and manage data storage and retrieval using PostgreSQL.
  • End-to-End Workflow Management: Own the entire data workflow process, from ingestion and transformation to delivery, ensuring data quality and availability.
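To make the ingestion and PostgreSQL bullets above concrete, here is a minimal, self-contained sketch of batched, idempotent loading. The table name `events`, the record shape, and the helper names are hypothetical illustrations, not part of the role description.

```python
"""Sketch: chunked ingestion with an idempotent upsert statement.

Assumptions (not from the posting): a table `events(id, payload)` and
JSON-like records fetched from an API. Only the batching and SQL-building
logic is shown; no database connection is opened here.
"""
from itertools import islice
from typing import Iterable, Iterator


def chunked(records: Iterable[dict], size: int) -> Iterator[list[dict]]:
    """Yield fixed-size batches so a large API response cannot exhaust memory."""
    it = iter(records)
    while batch := list(islice(it, size)):
        yield batch


def upsert_batch(batch: list[dict]) -> str:
    """Build an idempotent PostgreSQL upsert (INSERT ... ON CONFLICT),
    so re-running a failed pipeline step does not duplicate rows."""
    placeholders = ", ".join("(%s, %s)" for _ in batch)
    return (
        "INSERT INTO events (id, payload) VALUES "
        + placeholders
        + " ON CONFLICT (id) DO UPDATE SET payload = EXCLUDED.payload"
    )


records = [{"id": i, "payload": f"row-{i}"} for i in range(5)]
batches = list(chunked(records, 2))
print([len(b) for b in batches])  # 5 records in batches of 2 -> [2, 2, 1]
```

Batching plus `ON CONFLICT` upserts is one common way to satisfy the "data integrity and consistency" requirement: each batch can be retried safely.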

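The orchestration responsibility can be illustrated with a small sketch of what tools like Prefect or Apache Airflow do at their core: resolve task dependencies and run tasks in order. The task names here are hypothetical, and a real pipeline would use the orchestrator's own flow or DAG API rather than this stdlib stand-in.

```python
"""Sketch: dependency-ordered task execution, the core idea behind
orchestrators such as Prefect and Airflow. Uses only the standard
library (graphlib, Python 3.9+); task names are illustrative."""
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on, as in an Airflow DAG.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load_bigquery": {"transform"},
    "data_quality_check": {"load_bigquery"},
}

# static_order() yields tasks so every dependency runs before its dependents.
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'transform', 'load_bigquery', 'data_quality_check']
```

Real orchestrators add scheduling, retries, and monitoring on top of this ordering, which is what the "timely and reliable data processing" bullet refers to.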

More Info

Industry: Other

Function: Data Engineering

Job Type: Permanent Job


Date Posted: 03/06/2025

Job ID: 116768661


Last Updated: 12-06-2025 07:02:56 AM