About The Role
We are looking for a skilled Data Engineer with hands-on expertise in Dagster orchestration, or in GCP with BigQuery and Apache Airflow, along with modern data pipeline development and architecture implementation. The ideal candidate will design, build, and optimize scalable data pipelines, bringing strong SQL proficiency and data modelling expertise.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines using Dagster.
- Build and manage Dagster components such as:
  - Ops / Assets
  - Schedules
  - Sensors
  - Jobs
  - Resource definitions
- Implement and maintain Medallion Architecture (Bronze, Silver, Gold layers).
- Write optimized and production-grade SQL scripts for transformations and data validation.
- If not familiar with Dagster and orchestration, expertise in GCP, BigQuery, and Apache Airflow is a must.
Must Have
- 3+ years of experience in Data Engineering.
- Strong hands-on experience with Dagster and workflow orchestration.
- Strong hands-on experience with GCP, BigQuery, and Apache Airflow.
- Solid understanding of data pipeline design patterns.
- Experience implementing Medallion Architecture.
- Advanced SQL skills (complex joins, CTEs, performance tuning).
- Experience working with the GCP cloud data platform.
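As an illustration of the advanced SQL the role expects (CTEs plus a join), here is a small self-contained sketch using Python's built-in sqlite3; the table and data are hypothetical, and in practice the same query shape would run on BigQuery:

```python
import sqlite3

# Hypothetical orders table for demonstration purposes.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL);
INSERT INTO orders VALUES (1, 'a', 100), (2, 'a', 250), (3, 'b', 50);
""")

# CTE computes per-customer totals; the join annotates each order with them.
query = """
WITH totals AS (
    SELECT customer, SUM(amount) AS total
    FROM orders
    GROUP BY customer
)
SELECT o.id, o.customer, t.total
FROM orders o
JOIN totals t ON t.customer = o.customer
ORDER BY o.id;
"""
rows = conn.execute(query).fetchall()
print(rows)
```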
Why Join Us
- Collaborative work environment.
- Exposure to modern tools and scalable application architectures.
- Medical cover for employee and eligible dependents.
- Tax-beneficial salary structure.
- Comprehensive leave policy.
- Competency development training programs.
Skills: Dagster, SQL, data engineering, Google BigQuery, Google Cloud Platform (GCP), data modelling, ETL pipelines