
Job title: Data Engineer 3
Location: Gurugram, India
Reporting to: Engineering Manager
About noon
We're building an ecosystem of digital products and services that power everyday life across the Middle East: fast, scalable, and deeply customer-centric. Our mission is to deliver to every door every day. We want to redefine what technology can do in this region, and we're looking for a Data Engineer who can help us move even faster.
noon's mission: Every door, every day.
About Dubai Mall Shopping
With over 100 million visitors each year, Dubai Mall is a world-class destination for discerning, high-value shoppers. Now, powered by noon's technology and our trusted logistics, this premium experience is available online, free from crowds, parking hassles, or restrictive store hours. A truly exceptional mall experience, instantly at your fingertips.
What you'll do:
Data will sit at the core of how we operate and scale — from powering real-time experiences to enabling smarter decisions for both customers and merchants. This role is about building the foundation early and getting it right. You won't just be consuming data; you'll define how it flows across the system.
● Design and build reliable, scalable data pipelines that support both real-time and batch use cases.
● Enable key product capabilities such as personalization, recommendations, and operational insights.
● Partner closely with backend, frontend, and product teams to ensure data is accurate, accessible, and actionable.
● Establish strong data modeling practices and lay the groundwork for a robust data platform.
● Improve data quality, observability, and performance as the system scales.
● Contribute to decisions around tooling, architecture, and long-term data strategy.
What you'll need:
● 5+ years (Mid), 8+ years (Senior/Lead) experience in data engineering or a related field.
● Strong proficiency in SQL and Python, with experience in production data systems.
● Experience building and maintaining data pipelines (ETL/ELT), both batch and streaming.
● Hands-on experience with distributed processing frameworks (e.g., Spark, Flink, or similar).
● Experience with orchestration tools (e.g., Airflow or similar).
● Familiarity with messaging/streaming systems (e.g., Google Pub/Sub, Kafka or equivalents).
● Experience with cloud platforms (GCP preferred) and modern data tooling.
● Understanding of data warehousing concepts and data modeling best practices.
● Comfort working in a fast-moving environment with evolving requirements.
● Strong problem-solving skills and attention to detail.
● Clear communication and a collaborative mindset.
Job ID: 147505143
Skills:
PySpark, GCP, Spark, Azure, Python, AWS