Uplers

Data Engineer

  • Posted 6 hours ago

Job Description

Experience: 3+ years

Salary: INR 2500000-2600000 / year (based on experience)

Expected Notice Period: 15 Days

Shift: (GMT+05:30) Asia/Kolkata (IST)

Opportunity Type: Office

Placement Type: Full-time permanent position (payroll and compliance to be managed by the client, an AI-powered gaming commerce platform)

(*Note: This is a requirement for one of Uplers' clients, an AI-powered gaming commerce platform.)

What do you need for this opportunity?

Must have skills required:

Python, ClickHouse (or any OLAP database, e.g., Redshift), Airflow (or a similar orchestrator, e.g., Luigi, Prefect), Spark, web scrapers or data collection pipelines, e-commerce/marketplace experience, ML/data science skills

An AI-powered gaming commerce platform is looking for:

Data Engineer — Recommendation Engine

About The Company

PS is India's first Gaming Commerce company, pioneering a new way for 500M+ gamers—an audience growing at 19.6% YoY—to shop inside games.

It is a white-label SaaS plugin that integrates into casual and hyper-casual games, transforming in-game coins into real-world value. Players can redeem them for brand coupons, digital services, or even physical products inside fully customizable in-game stores.

The Role

We're building a recommendation engine that surfaces the right products to the right player at the right moment inside our in-game store. A core challenge: we have limited in-app behavioural data today, so the system must rely heavily on external market signals to make great recommendations from day one.

As our first Data Engineer, you will own the data infrastructure that makes this possible. You'll build the pipelines that collect, process, and serve these signals into our real-time ranking system.

This is a foundational hire. The entire recommendation engine—from Deal Quality Scores to seasonal trends—depends on the pipelines you build.

What You'll Do

  • Own external data pipelines—scrapers for Flipkart/Amazon bestseller rankings, PriceHunt/Smartprix for price benchmarking, Google Trends API for brand and category demand signals
  • Build the Deal Quality Score pipeline—a daily batch job that computes a competitiveness score for every product in our catalogue, stored in Redis for sub-millisecond lookup at serving time
  • Maintain a seasonal and festive calendar—structured data store for trend overlays (IPL, Diwali, back-to-school, etc.)
  • Design and own the in-store event schema in ClickHouse that will power behavioural cohorts as in-app data accumulates
  • Build ETL infrastructure (S3 + Spark/Glue) for longer-horizon trend and market data
  • Own data quality and freshness SLAs—you are responsible when a signal the reco engine depends on breaks silently
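The Deal Quality Score step above could be sketched as a small daily batch job. This is a hypothetical illustration: the function names, the scoring formula, and the `run_daily_batch` helper are assumptions, not the platform's actual logic, and in production the `store` would be a Redis client rather than a dict.

```python
# Hypothetical Deal Quality Score batch step. The scoring formula and all
# names here are illustrative, not the platform's actual implementation.

def deal_quality_score(our_price: float, market_prices: list[float]) -> float:
    """Score in [0, 1]: 1.0 means we beat or match every benchmarked price."""
    if not market_prices:
        return 0.5  # no external signal yet: neutral score
    cheaper = sum(1 for p in market_prices if our_price <= p)
    return cheaper / len(market_prices)

def run_daily_batch(catalogue: dict, benchmarks: dict, store: dict) -> dict:
    # In production `store` would be a Redis client, with each write a
    # SETEX carrying a ~24h TTL so stale scores expire before the next run.
    for sku, our_price in catalogue.items():
        store[sku] = deal_quality_score(our_price, benchmarks.get(sku, []))
    return store
```

A dict stands in for Redis here so the sketch stays self-contained; the per-SKU score is what the ranking system would read back at serving time with a sub-millisecond key lookup.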

What We're Looking For


  • 3–5 years of data engineering experience, ideally at a startup or product company
  • Strong Python—you write clean, production-grade pipeline code, not just notebooks
  • Experience building and maintaining web scrapers or data collection pipelines at scale
  • Hands-on experience with a workflow orchestrator—Airflow, Prefect, or equivalent
  • Solid SQL experience with ClickHouse or another OLAP database is a strong plus
  • Familiarity with Redis as a serving layer—you understand TTL, key design, and cache invalidation
  • Comfortable with AWS (S3, Glue, ECS)—you can set up infra without needing DevOps help
  • You care about data quality—you monitor pipelines, set up alerts, and feel responsible when something breaks
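The Redis points above (TTL, key design, cache invalidation) can be made concrete with a minimal key-scheme sketch. The naming convention and TTL value are assumptions for illustration only, not the platform's actual schema.

```python
# Illustrative Redis key scheme for serving-time score lookups.
# The "dqs:v1:" prefix and the TTL below are hypothetical choices.

DQS_TTL_SECONDS = 26 * 3600  # a bit over 24h, so keys survive a late batch run

def dqs_key(sku: str) -> str:
    # A versioned prefix makes cache invalidation a prefix bump
    # (write to dqs:v2:*, flip readers over) rather than a full flush.
    return f"dqs:v1:{sku}"

# With a redis-py client, the daily batch would then write:
#   r.setex(dqs_key(sku), DQS_TTL_SECONDS, score)
```

Keeping keys versioned and TTL-bound means a silently broken pipeline degrades to missing keys (detectable at read time) rather than serving stale scores indefinitely.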

Strong Plus (Nice to Have)


  • Experience with e-commerce or marketplace data—price intelligence, product catalogues, category taxonomy
  • Familiarity with recommender system data patterns
  • Experience with Spark or distributed processing for larger datasets
  • Prior work on gaming or consumer mobile products

Location: Gurgaon

Reports to: CTO

Why PS

  • Your pipelines feed the system that directly drives real GMV for our studio partners
  • Small team, fast decisions, no bureaucracy
  • Work directly with the CTO on architecture and direction
  • Direct impact visible within weeks of shipping

How to apply for this opportunity


  • Step 1: Click on Apply and register or log in on our portal.
  • Step 2: Complete the screening form and upload your updated resume.
  • Step 3: Increase your chances of getting shortlisted and meet the client for the interview!

About Uplers:


Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement.

(Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well).

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Job ID: 147480865
