
Duties & Responsibilities
Design, develop, and maintain scalable ETL/ELT data pipelines to support business and analytics needs
Write, tune, and optimize complex SQL queries for data transformation, aggregation, and analysis
Translate business requirements into well-designed, documented, and reusable data solutions
Partner with analysts, data scientists, and stakeholders to deliver accurate, timely, and trusted datasets
Automate data workflows using orchestration/scheduling tools (Airflow, ADF, Luigi, etc.)
Develop unit tests, integration tests, and validation checks to ensure data accuracy and pipeline reliability
Document pipelines, workflows, and design decisions for knowledge sharing and operational continuity
Apply coding standards, version control practices, and peer code reviews to maintain high-quality deliverables
Proactively troubleshoot, optimize, and monitor pipelines for performance, scalability, and cost efficiency
Support feature rollouts, including availability for post-production monitoring and issue resolution
Qualifications
Basic Qualifications
Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field
6-8 years of hands-on experience in data engineering and building data pipelines
At least 3 years of experience writing complex SQL queries in a cloud data warehouse/data lake environment
Solid hands-on experience with data warehousing concepts and implementations
At least 1 year of experience with Snowflake or another modern cloud data warehouse
At least 1 year of hands-on Python development (scripting, OOP, ETL/ELT automation)
Familiarity with data modeling and data warehousing concepts
Experience with orchestration tools (e.g., Airflow, ADF, Luigi)
Familiarity with at least one cloud platform (AWS, Azure, or GCP)
Experience with dbt (data build tool) for data transformations
Familiarity with CI/CD and version control (Git) in data engineering projects
Strong analytical, problem-solving, and communication skills
Ability to work both independently and as part of a collaborative team
Preferred Qualifications
Exposure to real-time/streaming platforms (Kafka, Spark Streaming, Flink)
Exposure to the e-commerce or customer data domain
Understands the technology landscape, stays current on technology trends and emerging tools, and brings new ideas to the team
Job ID: 144008271