Moon#168 - Senior Data Engineer

Experience: 1-5 Years
  • Posted a day ago
  • Over 50 applicants

Job Description

Dynamic Yield, a Mastercard company (Ethoca), is seeking a Senior Data Engineer to join our team in Pune, India. Mastercard is a global technology company committed to building an inclusive, digital economy by making transactions safe, simple, smart, and accessible worldwide. Our culture, driven by our decency quotient (DQ), fosters an environment where everyone can realize their greatest potential.

This highly visible and critical role will appeal to you if you possess an effective combination of domain knowledge, relevant experience, and the ability to execute on details. You will drive data enablement and explore big data solutions within our technology landscape, bringing cutting-edge software development skills with advanced knowledge of cloud and data lake technologies, all while working with massive data volumes. Our teams are small, agile, and focused on the high-growth fintech marketplace, committed to making our systems resilient, responsive, and easily maintainable on the cloud.

Key Responsibilities

As a Senior Data Engineer, you will:

  • Data Pipeline Development: Design, develop, and optimize batch and real-time data pipelines using Snowflake, Snowpark, Python, and PySpark.
  • Data Transformation: Build data transformation workflows using dbt, with a strong focus on Test-Driven Development (TDD) and modular design.
  • CI/CD Implementation: Implement and manage CI/CD pipelines using GitLab and Jenkins, enabling automated testing, deployment, and monitoring of data workflows.
  • Snowflake Object Management: Deploy and manage Snowflake objects using Schema Change, ensuring controlled, auditable, and repeatable releases across environments.
  • Platform Administration: Administer and optimize the Snowflake platform, handling performance tuning, access management, cost control, and platform scalability.
  • DataOps Practices: Drive DataOps practices by integrating testing, monitoring, versioning, and collaboration into every phase of the data pipeline lifecycle.
  • Data Modeling & Analytics: Build scalable and reusable data models that support business analytics and dashboarding in Power BI.
  • Real-time Streaming: Develop and support real-time data streaming pipelines (e.g., using Kafka, Spark Structured Streaming) for near-instant data availability.
  • Data Observability: Establish and implement data observability practices, including monitoring data quality, freshness, lineage, and anomaly detection across the platform.
  • Deployments & Migrations: Plan and own deployments, migrations, and upgrades across data platforms and pipelines to minimize service impacts, including developing and executing mitigation plans.
  • Stakeholder Collaboration: Collaborate with stakeholders to understand data requirements and deliver reliable, high-impact data solutions.
  • Documentation: Document pipeline architecture, processes, and standards, promoting consistency and transparency across the team.
  • Problem Solving: Apply exceptional problem-solving and analytical skills to troubleshoot complex data and system issues.
  • Communication: Demonstrate excellent written and verbal communication skills when collaborating across technical and non-technical teams.

Required Qualifications

  • Education: Bachelor's degree in Computer Science, Computer Engineering, Software Engineering, or a related technical field, including programming.
  • Big Data Expertise: Deep hands-on experience with Snowflake (including administration), Snowpark, and Python.
  • Distributed Processing: Strong background in PySpark and distributed data processing.
  • Data Transformation: Proven track record using dbt for building robust, testable data transformation workflows following TDD.
  • Schema Change: Familiarity with Schema Change for Snowflake object deployment and version control.
  • CI/CD & Automation: Proficient in CI/CD tooling, especially GitLab and Jenkins, with a focus on automation and DataOps.
  • Real-time Data: Experience with real-time data processing and streaming pipelines.
  • Cloud Database Infrastructure: Strong grasp of cloud-based database infrastructure (AWS, Azure, or GCP).
  • Business Intelligence: Skilled in developing insightful dashboards and scalable data models using Power BI.
  • SQL Mastery: Expert in SQL development and performance optimization.
  • Data Observability: Demonstrated success in building and maintaining data observability tools and frameworks.
  • Operational Excellence: Proven ability to plan and execute deployments, upgrades, and migrations with minimal disruption to operations.
  • Soft Skills: Strong communication, collaboration, and analytical thinking across technical and non-technical stakeholders.

Ideally, you will also have:

  • Experience in banking, e-commerce, credit cards, or payment processing.
  • Exposure to both SaaS and on-premises architectures.
  • A post-secondary degree in computer science, mathematics, or quantitative science.

More Info

Open to candidates from: India

About Company

Dynamic Yield by Mastercard enables teams to build personalized, optimized, and synchronized digital customer experiences, enhancing revenue and customer loyalty.

Job ID: 118942681