Dynamic Yield, a Mastercard company, is part of a global technology business at the forefront of the world's fastest payments processing network. We are dedicated to connecting everyone to endless, priceless possibilities by serving as a vehicle for commerce, a link to financial systems for people previously excluded, and a hub for technology innovation. We believe in empowering our employees to be part of something bigger and to change lives.
Our Services organization is a key differentiator for Mastercard, delivering cutting-edge services that help the world's largest organizations make multi-million dollar decisions and grow their businesses. This agile team is focused on end-to-end solutions for a diverse global customer base, centered on data-driven technologies and innovation.
Within the Data & Services Technology Team, the Data Analytics and AI Solutions (DAAI) program is a dynamic and growing initiative. DAAI comprises a rich set of products that provide accurate perspectives on Portfolio Optimization and Ad Insights. We are currently enhancing the customer experience with new user interfaces, moving to API and web application-based data publishing for seamless integration, leveraging new datasets and algorithms to advance analytic capabilities, and building scalable big data processes.
We are seeking an innovative Data Engineer to contribute to the technical design and development of an Analytic Foundation. This foundation is a suite of commercialized analytical capabilities (such as prediction as a service or forecasting as a service) that includes a comprehensive data platform. These services will be offered through APIs, delivering data and insights from a central data store. You'll partner closely with other business areas to build and enhance solutions that drive significant value for our customers.
Engineers at Dynamic Yield work in small, flexible teams where every member contributes to designing, building, and testing features. The work varies from building intuitive UIs to designing backend data models and architecting data flows. If you're ready to join a new, fast-growing engineering team and make a real impact, we want to hear from you!
Position Responsibilities
As a Data Engineer within DAAI, you will:
- Feature Implementation: Play a leading role in implementing complex features that push the boundaries of analytics in powerful, scalable applications.
- Data Model Development: Build and maintain analytics and data models to enable performant and scalable products.
- Code Quality: Ensure a high-quality codebase by writing and reviewing performant, well-tested code.
- Mentorship: Mentor junior engineers and teammates, sharing your expertise and contributing to their growth.
- Process Improvement: Drive innovative improvements to team development processes.
- Stakeholder Collaboration: Partner with Product Managers and Customer Experience Designers to develop a deep understanding of users and use cases, applying that knowledge to scoping and building new modules and features.
- Cross-functional Teamwork: Collaborate across teams with exceptional peers who are passionate about their work.
Ideal Candidate Qualifications
- Experience: 4+ years of data engineering experience in an agile production environment.
- Technical Leadership: Experience leading the design and implementation of large, complex features in full-stack applications.
- Business Acumen: Ability to move easily between business, data management, and technical teams, quickly intuiting the business use case and identifying technical solutions to enable it.
- Analytical Tooling: Experience leveraging open-source tools, predictive analytics, machine learning, advanced statistics, and other data techniques to perform analyses.
- Big Data Proficiency: High proficiency in using Python or Scala, Spark, Hadoop platforms & tools (Hive, Impala, Airflow, NiFi, Sqoop), and SQL to build Big Data products & platforms.
- Production Data Solutions: Experience building and deploying production-grade data-driven applications and data processing workflows/pipelines, and/or implementing machine learning systems at scale in Java, Scala, or Python, including delivering analytics across all phases: data ingestion, feature engineering, modeling, tuning, evaluation, monitoring, and presentation.
- Cloud Technologies: Experience with cloud technologies such as Databricks, AWS, or Azure.
- Technological Aptitude: Strong technologist with a proven track record of learning new technologies and frameworks.
- Customer Focus: A strong customer-centric development approach.
- Problem-Solving: Passion for analytical/quantitative problem-solving.
- Process Optimization: Experience identifying and implementing technical improvements to development processes.
- Collaboration: Excellent collaboration skills with experience working with people across various roles and geographies.
- Personal Drive: Motivation, creativity, self-direction, and a desire to thrive on small project teams.
- Education: Superior academic record with a degree in Computer Science or a related technical field.
- Communication: Strong written and verbal English communication skills.