About Gen
Gen is a global company dedicated to powering Digital Freedom through its trusted consumer brands including Norton, Avast, LifeLock, MoneyLion and more. Our combined heritage is rooted in financial empowerment and cyber safety for the first digital generations, and today we deliver award-winning cybersecurity, online privacy, identity protection and financial wellness solutions to nearly 500 million users in more than 150 countries.
Together, we share a collective passion and vision to protect consumers and help them grow, manage and secure their digital and financial lives. We're always looking for smart, fearless and high-impact talent who see AI as a teammate – leveraging it to move faster and deliver meaningful results.
When you're part of Gen, you'll have the flexibility, tools and support to do your best work and grow your career – from flexible working options and time off to competitive pay, benefits and well-being programs.
At Gen, we are scrappy and relentlessly customer driven. We create room for healthy debate, experimentation and continuous learning, and we seek out people with different experiences, identities and ideas to join our team. You'll work with people who back each other, respect each other and understand that our differences are a competitive advantage.
If this sounds like you, we'd love you to be part of Gen.
About The Role
We are looking for a Data Platform Engineer to join our Data Platform Operations team. In this role, you will own the day‑to‑day operation and monitoring of our data platforms and pipelines, and drive continuous improvements from an operational and maintenance perspective.
You will work closely with data engineers, analysts, and platform teams to ensure our data platforms are stable, observable, cost‑efficient, and ready to support new use cases.
Key Responsibilities
- Operate and monitor data platforms (Azure and on‑prem / legacy) to ensure availability, performance, and reliability.
- Run and support data processing workflows, including incident handling, troubleshooting failed jobs, and coordinating fixes with development teams.
- Implement and improve observability (monitoring, alerting, logging, SLAs/SLOs) for data platforms and pipelines.
- Drive continuous improvements in data platform operations:
  - Simplify and automate recurring operational tasks.
  - Reduce manual interventions and operational risk.
  - Improve deployment, rollback, and change-management practices.
- Collaborate with data engineers and product teams to operationalize new data products and integrate them into existing platform standards.
- Contribute to operational documentation (runbooks, standards, best practices) and knowledge sharing across the data organization.
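To give a flavor of the monitoring work described above, here is a minimal sketch of a check that flags pipeline runs needing attention. All names and thresholds are hypothetical and not tied to any specific Gen tooling:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class JobRun:
    name: str
    status: str                # e.g. "succeeded", "failed", "running"
    started_at: datetime
    finished_at: Optional[datetime] = None

def runs_needing_alert(runs, now, max_runtime=timedelta(hours=2)):
    """Return runs that failed, or are still running past max_runtime."""
    alerts = []
    for run in runs:
        if run.status == "failed":
            alerts.append(run)
        elif run.status == "running" and now - run.started_at > max_runtime:
            alerts.append(run)
    return alerts
```

In practice a check like this would feed an alerting channel rather than return a list, but the shape of the logic (classify each run against an SLO, surface the exceptions) is the same.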
About You
- Hands‑on experience in data platform operations, data engineering, or a related role.
- Strong understanding of data processing concepts (batch, scheduling, dependencies, failures, backfills).
- Practical experience with Azure data services and/or traditional ETL tools.
- Comfortable working with SQL and modern version‑control workflows (branches, pull requests, code reviews).
- Problem‑solving mindset with a focus on stability, reliability, and continuous improvement rather than one‑off fixes.
- Good communication skills; able to work with both technical and non‑technical stakeholders.
- We are looking for someone who either has, or is keen to develop, experience with a modern data stack, ideally including: AWS, Snowflake, dbt / SQLMesh for transformation and data modeling, Git / GitHub and modern CI/CD practices for data and infrastructure, Kubernetes (container-based workloads and operations), Kafka or similar event-streaming platforms, and IaC (infrastructure as code) and platform automation.
- We are looking for someone who has solid scripting / programming skills for automation and tooling, ideally SQL / procedural SQL, Python, and Bash.
- Ability to build small tools and utilities to automate operational tasks, improve monitoring, or integrate systems.
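As an example of the small utilities mentioned above, a common operational task is generating the date partitions for a backfill so a daily batch job can be re-run over a historical window. This is a hypothetical sketch, not a description of any specific Gen system:

```python
from datetime import date, timedelta

def backfill_dates(start, end):
    """Yield each daily partition date from start to end, inclusive.

    Useful for re-submitting a daily batch job over a historical
    window after fixing an upstream failure.
    """
    current = start
    while current <= end:
        yield current
        current += timedelta(days=1)
```

For example, `list(backfill_dates(date(2024, 1, 1), date(2024, 1, 3)))` yields the three dates January 1 through January 3, 2024, which could then be passed one at a time to a scheduler's re-run command.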