

The Role
As a Senior Data Engineer, you will design, build, and optimize data pipelines and models. You'll focus on ingestion, transformation, modelling, quality, and reliability, ensuring that downstream consumers (analytics, services, UI, AI) can depend on trusted, well-structured data. You'll work closely with backend engineers, product teams, data operations, and architects to design data flows that are scalable, observable, and resilient. This role is deeply hands-on and focused on data engineering excellence.
Key Responsibilities:
· Design and implement robust batch and streaming data pipelines.
· Develop and maintain ingestion frameworks for broker feeds and third-party data sources.
· Build and optimize data transformation workflows (ETL/ELT).
· Design scalable data models (warehouse/lakehouse patterns).
· Implement data quality validation, reconciliation, and monitoring.
· Contribute to metadata, lineage, and governance frameworks.
· Optimize performance and cost of data processing workflows.
· Improve reliability through observability, testing, and automated monitoring.
· Collaborate with backend and platform engineers to define data contracts and interfaces.
· Mentor engineers in best practices for data engineering and modelling.
Essential Criteria
· Strong experience building production-grade data pipelines at scale.
· Deep understanding of data modelling (warehouse, lake, semantic layers).
· Experience with distributed data processing and orchestration frameworks.
· Strong SQL and data transformation skills.
· Experience designing data quality frameworks and validation processes.
· Understanding of data architecture patterns (batch vs streaming, medallion architecture, etc.).
· Strong debugging and performance optimization skills.
· Experience working in cloud-native data environments (preferably AWS, but other major cloud platforms are acceptable).
· Experience designing testing strategies for data pipelines.
· Experience with data workflow orchestration tools (e.g., Temporal, Airflow).
· Familiarity with financial or regulatory datasets; experience with financial trading data is a particular advantage.
· Exposure to semantic modelling, governance, or metadata management.
What We Offer
· A chance to work on genuinely interesting and important problems at scale.
· A collaborative, supportive culture that values curiosity, integrity, and technical excellence.
· The opportunity to influence architecture, tooling, and future direction.
· A team of smart, humble, mission-driven people who care about our goals and about one another.
· A modern, cloud-native tech stack and the freedom to bring your engineering voice to the table.
Job ID: 145957461