We are looking for a highly skilled Data Engineer to architect, build, and scale our modern data platform. This role will be responsible for designing robust data models, building scalable ELT pipelines, optimizing Snowflake performance, and establishing engineering best practices across the data team.
You will work cross-functionally with Analytics, Product, and Engineering teams to ensure reliable, high-performance data solutions that power business decisions.
Requirements
- Bachelor's degree in Computer Science, Information Technology, or a related field, with 5+ years of IT experience
- 4+ years of experience in Data Engineering
- Deep hands-on expertise in Snowflake (architecture, optimization, security, RBAC)
- Strong experience with dbt (models, macros, testing, incremental models)
- Advanced SQL skills (complex joins, window functions, query optimization)
- Strong proficiency in Python (data processing, automation, scripting)
- Solid understanding of data modeling concepts (fact/dimension tables, SCD types, normalization vs. denormalization)
- Experience designing production-grade data pipelines
Preferred / Good to Have
- Experience with orchestration tools (Airflow, Dagster, etc.)
- Experience with cloud platforms (AWS / Azure / GCP)
- CI/CD implementation for data workflows
- Experience handling large-scale datasets (TB+ scale)
- Exposure to data governance and compliance standards
Why join us
- Impact: Play a pivotal role in shaping a rapidly growing venture studio.
- Culture: Thrive in a collaborative, innovative environment that values creativity and ownership.
- Growth: Access professional development opportunities and mentorship.
- Benefits: Competitive salary, health/wellness packages, and flexible work options.