Senior Data Engineer (DBT/Snowflake)

0-3 Years

This job is no longer accepting applications

  • Posted 2 months ago
  • Over 500 applicants

Job Description

About the Role

As a Data Engineer, you'll be an integral part of the team responsible for building and maintaining data pipelines and services that empower business insights across dashboards and reports. The role sits at the intersection of data engineering and architecture, working within a modern data stack and leveraging technologies like Snowflake, BigQuery, Airflow, Fivetran, DBT, and Looker.

Key Responsibilities

  • Design, build, and run end-to-end ETL/ELT pipelines in collaboration with Data Analysts
  • Develop scalable and automated data workflows within a Data Mesh architecture
  • Translate business needs into technical requirements (e.g., DBT models, tests, schedules, and reports)
  • Troubleshoot and resolve technical/data pipeline issues as they arise
  • Lead delivery of data models and reports from discovery to deployment
  • Perform exploratory data analysis to identify and prevent data quality issues
  • Optimize data feed availability and performance, including CDC and delta loading techniques
  • Champion data governance through best practices in testing, coding standards, and peer reviews
  • Mentor junior engineers and act as a technical go-to within the team
  • Build Looker dashboards when required
  • Continuously seek ways to enhance pipeline delivery quality and team efficiency

What We're Looking For

  • 3+ years of experience using Snowflake or similar modern data warehouse technology
  • Hands-on expertise with DBT, Apache Airflow, Fivetran, AWS, Git, and Looker
  • Strong command of advanced SQL and performance tuning
  • Experience with custom or SaaS-based data ingestion tools
  • Proven skills in data modeling and optimization of large, complex datasets
  • Background in ETL, data warehousing, data mining, and data pipeline architecture
  • Familiarity with Data Mesh architectures and agile delivery environments (e.g., Scrum)
  • High standards for code quality, including CI/CD, code review, and testing
  • Strong technical documentation and business communication skills
  • Basic understanding of the AWS ecosystem is a plus
  • Fintech or digitally native company experience is desirable
  • Additional skills in Python, data governance tools (e.g., Atlan, Alation, Collibra), or data quality platforms (e.g., Great Expectations, Monte Carlo, Soda) are advantageous

More Info

Job Type:
Employment Type:
Open to candidates from: India

About Company

Tide helps SMEs save time (and money) in running their businesses by offering not only business accounts and related banking services, but also a comprehensive set of highly usable and connected administrative solutions, from invoicing to accounting.

Tide is transforming the small business banking market, with over 1 million members globally across the UK, India and Germany. Using advanced technology, all solutions are designed with SMEs in mind.

With quick onboarding, low fees and innovative features, we thrive on making data-driven decisions to help SMEs save both time and money.

Tide was named one of CB Insights' global Fintech 250 and Beauhurst's Fintech Top 50 for the UK in 2023. Tide has also been recognised with the Great Place to Work certification in 2023 and 2024.

Tide facts:
> Tide is available for UK, Indian and German SMEs
> Over 650,000 UK members and growing rapidly
> Over 500,000 India members since December 2022
> Over £200 million raised in funding
> Over 2000 Tideans globally - we’re diversity champions!
> We have offices in Central London, with a member support and technology centre in Sofia, Bulgaria, a technology centre in Hyderabad, India, and offices in Gurugram and New Delhi.

Job ID: 120002541