Design and implement robust Snowflake data warehouse architectures and ETL pipelines to support business intelligence and advanced analytics use cases.
Lead and mentor a team of data engineers, ensuring high-quality and timely delivery of projects.
Collaborate closely with data analysts, data scientists, and business stakeholders to understand data needs and design effective data models.
Develop, document, and enforce best practices for Snowflake architecture, data modeling, performance optimization, and ETL processes.
Own the optimization of Snowflake environments to ensure low-latency and high-availability data access.
Drive process improvements, evaluate emerging tools, and continuously enhance our data engineering infrastructure.
Ensure data pipelines are built with high levels of accuracy, completeness, and security, in compliance with data privacy regulations (GDPR, CCPA, etc.).
Partner with cloud engineering and DevOps teams to integrate data solutions seamlessly within the AWS ecosystem.
Participate in capacity planning, budgeting, and resource allocation for the data engineering function.
What we need from you
Bachelor's degree in Computer Science, Information Technology, or a related field.
9+ years of overall experience in cloud-based data engineering, with at least 4 years of hands-on experience in Snowflake.
Proven track record in designing, deploying, and managing Snowflake data platforms at scale.
Expertise in AWS services such as S3, Redshift, Lambda, and Glue.
Strong command of SQL and Python for data manipulation and pipeline development.
Experience managing and mentoring a team of engineers and leading cross-functional initiatives.
Strong understanding of data governance, data security, and compliance frameworks (e.g., GDPR, CCPA).
Excellent problem-solving and communication skills.