Actively participate in chapter ceremonies and contribute to project planning and estimation.
Coordinate work with product managers, data owners, platform teams, and other stakeholders throughout the SDLC.
Use Airflow, Python, Snowflake, dbt, and related technologies to enhance and maintain EDP acquisition, ingestion, processing, orchestration, and data quality (DQ) frameworks (see the Airflow sketch after this list).
Adopt new tools and technologies to enhance framework capabilities.
Build and conduct end-to-end tests to ensure production operations run successfully after every release cycle.
Document and present accomplishments and challenges to internal and external stakeholders.
Demonstrate deep understanding of modern data engineering tools and best practices.
Design and build solutions that are performant, consistent, and scalable.
Contribute to design decisions for complex systems.
Provide L2/L3 support for technical and operational issues.
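As an illustration of the orchestration work described above, here is a minimal Airflow sketch of an ingest-transform-DQ pipeline. The DAG id, ingest script, dbt project path, and table names are hypothetical placeholders, not part of any actual EDP framework, and the sketch assumes Airflow 2.4+ (for the schedule parameter).

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator
    from airflow.operators.python import PythonOperator

    def check_row_count(**context):
        # Hypothetical DQ gate: a real framework would query Snowflake here
        # (e.g., via the Snowflake provider hook) and fail the task if the
        # ingested table misses its row-count threshold.
        ...

    with DAG(
        dag_id="edp_daily_ingest",  # illustrative name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        ingest = BashOperator(
            task_id="ingest_raw",
            bash_command="python ingest.py --source sales --target raw.sales",  # hypothetical script
        )
        transform = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --project-dir /opt/dbt/edp",  # hypothetical path
        )
        dq_check = PythonOperator(
            task_id="dq_row_count",
            python_callable=check_row_count,
        )

        # Ingestion feeds the dbt transformation, which is gated by the DQ check.
        ingest >> transform >> dq_check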
Qualifications
5+ years of experience as a data engineer
Expertise with SQL, stored procedures, and UDFs (see the Snowpark sketch after this list)
Advanced-level Python or core Java programming
Experience with Snowflake or similar cloud-native databases
Experience with orchestration tools, especially Airflow
Experience with declarative transformation tools like dbt
Experience with Azure services, especially ADLS (or equivalent)
Exposure to real-time streaming platforms and message brokers (e.g., Snowpipe Streaming, Kafka)
Experience with Agile development concepts and related tools (ADO, Aha!)
Experience conducting root cause analysis and resolving issues
Experience with performance tuning
Excellent written and verbal communication skills
Ability to operate in a matrixed organization and fast-paced environment
Strong interpersonal skills with a can-do attitude under challenging circumstances
Bachelor's degree in computer science is strongly preferred
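To make the SQL/UDF expectation concrete, below is a minimal Snowpark sketch that registers a scalar UDF and applies it in a query. The connection parameters, UDF name, and table are hypothetical placeholders; the sketch assumes the snowflake-snowpark-python package is installed.

    from snowflake.snowpark import Session
    from snowflake.snowpark.functions import call_udf, col
    from snowflake.snowpark.types import StringType

    # Hypothetical connection parameters; a real deployment would load
    # credentials from a secrets manager rather than hard-coding them.
    session = Session.builder.configs({
        "account": "<account>", "user": "<user>", "password": "<password>",
        "warehouse": "<warehouse>", "database": "<db>", "schema": "<schema>",
    }).create()

    # Register a simple scalar UDF that normalizes region codes.
    session.udf.register(
        lambda region: (region or "").strip().upper(),
        return_type=StringType(),
        input_types=[StringType()],
        name="normalize_region",  # illustrative name
        replace=True,
    )

    # Apply the UDF to a hypothetical raw table.
    df = session.table("raw.sales").with_column(
        "region_norm", call_udf("normalize_region", col("region"))
    )
    df.show()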