Architecture & Strategy
- Own a significant portion of the data platform architecture, ensuring scalability, performance, reliability, and security.
- Define technical standards and best practices for data modeling, transformation, orchestration, governance, and lifecycle management.
- Evaluate and integrate modern data technologies that align with the long-term platform strategy.
- Collaborate with engineering and product leadership to shape the technical roadmap.
Engineering & Delivery
- Design, build, and manage scalable, resilient data pipelines for batch, streaming, and event-driven workloads.
- Develop high-quality data models and schemas to support analytics, BI, operational systems, and ML workflows.
- Implement data quality, lineage, observability, and automated testing frameworks.
- Build ingestion patterns for APIs, event streams, files, and third-party data sources.
- Optimize compute, storage, and transformation layers for performance and cost efficiency.
Leadership & Mentorship
- Serve as a senior technical leader and mentor within the data engineering team.
- Lead architecture reviews, design discussions, and cross-team engineering initiatives.
- Guide analysts, data scientists, software engineers, and product owners in delivering robust data solutions.
- Communicate architectural decisions and trade-offs to both technical and non-technical stakeholders.
Required Qualifications & Skills
- 8+ years of experience in data engineering with proven architectural ownership.
- Expert-level experience with Snowflake, including performance optimization, data modeling, security, and its broader ecosystem.
- Expert proficiency in SQL and strong Python skills for pipeline development and automation.
- Experience with modern orchestration tools such as Airflow, Dagster, Prefect, or equivalents.
- Strong understanding of ELT/ETL patterns, distributed processing, and data lifecycle management.
- Familiarity with streaming/event technologies (Kafka, Kinesis, Pub/Sub).
- Experience implementing data quality, observability, and lineage solutions.
- Solid understanding of cloud infrastructure (AWS, GCP, or Azure).
- Strong background in DataOps practices: CI/CD, testing, version control, automation.
- Proven leadership in driving architectural direction and mentoring engineering teams.
Nice to Have
- Experience with data governance or metadata management tools.
- Hands-on experience with dbt (modeling, testing, documentation, advanced features).
- Exposure to machine learning pipelines, feature stores, or MLOps.
- Experience with Terraform, CloudFormation, or other IaC tools.
- Experience designing systems for high scale, security, or regulated environments.