3+ years of experience in data engineering or data product operations
Proficiency in Python and SQL, especially in Snowflake and BigQuery
Experience with web scraping frameworks and data governance
Ability to work with unstructured or alternative data sources
Competence in deploying solutions on Google Cloud Platform (GCP), particularly BigQuery and Cloud Functions, along with experience using Snowflake for data modeling and performance tuning
Knowledge of frontend/backend development (React, APIs, Python Flask or FastAPI, databases, cloud technologies) is a plus
Skills in ETL/ELT pipeline development and automated workflows
Strong problem-solving skills and attention to detail
Expertise in detecting anomalies, data drift, and schema changes
A product mindset toward datasets: defining SLAs, roadmaps, and metrics that drive change
Project management experience using Agile, Kanban, or similar methodologies
Excellent documentation skills for technical specs, runbooks, and SOPs
Effective communication skills for collaboration between data engineering and business teams
Leadership readiness, demonstrated through mentoring, roadmap planning, and team coordination
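As one illustration of the anomaly-detection and schema-change monitoring skills listed above, here is a minimal sketch in Python; the function names, type labels, and thresholds are illustrative assumptions, not part of this posting:

```python
import statistics

def detect_schema_drift(expected: dict, observed: dict) -> dict:
    """Compare an expected column->type mapping against an observed one.

    Returns columns that disappeared, columns that were added, and
    columns whose declared type changed.
    """
    shared = set(expected) & set(observed)
    return {
        "missing": sorted(set(expected) - set(observed)),
        "added": sorted(set(observed) - set(expected)),
        "type_changed": sorted(c for c in shared if expected[c] != observed[c]),
    }

def detect_anomalies(values: list[float], threshold: float = 3.0) -> list[int]:
    """Flag indices whose z-score exceeds the given threshold."""
    mean = statistics.fmean(values)
    stdev = statistics.stdev(values)
    if stdev == 0:
        return []
    return [i for i, v in enumerate(values) if abs(v - mean) / stdev > threshold]
```

In practice these checks would run inside a scheduled pipeline (e.g. a Cloud Function triggered on each load) and feed the SLA metrics the role describes; the z-score rule is a stand-in for whatever detection method the team actually uses.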