Responsibilities
- Design and implement modern data architectures: lakehouses, data meshes, and streaming pipelines
- Build transformation layers with dbt and orchestrate them with Airflow (see the sketch after this list)
- Implement data quality frameworks and governance practices
- Create dashboards and analytics solutions using Power BI, Looker, or Tableau
- Design KPI frameworks and data storytelling approaches for engineering metrics
- Collaborate with AI/ML teams to ensure data readiness for model training and inference
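
For illustration only, a minimal dbt-plus-Airflow pipeline in this role might resemble the sketch below. It assumes Airflow 2.4+ and a dbt project at `/opt/dbt`; the DAG id, schedule, and paths are placeholders, not requirements of the position.

```python
# A minimal sketch, assuming Airflow 2.4+ and a dbt project at /opt/dbt.
# All names, paths, and the schedule are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_transformations",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Build the transformation layer with dbt.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )
    # Validate the models with dbt's built-in tests (data quality gate).
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )
    # Tests run only after the transformation layer has been rebuilt.
    dbt_run >> dbt_test
```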
Essential Skills
- Data platforms: Databricks, Snowflake, BigQuery, Redshift
- Streaming: Apache Kafka, Flink, Spark Streaming (see the sketch after this list)
- Transformation: dbt, Apache Spark, SQL
- Orchestration: Apache Airflow, Dagster, Prefect
- Python for data engineering and testing
- BI tools: Power BI, Looker, Tableau
- Data modelling and lakehouse architecture
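
As a rough indication of the streaming and lakehouse skills above, the sketch below lands a Kafka topic into a bronze Delta table. It assumes Spark 3.x with the spark-sql-kafka and delta-spark packages available; the broker address, topic name, and storage paths are illustrative.

```python
# A minimal sketch, assuming PySpark 3.x with the spark-sql-kafka and
# delta-spark packages on the classpath. Broker, topic, and paths are
# illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("sensor-ingest").getOrCreate()

# Read raw events from a Kafka topic as an unbounded stream.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "sensor-readings")
    .load()
    # Kafka delivers key/value as binary; cast to strings for downstream use.
    .select(col("key").cast("string"), col("value").cast("string"))
)

# Append the stream to a bronze Delta table for later dbt transformation.
query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/chk/sensor-readings")
    .outputMode("append")
    .start("/lake/bronze/sensor_readings")
)
query.awaitTermination()
```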
Experience
- 5–10 years in data engineering; background in industrial or manufacturing data environments preferred