
zorba ai

Data Engineer (4+ Years)

  • Posted 4 hours ago

Job Description

Role Overview

We are seeking a highly skilled Data Engineer / Data Analyst to enhance and scale our reporting and analytics ecosystem. This role will focus on designing, building, and optimizing data pipelines, data models, and reporting solutions that power enterprise analytics.

The ideal candidate brings strong expertise in Snowflake, Databricks, Power BI, and Cognos, and thrives in a global, collaborative environment.

Key Responsibilities

  • Design, develop, and optimize scalable ETL/ELT pipelines using Talend, Databricks, Azure Data Factory, and Snowflake
  • Translate business and functional requirements into robust technical solutions
  • Drive performance tuning, cost optimization, and scalability of data pipelines
  • Build and maintain Power BI datasets, dashboards, and reports for actionable insights
  • Ensure data governance, security, and privacy compliance across all pipelines
  • Collaborate with global stakeholders to gather reporting requirements and deliver solutions
  • Develop and maintain technical and functional documentation
  • Implement and manage CI/CD pipelines for data workflows and reporting solutions
  • Use GitHub for version control and collaborative development
  • Monitor, troubleshoot, and ensure high availability of data workflows
  • Support solution design, testing, and deployment phases

Required Skills & Qualifications

  • Strong expertise in Snowflake (data modeling, query optimization, performance tuning)
  • Advanced Power BI skills (DAX, dashboard design, data modeling, performance optimization)
  • Proficiency in SQL and relational database concepts
  • Experience with CI/CD pipelines and GitHub
  • Hands-on experience with Azure Data Factory, Databricks, Synapse, and Cognos
  • Familiarity with Python or other scripting languages
  • Strong understanding of data warehousing and reporting architectures
  • Ability to work effectively in a global, cross-functional environment

Preferred Qualifications

  • 4+ years of ETL experience in large-scale data engineering projects
  • Experience in Data Engineering with DevOps practices
  • Proficiency in SQL, Python/PySpark, Shell scripting, or Scala
  • Hands-on experience with Azure Functions, Logic Apps, and Spark platforms

Nice to Have

  • Exposure to cloud-based analytics ecosystems (Azure preferred)
  • Experience working with multi-region/global teams
  • Knowledge of data governance frameworks and best practices

Skills: SQL, Snowflake, Cognos, Databricks, Power BI


Job ID: 147276851