
  • Posted 5 months ago

Job Description

Exp: 5 - 12 Yrs

Work Mode: Hybrid

Location: Bangalore, Chennai, Kolkata, Pune and Gurgaon

Primary Skills: Snowflake, SQL, DWH, Power BI, ETL and Informatica.

We are seeking a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools to join our Data Engineering team. The ideal candidate will have 5+ years of experience designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions.

Key Responsibilities

  • Experience: 5+ years of hands-on experience in Data Engineering, with a focus on Data Warehousing, Business Intelligence, and related technologies.
  • Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and DBT for efficient ELT (Extract, Load, Transform) processes across various data sources.
  • SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
  • Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using DBT. Design and optimize high-performance data architectures.
  • Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions.
  • Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards.
  • Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions. Create and maintain clear documentation for data processes, data models, and pipelines.
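For candidates unfamiliar with the term, the SCD Type-2 pattern mentioned above (which dbt snapshots automate) keeps full change history by closing the current dimension row and inserting a new one. A minimal illustration of the underlying logic, using SQLite and an assumed `dim_customer` schema:

```python
import sqlite3

# Sketch of SCD Type-2 history tracking. Table and column names are
# illustrative assumptions, not a real schema from the role.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        valid_from  TEXT,
        valid_to    TEXT,      -- NULL marks the current row
        is_current  INTEGER
    );
    INSERT INTO dim_customer VALUES (1, 'Pune', '2024-01-01', NULL, 1);
""")

def apply_change(conn, customer_id, new_city, change_date):
    """Close the current row and open a new one (SCD Type-2)."""
    cur = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id = ? AND is_current = 1",
        (customer_id,),
    )
    row = cur.fetchone()
    if row and row[0] != new_city:
        conn.execute(
            "UPDATE dim_customer SET valid_to = ?, is_current = 0 "
            "WHERE customer_id = ? AND is_current = 1",
            (change_date, customer_id),
        )
        conn.execute(
            "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
            (customer_id, new_city, change_date),
        )

apply_change(conn, 1, 'Chennai', '2024-06-01')
rows = conn.execute(
    "SELECT city, valid_from, valid_to, is_current "
    "FROM dim_customer WHERE customer_id = 1 ORDER BY valid_from"
).fetchall()
print(rows)
```

In dbt itself this bookkeeping is declared rather than hand-written: a snapshot block with a `unique_key` and a `timestamp` or `check` strategy produces equivalent validity columns.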

Skills & Qualifications

  • Expertise in Snowflake for data warehousing and ELT processes.
  • Strong proficiency in SQL for relational databases and writing complex queries.
  • Experience with Informatica PowerCenter for data integration and ETL development.
  • Experience using Power BI for data visualization and business intelligence reporting.
  • Experience with Fivetran for automated ELT pipelines.
  • Familiarity with Sigma Computing, Tableau, Oracle, and DBT.
  • Strong data analysis, requirement gathering, and mapping skills.
  • Familiarity with cloud services such as Azure (RDBMS, Databricks, ADF); experience with AWS or GCP is a plus.
  • Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
  • Proficiency in Python for data processing (other languages like Java, Scala are a plus).
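As a sense of the "complex queries" the role expects, the sketch below combines a CTE with a window function to rank daily order totals per region. It runs against SQLite for portability; the `orders` schema is an assumption for illustration, and the same SQL pattern carries over to Snowflake:

```python
import sqlite3

# Assumed toy schema: one row per order, with region, day, and amount.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (region TEXT, order_day TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('South', '2024-06-01', 100.0),
        ('South', '2024-06-01', 250.0),
        ('South', '2024-06-02', 90.0),
        ('North', '2024-06-01', 300.0);
""")

query = """
WITH daily AS (
    -- Aggregate orders to one total per region per day
    SELECT region, order_day, SUM(amount) AS total
    FROM orders
    GROUP BY region, order_day
)
SELECT region, order_day, total,
       RANK() OVER (PARTITION BY region ORDER BY total DESC) AS day_rank
FROM daily
ORDER BY region, day_rank;
"""
results = list(conn.execute(query))
for row in results:
    print(row)
```

The CTE keeps the aggregation readable, and the window function ranks within each region without a self-join.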

Skills: dwh,gcp,aws,snowflake,airflow,snowpipe,data analysis,sql,data architect,tableau,performance tuning,pipelines,oracle,etl,data modeling,azure,python,dbt,azkaban,power bi,fivetran,sigma computing,data warehousing,luigi,informatica

More Info

Job ID: 126899215