
VidPro Consultancy Services

Snowflake Developer (Architect)

  • Posted 9 months ago

Job Description

Exp: 8 - 14 Yrs

Work Mode: Hybrid

Location: Bangalore, Chennai, Kolkata, Pune and Gurgaon

Primary Skills: Architect, Snowflake, Snowpipe, SQL, DWH, Power BI, ETL and Informatica.

We are seeking a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools to join our Data Engineering team. The ideal candidate will have 8+ years of experience in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions.

Key Responsibilities

  • At least 5 years of hands-on experience in Data Engineering, with a focus on Data Warehousing, Business Intelligence, and related technologies.
  • Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and DBT for efficient ELT processes (Extract, Load, Transform) across various data sources.
  • SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
  • Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using DBT. Design and optimize high-performance data architectures.
  • Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions.
  • Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards.
  • Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions.
  • Create and maintain clear documentation for data processes, data models, and pipelines.
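In dbt, the SCD Type-2 pattern mentioned above is typically handled with snapshots; as a rough illustration of the underlying logic, here is a minimal hand-rolled sketch in Python with SQLite (table and column names are hypothetical, not from this posting): when a tracked attribute changes, the current dimension row is closed out and a new current row is inserted, preserving history.

```python
import sqlite3

# Illustrative SCD Type-2 dimension: valid_to IS NULL marks the current row.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_id INTEGER,
    city        TEXT,
    valid_from  TEXT,
    valid_to    TEXT,     -- NULL = current version
    is_current  INTEGER
);
INSERT INTO dim_customer VALUES (1, 'Pune', '2024-01-01', NULL, 1);
""")

def apply_scd2(conn, customer_id, new_city, as_of):
    """Close the current version if the tracked attribute changed,
    then insert the new version as the current row."""
    row = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,),
    ).fetchone()
    if row and row[0] == new_city:
        return  # no change: keep the existing current row
    conn.execute(
        "UPDATE dim_customer SET valid_to=?, is_current=0 "
        "WHERE customer_id=? AND is_current=1",
        (as_of, customer_id),
    )
    conn.execute(
        "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
        (customer_id, new_city, as_of),
    )

apply_scd2(conn, 1, "Bangalore", "2024-06-01")
print(conn.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY valid_from"
).fetchall())
# [('Pune', 0), ('Bangalore', 1)]
```

In production this close-and-insert step would run as a single `MERGE` (or a dbt snapshot) against Snowflake rather than row-by-row Python; the sketch only shows the versioning rule itself.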

Skills & Qualifications

  • Expertise in Snowflake for data warehousing and ELT processes.
  • Strong proficiency in SQL for relational databases and writing complex queries.
  • Experience with Informatica PowerCenter for data integration and ETL development.
  • Experience using Power BI for data visualization and business intelligence reporting.
  • Experience with Fivetran for automated ELT pipelines.
  • Familiarity with Sigma Computing, Tableau, Oracle, and DBT.
  • Strong data analysis, requirement gathering, and mapping skills.
  • Familiarity with cloud services such as Azure (RDBMS, Databricks, ADF), AWS, or GCP.
  • Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
  • Proficiency in Python for data processing (other languages like Java, Scala are a plus).

Education

Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field.


More Info


Job ID: 111705483