Required Work Experience:
- Overall: 7-12 years
- Total: 6+ years in Data Engineering, Data Warehousing, and SQL
- Snowflake: 3+ years hands-on experience
Primary Responsibilities:
- Design and implement scalable Data Warehouse solutions using Snowflake as the core platform.
- Understand the differences between Snowflake and traditional data warehouse systems (e.g., separation of storage and compute).
- Develop and optimize complex SQL queries for analytical workloads, reporting, and data transformation.
- Build and maintain ELT/ETL pipelines using Snowflake-native features and external orchestration tools.
- Apply dimensional modeling techniques (Star/Snowflake schemas) to support BI and analytics use cases.
- Perform query profiling, performance tuning, and cost optimization in Snowflake.
- Implement and manage Snowflake objects, including tables, views, materialized views, streams, tasks, UDFs, and stored procedures.
- Ensure data quality, lineage, and governance across the data lifecycle.
- Collaborate with cross-functional teams to understand data requirements and translate them into efficient data models and pipelines.
- Troubleshoot and resolve issues related to data ingestion, transformation, and query performance.
- Stay current with Snowflake features such as Time Travel, Zero-Copy Cloning, Search Optimization, and Streams/Tasks (a short sketch using streams, tasks, and Time Travel follows this list).
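To make the streams/tasks and Time Travel items above concrete, here is a minimal sketch, assuming hypothetical RAW_ORDERS and ORDERS_CURATED tables and a TRANSFORM_WH warehouse; the schedule and column names are illustrative, not a prescribed design.

```sql
-- Minimal sketch: all table, warehouse, and column names here are assumptions.
-- Capture row-level changes (CDC) on the source table.
CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw_orders;

-- Scheduled task that merges stream contents only when changes exist.
CREATE OR REPLACE TASK merge_orders_task
  WAREHOUSE = transform_wh
  SCHEDULE  = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
AS
  MERGE INTO orders_curated t
  USING raw_orders_stream s
    ON t.order_id = s.order_id
  WHEN MATCHED THEN UPDATE SET t.amount = s.amount, t.updated_at = s.updated_at
  WHEN NOT MATCHED THEN
    INSERT (order_id, amount, updated_at) VALUES (s.order_id, s.amount, s.updated_at);

ALTER TASK merge_orders_task RESUME;  -- tasks are created in a suspended state

-- Time Travel: query the curated table as it existed one hour ago.
SELECT COUNT(*) FROM orders_curated AT (OFFSET => -3600);
```

Zero-copy cloning follows the same pattern, e.g. CREATE TABLE orders_dev CLONE orders_curated.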
Primary Technical Skills (Must-Have):
- Advanced SQL: joins, window functions, CTEs, recursive queries, analytical functions (see the first sketch after this list)
- Snowflake SQL & Architecture: virtual warehouses, micro-partitions, clustering, caching, scaling, max concurrency
- Data Modeling: Star/Snowflake schemas, normalization/denormalization strategies
- Performance Tuning: query optimization, warehouse sizing, result caching
- ETL/ELT Development: Snowflake-native features such as stages, plus external orchestration tools (ADF, Airflow, etc.)
- Stored Procedures & Functions: SQL- and JavaScript-based scripting in Snowflake, including exception handling (see the second sketch after this list)
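As a first sketch of the advanced-SQL bar described above, the query below combines a CTE, window functions, and a recursive CTE; the SALES table and its columns are hypothetical.

```sql
-- Hypothetical SALES(region, sale_date, amount) table.
WITH regional AS (  -- CTE: pre-aggregate to one row per region and day
  SELECT region, sale_date, SUM(amount) AS daily_total
  FROM sales
  GROUP BY region, sale_date
)
SELECT
  region,
  sale_date,
  daily_total,
  -- Window function: 7-day moving average within each region
  AVG(daily_total) OVER (
    PARTITION BY region ORDER BY sale_date
    ROWS BETWEEN 6 PRECEDING AND CURRENT ROW
  ) AS moving_avg_7d,
  -- Analytical function: rank days by revenue within each region
  RANK() OVER (PARTITION BY region ORDER BY daily_total DESC) AS day_rank
FROM regional
ORDER BY region, sale_date;

-- Recursive CTE: a 30-day date spine, useful for filling gaps in sparse data.
WITH RECURSIVE dates AS (
  SELECT CURRENT_DATE - 29 AS d
  UNION ALL
  SELECT d + 1 FROM dates WHERE d < CURRENT_DATE
)
SELECT d FROM dates;
```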
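The second sketch shows a SQL-based stored procedure with exception handling in Snowflake Scripting; the procedure name and the tables it touches are assumptions for illustration.

```sql
-- Hypothetical procedure: reload a staging table, with basic exception handling.
CREATE OR REPLACE PROCEDURE reload_staging()
RETURNS STRING
LANGUAGE SQL
AS
$$
BEGIN
  TRUNCATE TABLE staging_orders;      -- assumed staging table
  INSERT INTO staging_orders
    SELECT * FROM raw_orders;         -- assumed source table
  RETURN 'Reload succeeded';
EXCEPTION
  WHEN OTHER THEN
    RETURN 'Reload failed: ' || SQLERRM;  -- surface the error message
END;
$$;

CALL reload_staging();
```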
Secondary Technical Skills (Good or Nice to Have):
- Azure Data Factory (ADF): pipelines, triggers, ADLS Gen2, Integration Runtime (IR), Linked Services
- Python or PySpark for data transformation and automation
- Snowpipe & Streams/Tasks for real-time and CDC-based ingestion
- Data Security: row-level security, masking policies, encryption key hierarchy (see the sketch after this list)
- DevOps Tools: Git, CI/CD, Terraform (basic understanding)
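For the data-security item above, a minimal masking-policy sketch; the EMPLOYEES table, EMAIL column, and ANALYST_FULL role are illustrative assumptions.

```sql
-- Illustrative only: unmask emails solely for an assumed ANALYST_FULL role.
CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() = 'ANALYST_FULL' THEN val
    ELSE '***MASKED***'
  END;

-- Attach the policy to a column; queries from other roles see the masked value.
ALTER TABLE employees MODIFY COLUMN email SET MASKING POLICY email_mask;
```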
Soft Skills & Methodologies:
- Strong analytical and problem-solving skills
- Excellent communication and documentation abilities
- Experience working in Agile/Scrum environments
- Ability to lead and mentor junior team members
- Comfortable working in fast-paced, multi-project environments
Education & Certifications:
- Bachelor's or Master's in Computer Science, Engineering, or related field
- SnowPro Core Certification is a plus