Senior Snowflake Developer JD
About QualMinds
At QualMinds, we design & develop world-class digital products and custom software for startups, scale-ups, and enterprises with our expert engineering team.
We bring together frontend, backend, full-stack, QA automation, UI/UX, DevOps, data science, machine learning, and agile experts to help our clients accelerate their pursuit of digital excellence.
Our engineering community at QualMinds is passionate about building customer-centric software with the highest quality, performance, security, and scalability, and the lowest failure rates.
We are looking for an experienced Snowflake Developer / Senior Data Engineer with strong expertise in building scalable data pipelines, optimizing Snowflake environments, and delivering high-quality data solutions. The ideal candidate will have 5-10 years of hands-on experience in data engineering, with a deep understanding of Snowflake, SQL, and cloud-based ETL tools.
Key Responsibilities
Snowflake Development & Architecture
- Design and develop scalable Snowflake data models including fact, dimension, and staging layers.
- Define and implement clustering keys, partition strategies, and effective micro-partition usage for large tables.
- Optimize Snowflake storage and compute architecture based on business workloads.
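For illustration, a minimal sketch of the modelling and clustering work this covers; all table, column, and key names below are hypothetical, not taken from a specific project.

```sql
-- Raw landing table for semi-structured source data (loaded by Snowpipe; see the ingestion sketch below)
CREATE TABLE IF NOT EXISTS stg_orders (
    raw_payload VARIANT
);

-- Simple customer dimension with a surrogate key
CREATE TABLE IF NOT EXISTS dim_customer (
    customer_sk   NUMBER AUTOINCREMENT,
    customer_id   STRING,
    customer_name STRING,
    valid_from    TIMESTAMP_NTZ,
    valid_to      TIMESTAMP_NTZ
);

-- Large fact table clustered on the most common filter column,
-- so micro-partition pruning stays effective as the table grows
CREATE TABLE IF NOT EXISTS fact_sales (
    sale_id     NUMBER,
    customer_sk NUMBER,
    sale_date   DATE,
    amount      NUMBER(12,2)
)
CLUSTER BY (sale_date);

-- Check how well the clustering key is holding up
SELECT SYSTEM$CLUSTERING_INFORMATION('fact_sales', '(sale_date)');
```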
Data Ingestion & Pipeline Development
- Build and maintain ETL/ELT pipelines using Snowflake SQL, Snowpipe, Azure Data Factory (ADF), and Matillion.
- Implement incremental / delta load mechanisms and ensure data freshness.
- Perform data validation, deduplication, and consistency checks across sources and targets.
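A sketch of a typical ingestion path for this role, pairing Snowpipe auto-ingest with a deduplicating MERGE for incremental loads. Stage, pipe, and payload field names are assumptions for illustration, and the tables come from the modelling sketch above.

```sql
-- Continuous ingestion from an external stage into the raw landing table
-- (auto-ingest additionally requires cloud event notifications to be configured)
CREATE PIPE IF NOT EXISTS pipe_orders
  AUTO_INGEST = TRUE
AS
  COPY INTO stg_orders
  FROM @ext_stage_orders
  FILE_FORMAT = (TYPE = 'JSON');

-- Incremental (delta) load: deduplicate staged rows, then upsert into the target
MERGE INTO fact_sales AS t
USING (
    SELECT
        raw_payload:sale_id::NUMBER      AS sale_id,
        raw_payload:customer_sk::NUMBER  AS customer_sk,
        raw_payload:sale_date::DATE      AS sale_date,
        raw_payload:amount::NUMBER(12,2) AS amount
    FROM stg_orders
    -- keep only the latest record per sale_id (assumes an updated_at field in the payload)
    QUALIFY ROW_NUMBER() OVER (
        PARTITION BY raw_payload:sale_id
        ORDER BY raw_payload:updated_at::TIMESTAMP_NTZ DESC
    ) = 1
) AS s
ON t.sale_id = s.sale_id
WHEN MATCHED THEN UPDATE SET t.customer_sk = s.customer_sk,
                             t.sale_date   = s.sale_date,
                             t.amount      = s.amount
WHEN NOT MATCHED THEN INSERT (sale_id, customer_sk, sale_date, amount)
                      VALUES (s.sale_id, s.customer_sk, s.sale_date, s.amount);
```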
SQL & Stored Procedure Development
- Write complex SQL queries, views, and stored procedures using Snowflake Scripting and JavaScript (for Snowflake stored procedures).
- Conduct SQL performance tuning and query optimization.
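As an example of the scripting involved, a minimal Snowflake Scripting procedure; it assumes a hypothetical prepared view v_sales_prepared and the fact_sales table sketched earlier.

```sql
CREATE OR REPLACE PROCEDURE load_daily_sales(load_date DATE)
RETURNS STRING
LANGUAGE SQL
AS
$$
DECLARE
    rows_inserted INTEGER DEFAULT 0;
BEGIN
    -- Load one day's worth of prepared data into the fact table
    INSERT INTO fact_sales (sale_id, customer_sk, sale_date, amount)
    SELECT sale_id, customer_sk, sale_date, amount
    FROM   v_sales_prepared          -- hypothetical prepared/cleansed view
    WHERE  sale_date = :load_date;

    rows_inserted := SQLROWCOUNT;    -- rows affected by the last DML statement
    RETURN 'Loaded ' || rows_inserted || ' rows for ' || TO_VARCHAR(load_date);
END;
$$;

CALL load_daily_sales(TO_DATE('2024-01-31'));
```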
Performance Tuning & Optimization
- Analyze and troubleshoot slow-running queries using Query Profile and Snowflake monitoring tools.
- Optimize performance through warehouse sizing, caching strategies, micro-partition pruning, and query structure improvements.
- Ensure efficient resource consumption and cost optimization.
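The kind of tuning workflow this implies, sketched against the SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY view; the warehouse name is hypothetical and access to the ACCOUNT_USAGE share is assumed.

```sql
-- Slowest queries over the last 24 hours, with micro-partition pruning effectiveness
SELECT query_id,
       total_elapsed_time / 1000 AS elapsed_s,
       partitions_scanned,
       partitions_total,
       query_text
FROM   snowflake.account_usage.query_history
WHERE  warehouse_name = 'ANALYTICS_WH'                        -- hypothetical warehouse
  AND  start_time > DATEADD('hour', -24, CURRENT_TIMESTAMP())
ORDER BY total_elapsed_time DESC
LIMIT 20;

-- Right-size the warehouse and keep idle cost down
ALTER WAREHOUSE analytics_wh SET
    WAREHOUSE_SIZE = 'MEDIUM',
    AUTO_SUSPEND   = 60,    -- suspend after 60 seconds of inactivity
    AUTO_RESUME    = TRUE;
```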
Security, Governance & Access Control
- Implement and maintain RBAC (Role-Based Access Control).
- Manage roles, users, privileges, and secure data sharing.
- Ensure compliance with organizational security standards and best practices.
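A minimal RBAC sketch of the grants this covers; role, database, schema, and user names are illustrative.

```sql
-- Read-only analyst role, granted to users and rolled up under SYSADMIN
CREATE ROLE IF NOT EXISTS analyst_ro;

GRANT USAGE  ON DATABASE analytics                          TO ROLE analyst_ro;
GRANT USAGE  ON SCHEMA   analytics.reporting                TO ROLE analyst_ro;
GRANT SELECT ON ALL TABLES    IN SCHEMA analytics.reporting TO ROLE analyst_ro;
GRANT SELECT ON FUTURE TABLES IN SCHEMA analytics.reporting TO ROLE analyst_ro;

GRANT ROLE analyst_ro TO ROLE sysadmin;   -- keep the role hierarchy rooted under SYSADMIN
GRANT ROLE analyst_ro TO USER jane_doe;   -- hypothetical user
```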
Monitoring, Troubleshooting & Operations
- Implement robust error handling including logging, retry mechanisms, and alerting.
- Monitor pipeline performance and proactively identify issues.
- Collaborate with cross-functional teams for requirement gathering, solution design, and production support.
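A sketch of the error-handling and scheduling pattern described here, assuming a hypothetical etl_error_log table, a hypothetical notification integration, and the load procedure sketched earlier.

```sql
-- Wrap the load in basic error handling: log failures, then re-raise
CREATE OR REPLACE PROCEDURE run_sales_load()
RETURNS STRING
LANGUAGE SQL
AS
$$
BEGIN
    CALL load_daily_sales(CURRENT_DATE());
    RETURN 'OK';
EXCEPTION
    WHEN OTHER THEN
        LET err_code := SQLCODE;
        LET err_msg  := SQLERRM;
        INSERT INTO etl_error_log (proc_name, error_code, error_message, logged_at)
        VALUES ('run_sales_load', :err_code, :err_msg, CURRENT_TIMESTAMP());
        RAISE;   -- surface the failure so the calling task is marked as failed
END;
$$;

-- Schedule the load and route task failures to an (assumed) notification integration
CREATE TASK IF NOT EXISTS t_sales_load
  WAREHOUSE = analytics_wh
  SCHEDULE  = 'USING CRON 0 2 * * * UTC'
  ERROR_INTEGRATION = my_notification_int   -- hypothetical notification integration
AS
  CALL run_sales_load();

ALTER TASK t_sales_load RESUME;
```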
Required Skills & Qualifications
- 5-10 years of experience in Data Engineering or related roles.
- 3+ years hands-on experience working with Snowflake.
- Strong expertise in SQL, stored procedures, and performance tuning.
- Hands-on experience with ETL/ELT tools such as Azure Data Factory, Matillion, or Snowpipe.
- Solid understanding of data warehousing concepts, dimensional modeling, and data architecture.
- Experience with RBAC, security best practices, and data governance.
- Familiarity with cloud platforms (Azure/AWS/GCP preferred).
- Strong debugging, problem-solving, and analytical skills.
- Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
Preferred Skills
- Experience with Python or PySpark for data pipeline development.
- Exposure to CI/CD, DevOps practices, and Git-based development workflows.
- Knowledge of monitoring tools and performance analytics.