
Key Responsibilities:
Snowflake Engineering & Architecture
· Design and develop scalable data models, semantic layers, and Materialized Views within Snowflake to support complex downstream reporting and analytics.
· Write, test, and deploy highly optimized, complex SQL queries, stored procedures, and User-Defined Functions (UDFs).
· Leverage Snowpark (Python/Scala) for advanced data transformations and feature engineering directly within the Snowflake compute environment.
· Work with semi-structured data formats (JSON, Parquet, XML), utilizing Snowflake's native capabilities such as the VARIANT data type, zero-copy cloning, and Time Travel.
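To illustrate the kind of semi-structured handling this role calls for, here is a minimal pure-Python sketch of what Snowflake's LATERAL FLATTEN does to a VARIANT column: one output row per nested array element, joined to its parent fields. The record shape and field names are hypothetical; in Snowflake itself this would be a SQL query or a Snowpark DataFrame transformation.

```python
import json

# Hypothetical record, standing in for a single VARIANT column value.
raw = ('{"patient": {"id": 42, "labs": ['
       '{"test": "A1C", "value": 5.6}, {"test": "LDL", "value": 99}]}}')

def flatten_labs(record: dict) -> list[dict]:
    """Mimic LATERAL FLATTEN over the nested labs array:
    emit one flat row per array element, carrying the parent id."""
    return [
        {"patient_id": record["patient"]["id"],
         "test": lab["test"],
         "value": lab["value"]}
        for lab in record["patient"]["labs"]
    ]

rows = flatten_labs(json.loads(raw))
```

The same schema-on-read idea carries over: the nested structure is not declared up front, and the flattening step decides which paths become columns.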
Performance Tuning & Cost Optimization
· Actively monitor and tune Snowflake query performance, identifying bottlenecks and optimizing execution plans.
· Manage and optimize Virtual Warehouse sizing, scaling policies, and clustering keys to ensure a balance of high performance and cost-efficiency.
· Implement effective caching strategies and data partitioning techniques for large-scale datasets.
Data Integration & Security
· Collaborate with integration teams to build reliable data pipelines into Snowflake using Informatica Intelligent Cloud Services (IICS) and AWS cloud services.
· Implement and enforce strict data security models, including Role-Based Access Control (RBAC), Row-Level Security, and Dynamic Data Masking, ensuring compliance with HIPAA, GDPR, and internal GxP policies.
· Participate in cross-team impact assessments to ensure schema evolutions or logic changes do not disrupt downstream consumers or existing data platforms.
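As a conceptual sketch of the Dynamic Data Masking mentioned above, the function below mimics what a Snowflake masking policy does: return the raw value to privileged roles and a redacted value to everyone else. The role names and column format are hypothetical; in Snowflake this logic lives in a `CREATE MASKING POLICY` statement evaluated against `CURRENT_ROLE()`, not in application code.

```python
def mask_ssn(value: str, role: str) -> str:
    """Role-based masking rule: privileged roles see the raw value,
    all other roles see a partially redacted one."""
    if role in {"PHI_FULL_ACCESS", "COMPLIANCE_AUDITOR"}:  # hypothetical roles
        return value
    return "***-**-" + value[-4:]

masked = mask_ssn("123-45-6789", "ANALYST")
```

Keeping the rule centralized in a policy (rather than per-query logic) is what makes it enforceable for HIPAA/GDPR-style compliance across all downstream consumers.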
Qualifications & Skills:
· Experience: 6–9 years of overall IT experience in Data Engineering or Data Warehousing, with at least 3 years of dedicated, hands-on Snowflake development experience.
· Certifications: Snowflake SnowPro Core Certification is required. SnowPro Advanced: Data Engineer certification is highly preferred.
· Technical Skills:
o Expert-level proficiency in ANSI SQL and Snowflake-specific SQL extensions.
o Strong programming skills in Python (specifically for Snowpark and automation).
o Experience with cloud integration tools (IICS preferred) and foundational knowledge of AWS services (S3, EC2, IAM).
· Domain Experience: Prior experience working in the Life Sciences, Pharmaceutical, or Healthcare industry is highly advantageous, particularly with an understanding of GxP validation processes.
· Execution: Strong understanding of Agile methodologies, Jira, and modern DevOps practices for data platforms. Proven ability to translate complex business requirements into high-performing technical solutions.
Job ID: 147481129