
Indium

Indium Software - Snowflake Data Architect - ETL/Data Warehousing

  • Posted a month ago

Job Description

Description

  • 8+ years of experience in IT, including 6+ years as an architect (Snowflake Data Architect with Cortex AI expertise)
  • Design enterprise data architecture and integration patterns in Snowflake
  • Implement Cortex AI features for ML models and predictions
  • Administer Striim pipelines for real-time CDC data ingestion
  • Optimize data flows between Striim, Snowflake, and source systems
  • Define governance standards for AI/ML workflows and streaming data

We are seeking an experienced Snowflake Data Architect with strong expertise in enterprise data architecture, real-time data integration, and AI/ML enablement. The ideal candidate will play a key role in designing scalable data platforms on Snowflake, implementing Cortex AI capabilities, and managing real-time CDC pipelines using Striim.

This role requires deep architectural experience, hands-on implementation skills, and the ability to define governance standards for streaming data and AI-driven workflows.

Key Responsibilities

  • Design and implement enterprise-grade data architecture on Snowflake
  • Define scalable data models, integration patterns, and best practices
  • Architect data lakes, data warehouses, and data marts on Snowflake
  • Ensure high performance, scalability, and cost optimization of Snowflake environments
  • Implement and optimize Snowflake Cortex AI features for ML models, predictions, and analytics
  • Enable AI/ML workloads directly within Snowflake
  • Collaborate with data science teams to operationalize ML models
  • Define standards for AI model deployment, monitoring, and lifecycle management
  • Administer and manage Striim pipelines for real-time CDC (Change Data Capture) ingestion
  • Optimize data flows between source systems, Striim, and Snowflake
  • Ensure low-latency, reliable, and fault-tolerant data pipelines
  • Troubleshoot performance and data consistency issues across streaming systems
  • Design and implement batch and real-time ETL/ELT pipelines
  • Integrate data from multiple sources (databases, applications, APIs, cloud services)
  • Ensure data quality, validation, and reconciliation across pipelines
  • Define and enforce data governance standards for streaming data and AI/ML workflows
  • Implement role-based access control (RBAC), data masking, and encryption
  • Ensure compliance with enterprise security and regulatory requirements
  • Establish data lineage, auditing, and monitoring frameworks
  • Act as a technical architect guiding data engineering and analytics teams
  • Collaborate with stakeholders to translate business requirements into technical solutions
  • Review designs, provide technical guidance, and mentor team members
  • Contribute to architecture documentation and best-practice guidelines
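
The "data quality, validation, and reconciliation across pipelines" responsibility above can be illustrated with a minimal sketch. The following Python is purely illustrative (the function names and row layout are hypothetical, not a Striim or Snowflake API): it compares a source extract against a warehouse load by primary key, using per-row checksums to surface missing, extra, and drifted rows.

```python
import hashlib


def row_checksum(row):
    """Stable checksum of a row's values, used to detect content drift between systems."""
    joined = "|".join(str(v) for v in row)
    return hashlib.sha256(joined.encode("utf-8")).hexdigest()


def reconcile(source_rows, target_rows, key_index=0):
    """Compare source and target extracts keyed by a primary-key column.

    Returns keys missing from the target, keys present only in the target,
    and keys whose row contents differ between the two systems.
    """
    src = {row[key_index]: row_checksum(row) for row in source_rows}
    tgt = {row[key_index]: row_checksum(row) for row in target_rows}
    missing = sorted(k for k in src if k not in tgt)
    extra = sorted(k for k in tgt if k not in src)
    mismatched = sorted(k for k in src.keys() & tgt.keys() if src[k] != tgt[k])
    return {"missing": missing, "extra": extra, "mismatched": mismatched}


# Hypothetical extracts: key 3 never arrived, and key 2's value drifted in flight.
source = [(1, "alice", 100), (2, "bob", 200), (3, "carol", 300)]
target = [(1, "alice", 100), (2, "bob", 250)]
report = reconcile(source, target)
```

In a real deployment this check would run as a scheduled job against counts and checksums computed inside the source database and Snowflake, rather than pulling full rows into memory.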

Required Skills & Expertise

  • 8+ years of overall IT experience
  • 6+ years of experience in a Data Architect / Solution Architect role
  • Strong hands-on experience with Snowflake Data Platform
  • Snowflake architecture, performance tuning, and cost optimization
  • Snowflake Cortex AI for ML models and predictive analytics
  • Advanced SQL and data modeling techniques
  • Experience with ETL/ELT frameworks and orchestration tools
  • Hands-on experience with Striim for real-time CDC pipelines
  • Strong understanding of streaming data architectures
  • Experience integrating Snowflake with on-prem and cloud-based source systems
  • Understanding of AI/ML workflows and model deployment
  • Experience enabling analytics and predictive use cases on data platforms
  • Knowledge of feature engineering and data preparation for ML
  • Experience with cloud platforms (AWS / Azure / GCP)
  • Familiarity with CI/CD for data pipelines
  • Exposure to infrastructure-as-code and automation
  • Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field
  • Strong analytical, problem-solving, and communication skills
  • Ability to work independently and lead architectural initiatives
  • Experience with Kafka or other streaming platforms
  • Knowledge of data catalogs and metadata management tools
  • Exposure to enterprise BI and reporting tools
  • Experience in large-scale enterprise data modernization programs
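
The real-time CDC skills listed above come down to applying an ordered stream of insert/update/delete change events to a target table. A minimal, purely illustrative Python sketch of that merge semantics (not the Striim or Snowflake API; the event shape is hypothetical):

```python
def apply_cdc_events(table, events):
    """Apply ordered CDC events to an in-memory 'table' keyed by primary key.

    Each event is (op, key, row), where op is 'insert', 'update', or 'delete'.
    This mirrors the upsert/delete semantics a CDC pipeline into a warehouse
    must preserve: events are applied strictly in order, so the final state
    reflects the last change per key.
    """
    for op, key, row in events:
        if op == "delete":
            table.pop(key, None)   # idempotent: deleting an absent key is a no-op
        elif op in ("insert", "update"):
            table[key] = row       # upsert: last writer wins under event ordering
        else:
            raise ValueError(f"unknown CDC operation: {op}")
    return table


# Hypothetical change stream from a source database.
events = [
    ("insert", 1, {"name": "alice"}),
    ("insert", 2, {"name": "bob"}),
    ("update", 1, {"name": "alicia"}),
    ("delete", 2, None),
]
state = apply_cdc_events({}, events)
```

Production pipelines add the hard parts this sketch omits: exactly-once or at-least-once delivery, out-of-order and duplicate event handling, and schema evolution.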

(ref:hirist.tech)


Job ID: 139653715