
  • Posted a month ago

Job Description

As a Snowflake Solution Architect, you will play a crucial role in designing, building, and maintaining scalable data pipelines and infrastructure. You will work with a variety of technologies, including Scala, Python, Spark, AWS services, and SQL, to support our data processing and analytics needs.

Responsibilities:

  • Collaborate with stakeholders to finalize the scope of enhancements and development projects, and gather detailed requirements.
  • Apply expertise in ETL/ELT processes and tools to design and implement data pipelines that fulfil business requirements.
  • Provide expertise as a technical resource to solve complex business problems through data integration and database system design.
  • Migrate and modernize existing legacy ETL jobs to Snowflake, ensuring data integrity and optimal performance.
  • Analyze existing ETL jobs and identify opportunities for creating reusable patterns and components to expedite future development.
  • Develop and implement a configuration-driven Data Ingestion framework that enables efficient onboarding of new source tables.
  • Collaborate with cross-functional teams, including business analysts and solution architects, to align data engineering initiatives with business goals.
  • Drive continuous improvement initiatives to enhance data engineering processes, tools, and frameworks.
  • Ensure compliance with data quality, security, and privacy standards across all data engineering activities.
  • Participate in code reviews, provide constructive feedback, and ensure high-quality, maintainable code.
  • Prepare and present technical documentation, including data flow diagrams, ETL specifications, and architectural designs.
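One of the responsibilities above is a configuration-driven Data Ingestion framework for onboarding new source tables. As a rough illustration of that idea (all names here — `SOURCES`, the stage and table identifiers — are hypothetical, not part of this role's actual codebase), new sources can be onboarded by adding a config entry rather than writing new ETL code:

```python
# Minimal sketch of a configuration-driven ingestion framework.
# Every identifier below (SOURCES, stages, target tables) is illustrative.

SOURCES = [
    {"name": "orders",    "stage": "@raw_stage/orders/",    "target": "RAW.ORDERS",    "format": "JSON"},
    {"name": "customers", "stage": "@raw_stage/customers/", "target": "RAW.CUSTOMERS", "format": "CSV"},
]

def build_copy_statement(source: dict) -> str:
    """Render a Snowflake COPY INTO statement from one config entry,
    so onboarding a new source table is just a new config row."""
    return (
        f"COPY INTO {source['target']} "
        f"FROM {source['stage']} "
        f"FILE_FORMAT = (TYPE = {source['format']})"
    )

def build_all(sources: list[dict]) -> list[str]:
    """Generate one load statement per configured source."""
    return [build_copy_statement(s) for s in sources]

if __name__ == "__main__":
    for stmt in build_all(SOURCES):
        print(stmt)
```

A real framework would add scheduling, error handling, and metadata logging around this core, but the config-to-statement mapping is the pattern the posting describes.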

Educational Qualifications:

  • Engineering degree: BE/ME/BTech/MTech/BSc/MSc.
  • Cloud certifications (AWS, etc.) and relevant technical certifications in multiple technologies are desirable.

Skills:

Mandatory Technical Skills:

  • Strong experience in Snowflake; must have executed development and migration projects involving Snowflake.
  • Strong working experience in ETL tools (preferably Matillion, dbt, Fivetran, or ADF).
  • Experience writing SQL, including flattening semi-structured JSON data, and the ability to write complex queries.
  • Strong understanding of SQL, good Python coding experience, and experience deploying data warehouses and pipelines on Snowflake.
  • Experience working with large databases.
  • Working knowledge of AWS (S3, KMS, and more) or Azure/GCP
  • Design, develop, and thoroughly test new ETL/ELT code, ensuring accuracy, reliability, and adherence to best practices.
  • Snowflake
  • Python/Spark/JavaScript
  • AWS/Azure/GCP
  • SQL
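The skills above call for SQL that flattens semi-structured JSON. As a hedged illustration of what that looks like in Snowflake (the table and column names `RAW.ORDERS`, `payload`, `line_items` are invented for the example), a small Python helper can render a `LATERAL FLATTEN` query that explodes a nested JSON array into rows:

```python
# Hypothetical example: rendering a Snowflake query that flattens a
# nested JSON array stored in a VARIANT column using LATERAL FLATTEN.
# Table and column names are illustrative only.

def flatten_query(table: str, variant_col: str, path: str) -> str:
    """Build a query that turns each element of a JSON array
    (e.g. an order's line items) into its own result row."""
    return (
        f"SELECT t.id,\n"
        f"       f.value:sku::STRING AS sku,\n"
        f"       f.value:qty::NUMBER AS qty\n"
        f"FROM {table} t,\n"
        f"     LATERAL FLATTEN(input => t.{variant_col}:{path}) f"
    )

if __name__ == "__main__":
    print(flatten_query("RAW.ORDERS", "payload", "line_items"))
```

The `f.value:field::TYPE` casts pull typed columns out of each flattened JSON element; the same pattern underpins the "flatten tables" experience the posting asks for.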

Good-to-have Skills:

  • CI/CD (DevOps)

More Info

Open to candidates from: Indian

Job ID: 106891583