
Snowflake

7-10 Years
  • Posted 7 hours ago
  • Be among the first 10 applicants

Job Description

Data Pipeline Development

Design, develop, and maintain efficient, reliable, and reusable ETL/ELT pipelines using DBT, Snowflake, and related technologies.
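To make the core of this responsibility concrete, here is a minimal ELT-style sketch in plain Python: raw records are extracted, loaded, and transformed into typed, cleaned rows, the same shape of work a dbt model performs inside Snowflake. All function and field names are illustrative, not taken from the posting.

```python
# Minimal ELT-style sketch. All names here are illustrative.

def extract():
    # Stand-in for pulling rows from an API or source database.
    return [
        {"order_id": 1, "amount": "19.99", "country": "us"},
        {"order_id": 2, "amount": "5.00", "country": "de"},
    ]

def transform(rows):
    # Typed, cleaned rows -- the kind of model a dbt SQL file defines.
    return [
        {"order_id": r["order_id"],
         "amount": float(r["amount"]),
         "country": r["country"].upper()}
        for r in rows
    ]

def load(rows, target):
    # Stand-in for writing into a Snowflake table.
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

In a real pipeline the extract and load steps would use an ingestion tool, and the transform would live in version-controlled dbt models rather than application code.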

Data Modeling

Build and optimize data models (star and snowflake schemas) to support analytical and reporting use cases, ensuring data quality, accuracy, and performance.
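The star-schema work described above can be sketched as splitting a flat extract into a dimension table and a fact table keyed to it. This is an illustrative toy, with made-up data and names, not the employer's actual model.

```python
# Illustrative star-schema split: a flat sales extract becomes one
# dimension table (customers) and one fact table keyed into it.

flat = [
    {"sale_id": 1, "customer": "Acme", "region": "EMEA", "amount": 100.0},
    {"sale_id": 2, "customer": "Acme", "region": "EMEA", "amount": 40.0},
    {"sale_id": 3, "customer": "Beta", "region": "APAC", "amount": 75.0},
]

# Dimension: one row per customer, with a surrogate key.
dim_customer = {}
for row in flat:
    key = (row["customer"], row["region"])
    if key not in dim_customer:
        dim_customer[key] = {"customer_key": len(dim_customer) + 1,
                             "customer": row["customer"],
                             "region": row["region"]}

# Fact: measures plus a foreign key into the dimension.
fact_sales = [
    {"sale_id": row["sale_id"],
     "customer_key": dim_customer[(row["customer"], row["region"])]["customer_key"],
     "amount": row["amount"]}
    for row in flat
]
```

Keeping descriptive attributes in the dimension and only measures plus keys in the fact table is what keeps analytical queries fast and joins predictable.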

Data Integration

Integrate data from multiple sources (APIs, databases, SaaS platforms, cloud storage) into Snowflake using modern data ingestion tools and frameworks.

Performance Optimization

Tune SQL queries, manage compute resources, and optimize Snowflake performance for scalability and cost efficiency.

Data Governance & Quality

Implement data validation, testing, and monitoring processes to ensure high data quality and consistency across environments.
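A minimal sketch of this validation layer, assuming checks analogous to dbt's built-in `not_null` and `unique` tests; the function names and sample rows are invented for illustration.

```python
# Simple data-quality checks, analogous to dbt's not_null and
# unique generic tests. Names and data are illustrative.

def check_not_null(rows, column):
    # Return every row where the column is missing or null.
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    # Return every row whose column value repeats an earlier one.
    seen, dupes = set(), []
    for r in rows:
        value = r[column]
        if value in seen:
            dupes.append(r)
        seen.add(value)
    return dupes

rows = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},
    {"id": 2, "email": "c@example.com"},
]

null_failures = check_not_null(rows, "email")
unique_failures = check_unique(rows, "id")
```

In practice these checks would be declared in dbt schema YAML and run automatically in CI, with failures blocking deployment.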

Collaboration

Partner with data analysts, data scientists, and business teams to understand requirements, define transformations, and deliver high-quality datasets.

Automation & CI/CD

Contribute to automation, version control, and deployment best practices using Git, CI/CD pipelines, and DBT Cloud or similar.

Documentation

Maintain clear, comprehensive documentation for pipelines, data models, and workflows.

Required Skills & Qualifications

Education: Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.

Experience: 7-10 years in data engineering, with at least 3 years of hands-on experience in Snowflake and DBT.

Technical Expertise

Strong SQL skills and experience in building complex transformations.

Proficiency in Snowflake architecture (warehouses, micro-partitioning, clustering, query optimization, etc.).

Deep understanding of DBT (models, macros, testing, documentation, version control).

Experience with cloud platforms such as AWS, Azure, or GCP (e.g., S3, Lambda, Cloud Storage).

Familiarity with Python or Scala for data engineering workflows

Knowledge of orchestration tools such as Airflow, Dagster, or Prefect.

Experience with Git, CI/CD, and deployment automation.

Preferred Skills

Exposure to data warehousing concepts, data lake architectures, and real-time streaming (Kafka, Kinesis, etc.).

Familiarity with BI tools such as Looker, Power BI, or Tableau.

Knowledge of data governance, security, and compliance standards (GDPR, HIPAA, etc.).

Experience working in an Agile environment

Key Attributes

Strong analytical and problem-solving skills.

Excellent communication and collaboration abilities

Proactive mindset with attention to detail.

Ability to work in fast-paced, cross-functional environments.


Job ID: 147469051
