
Coding Macaw Bootcamp

Informatica ETL & Snowflake Data Engineer

Fresher
Posted 6 days ago

Job Description

Job Summary

Timing: Night shift (7:30 PM IST to 4:00 AM IST, Monday to Friday)

Work Location: Remote

We are seeking a highly skilled Senior Informatica ETL & Snowflake Data Engineer with deep expertise in ETL development, cloud data engineering, and modern data warehousing. The ideal candidate will be proficient in Informatica, Snowflake, Python, AWS, and API integrations, with strong analytical and problem-solving capabilities. This role involves building scalable data pipelines, optimizing Snowflake environments, and supporting ongoing data migration and cloud initiatives.

Key Responsibilities

  • Design, develop, and optimize ETL/ELT pipelines using Informatica/IICS, Python, and Snowflake (a minimal sketch of one such step follows this list).
  • Build and maintain Snowflake data warehouses, including tasks, streams, dynamic tables, and performance tuning.
  • Develop and automate API integrations (JSON/XML) into structured datasets using Python.
  • Perform data migration from Oracle to Snowflake, ensuring accuracy, completeness, and performance optimization.
  • Utilize AWS services (S3, Lambda, EC2, DMS, etc.) to orchestrate cloud-based data workflows.
  • Create reusable, modular Python scripts for data ingestion, transformation, and cloud automation.
  • Conduct advanced data analysis to identify trends, patterns, and anomalies using SQL and Python.
  • Apply strong data modeling skills (ERD, star, snowflake schemas) to build scalable data solutions.
  • Collaborate with cross-functional teams to deliver high-quality, stable, and high-performing data pipelines.
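None of the code below appears in the original posting; it is a minimal sketch of the kind of pipeline step these responsibilities describe: loading API extracts already staged in S3 into Snowflake with COPY INTO, driven from Python. The account, credentials, stage, and table names are all hypothetical.

```python
# Minimal sketch: bulk-load staged S3 files into Snowflake via COPY INTO.
# Every identifier here (account, warehouse, stage, table) is a placeholder.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",          # hypothetical account locator
    user="etl_user",
    password="...",                # in practice, read from a secrets manager
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)
try:
    cur = conn.cursor()
    # @API_STAGE is assumed to be an external stage over the S3 landing bucket.
    cur.execute("""
        COPY INTO RAW.ORDERS
        FROM @API_STAGE/orders/
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
    print(cur.fetchall())          # COPY reports one row per file loaded
finally:
    conn.close()
```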

Required Skills & Experience

Snowflake
  • Deep understanding of Snowflake architecture and warehouse components.
  • Experience with Dynamic Tables, Tasks, Streams.
  • Advanced SQL skills tailored for Snowflake.
  • Proven experience with performance tuning, query optimization, and troubleshooting.
  • Ability to design and optimize Snowflake data models and warehouses.
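As a hedged sketch of how those three features fit together, the DDL below (not from the posting; all object names invented) wires a stream on a raw table to a scheduled task that merges changes, plus a dynamic table for a declaratively refreshed aggregate. It assumes `cur` is an open snowflake.connector cursor like the one in the first sketch.

```python
# Stream + task + dynamic table, executed through the Python connector.
# Assumes `cur` is an open snowflake.connector cursor; names are invented.
ddl_statements = [
    # Stream: captures inserts/updates/deletes on the raw table.
    "CREATE OR REPLACE STREAM RAW.ORDERS_STREAM ON TABLE RAW.ORDERS",
    # Task: runs on a schedule, but only when the stream actually has data.
    """
    CREATE OR REPLACE TASK RAW.LOAD_ORDERS_TASK
      WAREHOUSE = ETL_WH
      SCHEDULE = '15 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('RAW.ORDERS_STREAM')
    AS
      INSERT INTO CURATED.ORDERS SELECT * FROM RAW.ORDERS_STREAM
    """,
    # Dynamic table: Snowflake keeps it refreshed within the target lag.
    """
    CREATE OR REPLACE DYNAMIC TABLE CURATED.DAILY_ORDER_TOTALS
      TARGET_LAG = '1 hour'
      WAREHOUSE = ETL_WH
    AS
      SELECT order_date, SUM(amount) AS total_amount
      FROM CURATED.ORDERS
      GROUP BY order_date
    """,
    "ALTER TASK RAW.LOAD_ORDERS_TASK RESUME",  # new tasks start suspended
]

for stmt in ddl_statements:
    cur.execute(stmt)
```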

Oracle
  • Strong experience with Oracle SQL & PL/SQL.
  • Hands-on experience migrating data from Oracle to Snowflake.
  • Familiarity with Oracle performance tuning and backup strategies.
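A hedged sketch of the migration skill in miniature: chunked reads from Oracle with the oracledb driver, bulk-loaded into Snowflake with write_pandas. Real migrations at volume typically route through S3/DMS instead; every connection detail and table name below is assumed.

```python
# Chunked Oracle-to-Snowflake copy; all credentials and names are placeholders.
import oracledb
import pandas as pd
from snowflake.connector.pandas_tools import write_pandas

ora = oracledb.connect(user="scott", password="...", dsn="orahost/ORCLPDB1")

# `conn` is assumed to be an open Snowflake connection (see the first sketch),
# and RAW.SALES an existing table matching the Oracle column layout.
for chunk in pd.read_sql("SELECT * FROM SALES", ora, chunksize=50_000):
    write_pandas(conn, chunk, table_name="SALES", schema="RAW")

ora.close()
```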

Python
  • Ability to build Python scripts integrating with AWS services.
  • Strong knowledge of Python data structures and OOP principles.
  • Ability to write clean, modular, reusable code.
  • Experience automating API data pulls.
  • Familiarity with AWS Lambda for serverless API integration.
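To make the last two bullets concrete, here is a minimal and entirely hypothetical Lambda handler that pulls a JSON API and lands the raw payload in S3; the endpoint, bucket, and event shape are assumptions, not details from this posting.

```python
# Hypothetical Lambda handler: fetch a JSON endpoint, land the payload in S3.
import json
import urllib.request
from datetime import datetime, timezone

import boto3

s3 = boto3.client("s3")

def handler(event, context):
    req = urllib.request.Request(
        "https://api.example.com/v1/orders",               # invented endpoint
        headers={"Authorization": f"Bearer {event['token']}"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        payload = json.load(resp)

    key = f"raw/orders/{datetime.now(timezone.utc):%Y/%m/%d/%H%M%S}.json"
    s3.put_object(Bucket="my-data-lake", Key=key, Body=json.dumps(payload))
    return {"records": len(payload), "s3_key": key}
```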

API Integration
  • Experience pulling data from APIs into ETL/ELT pipelines.
  • Ability to transform JSON/XML into structured datasets (CSV, Parquet, tables).
  • Experience handling API authentication (API keys, OAuth2, JWT).
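The JSON-to-Parquet bullet is well captured by pandas.json_normalize; the snippet below (illustrative data, not from the posting) flattens a nested response and writes Parquet. For authenticated pulls, the bearer-token header shown in the Lambda sketch above is the usual pattern.

```python
# Flatten a nested JSON API response into a Parquet file.
import pandas as pd

response = {  # invented response shape, for illustration only
    "orders": [
        {"id": 1, "customer": {"name": "Acme", "region": "EU"}, "amount": 120.5},
        {"id": 2, "customer": {"name": "Globex", "region": "US"}, "amount": 88.0},
    ]
}

df = pd.json_normalize(response["orders"], sep="_")  # customer.name -> customer_name
df.to_parquet("orders.parquet", index=False)         # requires pyarrow or fastparquet
print(list(df.columns))  # ['id', 'amount', 'customer_name', 'customer_region']
```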

AWS
  • Hands-on experience with AWS services such as S3, EC2, DMS, Lambda.
  • Experience orchestrating cloud-based data pipelines.

Data Analysis
  • Strong SQL and Python data analysis skills.
  • Ability to derive insights, identify patterns, and troubleshoot issues using large datasets.
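As a small, self-contained example of the kind of anomaly check this implies (synthetic data, not from the posting): flag days whose totals sit more than three standard deviations from the mean.

```python
# Z-score anomaly check on a synthetic daily-volume series.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
daily = pd.Series(rng.normal(1000, 50, 90),
                  index=pd.date_range("2024-01-01", periods=90, freq="D"))
daily.iloc[40] = 5000                      # plant one obvious outlier

z = (daily - daily.mean()) / daily.std()   # z-score each day's volume
print(daily[z.abs() > 3])                  # flags only the planted spike
```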

Data Modeling
  • Strong understanding of ER modeling and dimensional modeling (Star/Snowflake schemas).
  • Experience building scalable models for Snowflake environments.
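A hedged sketch of what such a model might look like: one fact table keyed to two dimensions, with every name invented for illustration. `cur` is assumed to be an open Snowflake cursor as in the earlier sketches.

```python
# Illustrative star-schema DDL for Snowflake; all names are invented.
star_schema_ddl = """
CREATE OR REPLACE TABLE DIM_CUSTOMER (
    customer_sk NUMBER AUTOINCREMENT PRIMARY KEY,
    customer_id VARCHAR,             -- natural key from the source system
    name        VARCHAR,
    region      VARCHAR
);
CREATE OR REPLACE TABLE DIM_DATE (
    date_sk     NUMBER PRIMARY KEY,  -- e.g. 20240131
    full_date   DATE,
    fiscal_qtr  VARCHAR
);
CREATE OR REPLACE TABLE FACT_ORDERS (
    order_id    VARCHAR,
    customer_sk NUMBER REFERENCES DIM_CUSTOMER (customer_sk),
    date_sk     NUMBER REFERENCES DIM_DATE (date_sk),
    amount      NUMBER(12, 2)        -- additive measure
);
"""

for stmt in star_schema_ddl.split(";"):
    if stmt.strip():
        cur.execute(stmt)  # cur: open Snowflake cursor (see earlier sketches)
```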

ETL Tools

Required:

Informatica / IICS

  • Experience building ETL workflows and integrating them with Snowflake.

Nice to Have:

MuleSoft

  • Experience with API-based integrations and data workflows involving Snowflake, Oracle, and AWS.

Job ID: 135323601