
Informatica Developer

5-7 Years
10 - 23.5 LPA
  • Posted 6 hours ago

Job Description

Job Description: AWS Services & Informatica Developer

Responsibilities: This role is for NAM data engineering operations and development.

Domains include: data integration, data extraction from legacy systems, data warehousing, AWS services, Python scripting, efficient Extract/Transform/Load (ETL) workflows (Informatica PowerCenter and Informatica Cloud services), and Redshift reporting.

Strong experience in the design, development, and testing of Informatica-based applications (PowerCenter 10.2 and Informatica Cloud services).

Should have strong experience of Oracle database, PL/SQL development, and UNIX scripting.

Should have strong knowledge of AWS services such as S3, Lambda, DynamoDB, DMS, and Redshift.

Should have good knowledge of Python scripting.
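
By way of illustration, the Python scripting involved in this kind of data operations work is often small extract-and-clean glue code. A minimal, standard-library-only sketch (the file layout and field names are hypothetical, not from this role):

```python
import csv
import io

def extract_transform(raw_csv: str) -> list[dict]:
    """Read a legacy-system CSV extract, trim stray whitespace,
    and normalise the (hypothetical) AMOUNT field to float."""
    rows = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        rows.append({
            "id": row["ID"].strip(),
            "amount": float(row["AMOUNT"]),
        })
    return rows

# Example legacy feed with a trailing space and an integer amount
legacy_feed = "ID,AMOUNT\nA1 ,10.5\nA2,3\n"
records = extract_transform(legacy_feed)
print(records)
```

In practice the input would be streamed from a file or an S3 object rather than an inline string, but the shape of the script is the same.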

Should understand the overall system landscape, including upstream and downstream systems.

Excellent knowledge of debugging, tuning, and optimizing the performance of database queries.
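
The core tuning habit is reading the query plan before and after adding an index. Oracle (EXPLAIN PLAN) and Redshift (EXPLAIN) each have their own plan tooling; the sketch below uses sqlite3 only because it ships with Python, and the table and index names are illustrative:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE trades (id INTEGER, region TEXT)")
con.executemany(
    "INSERT INTO trades VALUES (?, ?)",
    [(i, "NAM" if i % 2 else "EMEA") for i in range(1000)],
)

def plan(sql: str) -> str:
    # EXPLAIN QUERY PLAN rows are (id, parent, notused, detail)
    return " | ".join(row[3] for row in con.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT * FROM trades WHERE region = 'NAM'"
before = plan(query)                        # full table scan
con.execute("CREATE INDEX idx_region ON trades(region)")
after = plan(query)                         # index search on idx_region
print(before)
print(after)
```

The same discipline carries over to Redshift, where the plan additionally shows data distribution and broadcast steps across nodes.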

Good experience in data integration: extracting data from legacy systems and loading it into AWS Redshift and Redshift Spectrum.
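
A common shape for such loads is to stage the legacy extract in S3 and then issue a Redshift COPY. A sketch of assembling that statement; the table, bucket, and role ARN are placeholders:

```python
def redshift_copy_sql(table: str, s3_uri: str, iam_role: str) -> str:
    """Assemble a Redshift COPY statement for a CSV extract staged in S3.
    All identifiers passed in below are hypothetical examples."""
    return (
        f"COPY {table}\n"
        f"FROM '{s3_uri}'\n"
        f"IAM_ROLE '{iam_role}'\n"
        "FORMAT AS CSV IGNOREHEADER 1;"
    )

sql = redshift_copy_sql(
    table="staging.legacy_trades",
    s3_uri="s3://example-bucket/extracts/trades.csv",
    iam_role="arn:aws:iam::123456789012:role/example-redshift-role",
)
print(sql)
```

COPY is preferred over row-by-row INSERTs because Redshift parallelises the S3 read across slices.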

Supports the module in production, resolves hot issues, and implements and deploys enhancements to the application/package.

Should be proficient in the end-to-end software development life cycle, including requirement analysis, design, development, code review, and testing.

Responsible for ensuring defect-free and on-time delivery

Responsible for issue resolution along with corrective and preventive measures.

Should be able to manage a diverse set of stakeholders and report on key

Qualification: BE/BTech

Experience (in years): 5

Skill Set Required: AWS Glue, DBT, Redshift

These are the non-negotiables:

  • AWS Glue as the ETL tool
  • Strong focus on Spark / PySpark (not hardcore Spark internals, but solid hands-on usage)
  • DBT (Data Build Tool)
  • ELT mindset
  • SQL + basic Python (no advanced Python required)
  • AWS fundamentals: S3, IAM roles & policies, Redshift

Architecture & data layer:

  • Lakehouse architecture
  • Iceberg tables
  • Redshift as the primary querying / warehouse layer

Understanding how Glue + S3 + Iceberg + Redshift fit together is more important than deep theory.
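
One way those pieces fit together: Glue jobs write Iceberg tables whose data files live in S3, the Glue Data Catalog tracks their metadata, and Redshift exposes the catalog through an external schema so the lake can be queried alongside warehouse tables. A sketch of the Redshift side of that wiring; the catalog database, schema, table, and role ARN are all placeholders:

```python
# Placeholders throughout: these names are illustrative, not from the posting.
GLUE_DATABASE = "lakehouse_db"    # Glue Data Catalog database holding Iceberg tables
EXTERNAL_SCHEMA = "lake"          # name Redshift will expose the catalog under
IAM_ROLE = "arn:aws:iam::123456789012:role/example-spectrum-role"

def external_schema_ddl() -> str:
    """Redshift DDL mapping a Glue Data Catalog database (where Glue
    registers the Iceberg tables it writes to S3) into Redshift."""
    return (
        f"CREATE EXTERNAL SCHEMA {EXTERNAL_SCHEMA}\n"
        f"FROM DATA CATALOG\n"
        f"DATABASE '{GLUE_DATABASE}'\n"
        f"IAM_ROLE '{IAM_ROLE}';"
    )

def lake_query() -> str:
    """Once mapped, an Iceberg table is queried like any Redshift table."""
    return f"SELECT region, count(*) FROM {EXTERNAL_SCHEMA}.trades GROUP BY region;"

print(external_schema_ddl())
print(lake_query())
```

The design point is that Redshift stays the querying layer while Glue owns the transformations, so neither side needs to copy the other's data.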

Nice-to-have / flexible areas

These are good if they've used them; totally fine if they haven't:

  • Airflow (as orchestration): even awareness or conceptual understanding is okay
  • CI/CD
  • GitHub

Basic exposure is enough; team support is available.

More Info

Open to candidates from: India

Job ID: 145524969
